Designing Software Under HIPAA: When Architecture Meets Responsibility


One thing I’ve always enjoyed about software engineering is the design phase.

Before writing any code, there is this moment where you think about the system as a whole: the architecture, the constraints, the trade-offs. It feels a little like designing a city before anyone builds the buildings. If the design is good, the system grows naturally. If the design is poor, no amount of clever code can save it. During my Master's in Artificial Intelligence, I took a course called Program Design Paradigm, where we explored different ways of structuring software systems. At the time, it all felt theoretical: abstraction layers, modularity, design patterns. But once I started working on real systems, especially in healthcare, those lessons suddenly became very real.


When Software Handles Real People's Data

When I started working at Lexington Periodontics & Implantology as an AI Specialist and IT support, I began developing small internal tools to improve workflow and automation.

Things like:

  • automation scripts for internal processes  
  • analytics dashboards  
  • small AI tools for documentation  
  • workflow optimizations for clinic operations  

These are the kinds of tools many engineers build every day. But healthcare introduces a constraint that changes everything.

HIPAA compliance.

Suddenly, the system you're designing isn't just moving data around. It is handling Protected Health Information (PHI): information that represents real people, their medical histories, their treatments, and their identities. At that moment, software engineering stops being just an engineering exercise.

It becomes a responsibility.


Thinking About Design Differently

In most hobby projects or prototypes, the questions we ask are usually:

  • Will this feature work?
  • Is it fast enough?
  • Does the architecture scale?

But when you’re working with healthcare systems, the questions shift.

Before writing code, I started asking things like:

  • What data do we actually need?
  • Who should be allowed to access it?
  • What happens if this system is compromised?
  • How do we audit every access?
  • Can this system operate without exposing patient data?

These questions change the way you design systems from the ground up.


What HIPAA Means for Engineers

HIPAA (Health Insurance Portability and Accountability Act) is often discussed from a legal perspective. But for engineers, it translates into technical constraints that influence architecture.

The most important one is the HIPAA Security Rule, which defines safeguards that software systems must implement. From an engineering perspective, several principles become critical.


Least Privilege Access

Not everyone should see everything. A receptionist scheduling appointments should not have access to full patient medical records.

A typical system might enforce something like:

Role            Access Level
Dentist         Full clinical and patient data
Receptionist    Appointment scheduling only
Billing Staff   Insurance and billing records
IT Staff        System logs and technical data

This usually leads to Role-Based Access Control (RBAC) layers inside the application.
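An RBAC layer like the one described above can be sketched in a few lines. This is a minimal illustration, not the clinic's actual policy: the role names mirror the table above, but the permission strings and the `can` helper are made up for the example.

```python
# Minimal RBAC sketch. Roles follow the table above; permission names are illustrative.
ROLE_PERMISSIONS = {
    "dentist": {"view_clinical", "edit_clinical", "view_patient"},
    "receptionist": {"view_schedule", "edit_schedule"},
    "billing": {"view_billing", "edit_billing"},
    "it": {"view_logs"},
}

def can(role: str, permission: str) -> bool:
    """Deny by default: unknown roles or unknown permissions get nothing."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

The important design choice is the default: a role that is not in the table, or a permission that was never granted, fails closed rather than open.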


Encryption Everywhere

Another major principle is protecting data both at rest and in transit. This means things like:

  • TLS secured APIs
  • encrypted databases
  • encrypted backups
  • secure credential storage

A typical data path in a secure system might look something like this:

Client → HTTPS → API Layer → Application Services → Encrypted Database

Encryption does not solve every problem, but it dramatically reduces the damage if something goes wrong.
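For the in-transit half, Python's standard library already encodes sensible defaults. A small sketch, assuming outbound API calls go through an `ssl` context you control:

```python
import ssl

# Sketch: a TLS context for all outbound API calls (stdlib only).
# create_default_context() loads system CA certificates and enables
# both certificate verification and hostname checking by default.
ctx = ssl.create_default_context()

# Refuse legacy protocol versions explicitly.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# Pass `ctx` to http.client / urllib (or your HTTP library of choice)
# so plaintext HTTP is never an option for PHI.
```

Encryption at rest (databases, backups, credentials) needs its own tooling, but the principle is the same: make the secure path the only path the code exposes.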


Audit Trails

One thing HIPAA strongly requires is traceability.

If someone accesses patient information, the system should be able to answer:

  • who accessed the data
  • when it was accessed
  • what was viewed or modified

A log entry might look something like:

{
  "timestamp": "2026-03-04T10:21:43Z",
  "user": "dentist_102",
  "action": "VIEW_PATIENT_RECORD",
  "patient_id": "87421"
}

Good logging isn’t just about debugging. In regulated systems, it becomes part of compliance and accountability.
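Producing entries like the one above is straightforward. A minimal sketch, where the `audit` helper and its field names are hypothetical (a production system would also protect the log itself from tampering):

```python
import datetime
import json

def audit(user: str, action: str, patient_id: str,
          log_path: str = "audit.log") -> dict:
    """Append one JSON line per PHI access (hypothetical helper)."""
    entry = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc)
                     .isoformat(timespec="seconds"),
        "user": user,
        "action": action,
        "patient_id": patient_id,
    }
    # Append-only: existing entries are never rewritten.
    with open(log_path, "a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

One line per access in append-only JSON keeps the log easy to grep during an audit and hard to edit silently.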


The Minimum Necessary Principle

One of the most interesting ideas in HIPAA is the Minimum Necessary Rule. The system should only access the minimum amount of information required to perform a task.

For example, an analytics system might only need aggregated appointment statistics rather than full patient records. That difference might seem small, but architecturally it leads to very different designs. Instead of querying everything:

SELECT * FROM patient_records;

You design systems that request only the necessary fields:

SELECT appointment_time, procedure_type
FROM patient_records
WHERE patient_id = ?;

This reduces risk significantly if the system is ever compromised.
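The same idea, end to end, with a parameterized query. The schema and values here are made up for illustration; the point is that the row handed back to the caller never contains the sensitive columns at all:

```python
import sqlite3

# Sketch: an in-memory table standing in for the real (encrypted) store.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE patient_records (
    patient_id TEXT, appointment_time TEXT,
    procedure_type TEXT, medical_history TEXT)""")
conn.execute("INSERT INTO patient_records VALUES "
             "('87421', '2026-03-04T10:00', 'cleaning', 'sensitive notes')")

# Minimum necessary: request only the two fields the task needs,
# via a parameterized query (never string interpolation).
row = conn.execute(
    "SELECT appointment_time, procedure_type "
    "FROM patient_records WHERE patient_id = ?",
    ("87421",),
).fetchone()

# `row` carries no medical history, so a leak at this layer exposes far less.
```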


Security by Design

One of the biggest lessons I learned working in healthcare systems is this:

Security cannot be added later.

It has to be part of the design from the very beginning. This philosophy is often called:

  • Security by Design
  • Privacy by Design
  • Defense in Depth

The idea is simple:

Even if one layer of security fails, other layers still protect the system.
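What layering looks like in code is each guard being checked independently, so a bug in one still leaves the others standing. A sketch with entirely hypothetical names (`view_record`, `fetch_record`, the session and user shapes):

```python
audit_log = []

def fetch_record(patient_id: str) -> dict:
    # Stand-in for a lookup against an encrypted data store.
    return {"patient_id": patient_id}

def view_record(user: dict, patient_id: str, session: dict) -> dict:
    """Each layer fails closed; no layer assumes the previous one ran."""
    if not session.get("authenticated"):                 # layer 1: authentication
        raise PermissionError("not logged in")
    if "view_patient" not in user.get("perms", set()):   # layer 2: authorization (RBAC)
        raise PermissionError("role lacks access")
    audit_log.append((user["name"], patient_id))         # layer 3: audit trail
    return fetch_record(patient_id)                      # layer 4: encryption at rest
```

An attacker who steals a session token still hits the RBAC check; a bug in RBAC still leaves an audit trail pointing at what was touched.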


How This Changed My Approach to Software

Working in healthcare forced me to think differently about software architecture. Earlier, my focus was often on:

  • performance
  • experimentation
  • rapid prototyping

Now, I find myself thinking more about:

  • trust
  • data boundaries
  • auditability
  • system resilience

Before writing code, I spend much more time thinking about what should not happen, not just what should. Ironically, these constraints often lead to better architecture overall.


Final Thoughts

Designing software in regulated environments like healthcare teaches an important lesson. Software systems are not just technical artifacts. They interact with real human lives and sensitive information.

When engineers build these systems, they are not just writing code. They are guarding trust. And in my experience, that responsibility ultimately leads to more thoughtful and disciplined engineering.