Building a health tech application without HIPAA compliance is building a product you cannot legally sell to hospitals, clinics, insurers, or any entity that handles protected health information. HIPAA is not optional in health tech. It is the baseline requirement that determines whether your product is viable.
We have built HIPAA compliant applications from the ground up and have retrofitted existing products to meet compliance requirements. The technical work is substantial but well defined. Here is what you need to know.
What HIPAA Requires
HIPAA (Health Insurance Portability and Accountability Act) protects Protected Health Information (PHI), which is any individually identifiable health information. That includes obvious data like medical records, diagnoses, and prescriptions, but also less obvious data like appointment schedules, billing records, insurance IDs, and even the fact that someone is a patient at a particular provider.
HIPAA has three rules that affect your application:
The Privacy Rule governs who can access PHI and under what circumstances. It establishes the "minimum necessary" standard: you should only access the minimum amount of PHI needed for the task at hand.
The Security Rule mandates administrative, physical, and technical safeguards for electronic PHI (ePHI). This is where the engineering work lives.
The Breach Notification Rule requires notifying affected individuals within 60 days of discovering a breach. Breaches affecting 500 or more individuals must also be reported to the Department of Health and Human Services within the same window, and to the media if the breach affects 500 or more residents of a single state or jurisdiction; smaller breaches are reported to HHS annually.
Penalties range from $100 to $50,000 per violation, up to $1.5 million per year per violation category. Criminal penalties include fines up to $250,000 and up to ten years' imprisonment. These are not theoretical. The HHS Office for Civil Rights actively investigates and penalizes violations.
The Technical Safeguards
The Security Rule specifies technical safeguards that your application must implement.
Encryption
Data at rest: All ePHI must be encrypted using AES 256 or equivalent. This includes your primary database, backups, file storage, and any caches or temporary storage that might contain PHI. Use your cloud provider's encryption services (AWS KMS, Google Cloud KMS, Azure Key Vault) and verify encryption is enabled on every storage layer.
Data in transit: TLS 1.2 or higher for all connections. No exceptions. Internal service to service communication must also be encrypted, not just client facing endpoints. This includes database connections, message queues, and internal API calls.
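In Python, for instance, the minimum TLS version can be pinned on every outbound connection so that TLS 1.0 and 1.1 are refused outright. A minimal sketch using the standard library (your HTTP client or service framework will have its own hook for supplying a context like this):

```python
import ssl

def strict_client_context() -> ssl.SSLContext:
    """TLS context that refuses anything below TLS 1.2.

    create_default_context() also enables certificate verification and
    hostname checking, which you should never disable for PHI traffic.
    """
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject TLS 1.0 / 1.1
    return ctx
```

The same pattern applies server-side and to internal service-to-service calls: build one hardened context in a shared module and require every connection to use it, rather than trusting each caller to configure TLS correctly.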
Encryption key management: Keys must be stored separately from the data they protect. Rotate keys on a defined schedule. Maintain audit logs of key access. Never store encryption keys in source code, environment variables accessible to developers, or alongside encrypted data.
Access Controls
Unique user identification. Every user who accesses ePHI must have a unique identifier. No shared accounts, no generic "admin" logins. This is non negotiable for audit trail purposes.
Role based access control (RBAC). Users should only access the ePHI they need for their role. A billing clerk does not need access to clinical notes. A nurse does not need access to payment information. Your system architecture must enforce this at the data layer, not just the UI layer. Hiding a button is not access control.
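Data-layer enforcement can be as simple as a role-to-field allowlist applied inside the access layer itself, so every caller gets the same filtering no matter which screen or endpoint requested the record. A minimal sketch (the roles and field names are illustrative, not a recommended schema):

```python
# Hypothetical role -> field allowlist; a real system would load this
# from a policy store rather than hardcoding it.
ROLE_FIELDS = {
    "billing_clerk": {"patient_id", "insurance_id", "invoice_total"},
    "nurse": {"patient_id", "diagnosis", "clinical_notes"},
}

def fetch_for_role(record: dict, role: str) -> dict:
    """Filter a patient record down to the fields the role may see.

    Lives in the data access layer, so UI code never holds fields it
    was not entitled to in the first place. Unknown roles get nothing.
    """
    allowed = ROLE_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}
```

The key design choice is default-deny: a role that is missing from the policy sees an empty record, not the full one.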
Emergency access procedures. You need a documented "break the glass" procedure for accessing PHI in emergencies that bypasses normal access controls. This access must be logged and reviewed.
Automatic logoff. Sessions must time out after a defined period of inactivity. For applications handling PHI, 15 minutes is a common threshold.
Multi factor authentication. While not explicitly required by HIPAA, it is considered a reasonable and appropriate safeguard by auditors and the OCR. If you do not implement MFA, be prepared to document why it is not reasonable for your use case.
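TOTP codes are one common second factor, and the algorithm (RFC 6238, built on RFC 4226's HOTP) is small enough to sketch with the standard library. This is an illustration of how the codes are derived; in production, use a vetted authentication library rather than rolling your own:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, timestamp=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second time step."""
    counter = int((time.time() if timestamp is None else timestamp) // step)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the code depends only on a shared secret and the clock, the server can verify it offline; allow a window of one step in either direction to tolerate clock drift.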
Audit Controls
This is where health tech applications differ most from standard SaaS products. Every access to ePHI must be logged. Not just writes. Reads too.
Your audit logging system must capture:
- Who accessed the data (user ID)
- What data was accessed (record identifiers)
- When it was accessed (timestamp)
- How it was accessed (application, API, direct database query)
- What action was taken (view, create, modify, delete, export)
These logs must be tamper proof, retained for at least six years (HIPAA's record retention requirement), and available for review. Store audit logs in a separate system from your application data so that a compromise of the application does not compromise the audit trail.
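One common tamper-evidence technique is hash chaining: each entry embeds the hash of the previous one, so any retroactive edit invalidates every entry after it. A minimal sketch capturing the who/what/when/how/action fields above (field names are illustrative):

```python
import hashlib
import json
from datetime import datetime, timezone

def append_entry(log: list, user_id: str, record_id: str,
                 action: str, channel: str) -> dict:
    """Append a tamper-evident audit entry to the chain."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {
        "user_id": user_id,                             # who
        "record_id": record_id,                         # what
        "ts": datetime.now(timezone.utc).isoformat(),   # when
        "channel": channel,                             # how (app, api, db)
        "action": action,                               # view/create/modify/delete/export
        "prev_hash": prev_hash,
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    log.append(entry)
    return entry

def verify_chain(log: list) -> bool:
    """Recompute every hash; any altered entry breaks the chain."""
    prev = "0" * 64
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if entry["prev_hash"] != prev:
            return False
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```

Hash chaining makes tampering detectable, not impossible; it complements, rather than replaces, shipping the log to a separate write-only store.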
Building comprehensive audit logging from the start is critical. Retrofitting it onto an existing application with complex data access patterns is one of the most expensive compliance tasks we encounter. We wrote about this challenge in our API design best practices guide, where logging and observability are core architectural concerns.
Integrity Controls
You must implement mechanisms to verify that ePHI has not been improperly altered or destroyed. This means:
- Database integrity checks, including checksums or hashes for critical records.
- Backup verification, to confirm that backups are complete and restorable.
- Version history for clinical records, so that modifications are tracked and previous states are recoverable.
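A per-record checksum is straightforward to compute over a canonical serialization; store it alongside the record and recompute it on read or on a verification sweep. A sketch in Python:

```python
import hashlib
import json

def record_checksum(record: dict) -> str:
    """SHA-256 over a canonical JSON form of the record.

    sort_keys and fixed separators make the serialization deterministic,
    so the digest changes if and only if the record's content changes.
    """
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()
```

The same routine doubles as a backup verification primitive: recompute checksums on a restored copy and compare against the stored values.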
Transmission Security
Beyond TLS, you need to verify the identity of the systems exchanging ePHI. That means mutual TLS (mTLS) between services, API authentication for all endpoints, and certificate pinning for mobile applications accessing PHI.
Business Associate Agreements
If you are building a product that handles PHI on behalf of a healthcare provider, you are a Business Associate under HIPAA. You must sign a Business Associate Agreement (BAA) with every covered entity you work with and with every subcontractor (cloud provider, email service, etc.) that has access to PHI.
Critical: not every cloud service offers a BAA. AWS, Google Cloud, and Microsoft Azure all offer BAAs, but only for specific services within their platforms. You must verify that every service you use is covered under the BAA. Using an AWS service not covered by their BAA means that data stored there is not HIPAA compliant regardless of your encryption and access controls.
This is where many startups make expensive mistakes. They build on a platform, assume the BAA covers everything, and later discover that a specific database service or storage tier is excluded. Verify before you build.
Infrastructure Requirements
Dedicated or isolated environments. PHI should not be stored on shared infrastructure where other tenants could theoretically access it. Use dedicated database instances, isolated VPCs, and private subnets. Our cloud and DevOps team designs infrastructure specifically for compliance constrained workloads.
Backup and disaster recovery. HIPAA requires a contingency plan including data backup, disaster recovery, and emergency mode operations. Your recovery time objective (RTO) and recovery point objective (RPO) must be documented and tested. Annual disaster recovery testing is a practical minimum.
Physical safeguards. If you use cloud infrastructure from a major provider with a BAA, they handle physical security. If you have any on premises components, you need physical access controls, workstation security, and device and media controls.
Mobile Application Considerations
If your health tech product includes a mobile app, additional considerations apply:
Local data storage. Minimize PHI stored on the device. If you must store it locally, use the platform's secure storage (iOS Keychain, Android Keystore) and encrypt the data. Our approach to building secure mobile applications covers these patterns in detail.
Remote wipe capability. If a device is lost or stolen, you need the ability to remotely clear PHI from the application.
Screenshot prevention. Prevent the OS from capturing screenshots of screens displaying PHI. Both iOS and Android provide mechanisms for this.
Biometric authentication. Leverage device biometrics (Face ID, fingerprint) as a factor for accessing PHI within the app.
The Cost and Timeline
Building a HIPAA compliant health tech application from scratch typically adds 30 to 50 percent to development costs compared to a non compliant equivalent. The major cost drivers are audit logging infrastructure, encryption key management, access control systems, and the testing and documentation required to validate compliance.
For a typical health tech MVP, expect:
- Engineering: $80,000 to $250,000 depending on complexity
- Legal and compliance consulting: $15,000 to $40,000
- Infrastructure: 20 to 40 percent premium over non compliant hosting due to dedicated instances and enhanced monitoring
- Timeline: 4 to 8 months for an MVP with proper compliance controls
Retrofitting HIPAA compliance onto an existing application costs 2 to 5 times more than building it in from the start. The audit logging alone can require restructuring your entire data access layer.
Common Failure Points
Logging PHI in error messages. Your application throws an exception that includes a patient name or medical record number, and it ends up in an unencrypted log aggregator without a BAA. This is a breach.
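One defensive layer is a logging filter that redacts PHI-shaped values before log lines ever leave the process. A Python sketch using the standard logging module (the patterns are illustrative; real MRN and identifier formats vary by system, so treat these regexes as assumptions to adapt):

```python
import logging
import re

# Illustrative patterns; tune to your actual identifier formats.
PHI_PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[REDACTED-SSN]"),
    (re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE), "[REDACTED-MRN]"),
]

class PHIRedactingFilter(logging.Filter):
    """Rewrites each record's rendered message before handlers see it."""
    def filter(self, record: logging.LogRecord) -> bool:
        msg = record.getMessage()
        for pattern, replacement in PHI_PATTERNS:
            msg = pattern.sub(replacement, msg)
        record.msg = msg
        record.args = ()  # already interpolated into msg
        return True       # keep the (now redacted) record
```

Attach the filter to every handler, including the one feeding your log aggregator. Redaction is a safety net, not a license to log PHI: the primary control is still keeping identifiers out of exception messages in the first place.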
Developer access to production data. Developers should not have access to production ePHI. Use synthetic or de identified data for development and testing. If production access is needed for debugging, it must be logged, time limited, and justified.
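When producing de-identified development datasets, a keyed hash gives stable pseudonyms, so referential integrity across tables survives without exposing real identifiers. A sketch (the assumption here is that the key lives only in the de-identification pipeline, never in the development environment):

```python
import hashlib
import hmac

def pseudonymize(identifier: str, key: bytes) -> str:
    """Map a real identifier to a stable, irreversible pseudonym.

    HMAC-SHA256 keyed with a secret: without the key, the mapping can
    be neither reversed nor recomputed by dictionary attack. The same
    input always yields the same pseudonym, so joins still work.
    """
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()[:16]
```

Note that pseudonymized data is not automatically de-identified under HIPAA; free-text fields, dates, and rare values can still re-identify patients, so pair this with expert determination or safe-harbor review.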
Unencrypted backups. Your database is encrypted, but your backup process exports to an unencrypted S3 bucket. Verify encryption at every stage of your data lifecycle.
Third party analytics. Sending ePHI to an analytics platform without a BAA is itself an impermissible disclosure. This includes sending user identifiers that could be linked back to health information.
If you are building a health tech application and need the infrastructure, architecture, and engineering to meet HIPAA requirements from day one, talk to our team. Getting compliance right at the start saves you from a costly and disruptive retrofit later.