SimonMed Imaging Breach: What Developers Building Health Apps Need to Do Now
SimonMed's breach exposed patient records including SSNs and insurance data. If you're building anything that touches health data, here's what you need to get right before you ship.
Yumi Hirasako
Security Researcher
SimonMed Imaging — one of the largest outpatient imaging networks in the US — disclosed a breach that exposed patient records including names, Social Security numbers, dates of birth, insurance information, and medical records. Real healthcare data breach territory. The kind of data that has a half-life measured in decades, not months.
If you're building a healthcare app, a health-adjacent SaaS, or really anything that stores a field like insurance_id or appointment_date — this breach is directly relevant to your codebase.
What happened at SimonMed
Attackers accessed SimonMed's systems between late 2024 and early 2025. The exposed data included Protected Health Information (PHI) — the regulatory category that makes healthcare breaches categorically more expensive than a typical data leak.
SimonMed reported the breach to the Department of Health and Human Services (HHS). Under HIPAA, that notification is mandatory whenever 500 or more individuals are affected. The affected patient count ran into the hundreds of thousands.
The breach details aren't fully public, but the exposure type is telling: SSNs, dates of birth, insurance data, and medical records all appeared in the incident report. That combination enables identity fraud, insurance fraud, and highly targeted phishing campaigns — all at once.
Why health data is different from every other breach
Medical records don't rotate like passwords. You can generate a new password. You can't generate a new date of birth.
Your diagnosis history doesn't expire. Your SSN doesn't change. Your insurance ID is tied to your identity for years. A breach of health data doesn't have a one-year shelf life — it has a lifetime shelf life. That permanence is exactly why healthcare remains one of the most targeted sectors year after year.
According to IBM's Cost of a Data Breach Report, healthcare breaches cost an average of $9.77 million per incident — more than double the cross-industry average. That's not just fines. It's breach notifications, credit monitoring services, legal defense, and HIPAA enforcement actions from HHS's Office for Civil Rights (OCR).
The developer angle: you may be handling PHI without realizing it
You don't need to build an electronic health record (EHR) system to be in HIPAA's scope.
HIPAA applies to any app that stores, processes, or transmits Protected Health Information — and the definition is broader than most developers expect.
You're likely handling PHI if your app stores any of these:
- Appointment dates linked to a patient name
- Insurance member IDs or group numbers
- Diagnosis codes or treatment history
- Any field that could be used to infer a health condition (e.g., a pharmacy order, a specialist referral)
- Anything that links a user to a specific healthcare provider
The rule of thumb: if a combination of fields in your database could answer the question "what medical thing happened to this specific person," you're probably in PHI territory.
What PHI looks like in a database schema
Here's a simplified table you might build for a healthcare-adjacent scheduling app:
-- A table that looks harmless but is full of PHI
CREATE TABLE patient_appointments (
id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
patient_id uuid NOT NULL REFERENCES profiles(id),
provider_id uuid NOT NULL,
scheduled_at timestamptz NOT NULL,
appointment_type text NOT NULL, -- "cardiology consult", "MRI", "oncology follow-up"
insurance_member_id text,
notes text,
created_at timestamptz DEFAULT now()
);
The appointment_type column alone can reveal diagnoses. The insurance_member_id is PHI by itself. Combined with patient_id and scheduled_at, this table is a HIPAA-regulated dataset.
Here's what HIPAA requires for a table like this — and what it looks like in Supabase:
-- Step 1: Enable RLS. No exceptions.
ALTER TABLE patient_appointments ENABLE ROW LEVEL SECURITY;
-- Step 2: Only the patient can read their own appointments
CREATE POLICY "patients_read_own_appointments"
ON patient_appointments
FOR SELECT
USING (patient_id = auth.uid());
-- Step 3: Only authorized providers can insert
CREATE POLICY "providers_insert_appointments"
ON patient_appointments
FOR INSERT
WITH CHECK (
EXISTS (
SELECT 1 FROM provider_profiles
WHERE user_id = auth.uid()
AND is_verified = true
)
);
-- Step 4: Audit log — who accessed what and when
CREATE TABLE phi_access_log (
id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
accessor_id uuid NOT NULL,
table_name text NOT NULL,
record_id uuid NOT NULL,
action text NOT NULL, -- 'SELECT', 'UPDATE', 'DELETE'
accessed_at timestamptz DEFAULT now()
);
HIPAA doesn't mandate a specific database technology or SQL dialect. What it mandates is the outcome: access controls, encryption, and audit trails. The SQL above achieves those outcomes in PostgreSQL/Supabase.
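One easy slip is enabling RLS on the tables you thought of and forgetting the one added in last week's migration. A quick spot-check, using the standard pg_tables catalog view, lists any public-schema tables that still have RLS disabled:

```sql
-- List tables in the public schema that do NOT have RLS enabled.
-- Any PHI table appearing in this result is an access-control gap.
SELECT tablename
FROM pg_tables
WHERE schemaname = 'public'
  AND rowsecurity = false;
```

Running this in CI or the Supabase SQL editor before each release catches tables created after your initial RLS pass.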
The minimum viable HIPAA checklist for developers
This isn't a legal compliance guide — get an attorney for that. This is the technical floor you need to clear before shipping anything that touches PHI.
Encryption at rest
Supabase encrypts data at rest by default using AES-256. AWS RDS does too. If you're running self-hosted PostgreSQL, you need to configure this explicitly — it's not on by default.
Know whether your hosting provider encrypts at rest. Don't assume.
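For especially sensitive columns, you can layer column-level encryption on top of disk encryption with the pgcrypto extension. This is a defense-in-depth sketch, not a HIPAA mandate — the key string below is a loud placeholder you'd load from a secrets manager, never hardcode:

```sql
-- Sketch: column-level encryption for insurance_member_id via pgcrypto.
CREATE EXTENSION IF NOT EXISTS pgcrypto;

-- pgp_sym_encrypt returns bytea, so store ciphertext in a bytea column.
ALTER TABLE patient_appointments
ADD COLUMN insurance_member_id_enc bytea;

UPDATE patient_appointments
SET insurance_member_id_enc =
  pgp_sym_encrypt(insurance_member_id, 'PLACEHOLDER-load-from-secrets-manager');

-- Decrypt on read, only inside a trusted service or SECURITY DEFINER function:
--   pgp_sym_decrypt(insurance_member_id_enc, <key>)
```

Key management is the hard part here; if the key lives next to the data, this buys you nothing.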
Encryption in transit
All traffic must use TLS. No HTTP. This means:
- Your API calls to the database use SSL connections (check your connection string for sslmode=require)
- Your app is served over HTTPS with a valid certificate
- Any internal service-to-service calls (worker to API, API to third-party) use TLS
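You can also confirm from inside the database that your current connection is actually encrypted — pg_stat_ssl is a standard PostgreSQL system view:

```sql
-- Returns ssl = true plus the negotiated TLS version and cipher
-- for the connection you're running this query on.
SELECT ssl, version, cipher
FROM pg_stat_ssl
WHERE pid = pg_backend_pid();
```

If ssl comes back false, your connection string is the first place to look.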
# BAD: No SSL mode specified — connection may fall back to plaintext
postgresql://user:pass@host:5432/dbname
# GOOD: SSL required explicitly
postgresql://user:pass@host:5432/dbname?sslmode=require
Access controls
Row Level Security (RLS) on every table that holds PHI. Least-privilege service accounts — your app's database user should only have the permissions it actually needs.
If your app uses a single service_role key for everything and bypasses RLS, that's a HIPAA problem waiting to happen.
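What least privilege looks like in SQL, as a sketch (the role name and password are illustrative — use a managed secret in practice):

```sql
-- A read-only role for the app's public-facing query path.
CREATE ROLE app_reader LOGIN PASSWORD 'PLACEHOLDER-use-a-managed-secret';
GRANT SELECT ON patient_appointments TO app_reader;

-- Deliberately no INSERT/UPDATE/DELETE grants, and no grant at all on
-- phi_access_log: if this credential leaks, the blast radius is reads
-- on one table — and those reads are still filtered by RLS, since a
-- plain role doesn't have BYPASSRLS.
```

The same idea applies to Supabase keys: the anon key plus RLS for user-facing reads, the service_role key only in server-side code that genuinely needs it.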
Audit logs
HIPAA requires you to track who accessed what and when. A phi_access_log table like the one above is the starting point. You can capture writes with a row-level trigger — note that PostgreSQL triggers can't fire on SELECT, so reads need pgaudit or application-level logging:
-- Trigger that logs every UPDATE and DELETE on patient_appointments.
-- (Reads can't be captured by triggers; see pgaudit below.)
CREATE OR REPLACE FUNCTION log_phi_access()
RETURNS trigger AS $$
BEGIN
  INSERT INTO phi_access_log (accessor_id, table_name, record_id, action)
  VALUES (auth.uid(), TG_TABLE_NAME, OLD.id, TG_OP);
  RETURN NULL; -- return value is ignored for AFTER triggers
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;

CREATE TRIGGER patient_appointments_audit
AFTER UPDATE OR DELETE ON patient_appointments
FOR EACH ROW EXECUTE FUNCTION log_phi_access();
For production, look at the pgaudit extension — it's available on Supabase and gives you statement-level audit logging without custom triggers.
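Enabling pgaudit is only a few statements; a sketch, assuming your provider lets you run these ("mydb" is a placeholder for your database name, and the log classes come from the pgaudit documentation):

```sql
-- Sketch: enable pgaudit, then log read and write statements database-wide.
CREATE EXTENSION IF NOT EXISTS pgaudit;
ALTER DATABASE mydb SET pgaudit.log = 'read, write';
```

On managed platforms, check the provider's docs first — some expose pgaudit settings through their dashboard rather than raw ALTER DATABASE.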
Breach notification
HIPAA requires notifying HHS and affected individuals within 60 days of discovering a breach. If you're storing PHI, you need:
- A process to detect unauthorized access (logs, alerts, anomaly detection)
- A legal contact or attorney who understands HIPAA notifications
- Documented incident response procedures
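With a phi_access_log table in place, even a simple scheduled query gives you a detection signal. A sketch — the threshold is illustrative and should be tuned to your normal access patterns:

```sql
-- Flag accessors who read an unusually large number of PHI records
-- in the last 24 hours — a crude but useful exfiltration signal.
SELECT accessor_id, count(*) AS reads_today
FROM phi_access_log
WHERE action = 'SELECT'
  AND accessed_at > now() - interval '24 hours'
GROUP BY accessor_id
HAVING count(*) > 500;
```

Run it on a schedule (pg_cron, a worker, whatever you have) and alert on any rows returned.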
The notification requirement isn't optional. SimonMed's HHS filing is public record precisely because of this rule.
The BAA gap most developers never think about
Here's the part that catches developers by surprise: Business Associate Agreements.
A BAA is a contract. When you use a third-party service that stores or processes PHI on your behalf, HIPAA requires that you have a signed BAA with that vendor. Without it, you're in violation — even if the vendor's platform is technically secure.
Common services developers use that require a BAA if you're storing PHI:
| Service | BAA Available? |
|---|---|
| Supabase | Yes — Team and Enterprise plans |
| Vercel | Yes — Enterprise plan |
| AWS | Yes — self-service via AWS Artifact |
| Resend | Check directly — varies by plan |
| Twilio | Yes — available on request |
| Google Cloud | Yes — included |
The typical developer flow: spin up a Supabase project, deploy to Vercel, add Resend for transactional email, and ship. Never once ask about BAAs. If any of those services is storing PHI — even just an appointment confirmation email with a patient name — you need signed BAAs with each one.
OWASP's Healthcare Application Security Cheat Sheet covers the BAA requirement alongside the technical controls in more detail.
What the SimonMed breach means for your next PR
SimonMed is a large organization with dedicated IT and compliance teams. They still got breached. Healthcare data is worth enough on the dark web that attackers specifically target it.
If you're building anything that touches health data — even a small scheduling feature, even a form that collects insurance information — the security bar is higher than for a typical SaaS. The data doesn't expire. The liability doesn't either.
The good news: the technical controls aren't exotic. RLS, TLS, audit logs, and encrypted storage are table stakes for any serious app. The difference in a healthcare context is that these aren't optional nice-to-haves. They're the floor.
Data Hogo can scan your repo for hardcoded PHI patterns, missing RLS policies on tables that look like they store health data, unencrypted database connection strings, and other issues that show up consistently in healthcare-adjacent codebases. If you're shipping something in this space and want to know where you stand before your users do, that's what the scanner is for.
TL;DR
- SimonMed Imaging disclosed a breach exposing names, SSNs, dates of birth, insurance data, and medical records affecting hundreds of thousands of patients
- PHI doesn't expire — a healthcare breach creates identity fraud risk for years, not months, which is why healthcare breaches average $9.77M in total cost
- You may be handling PHI without knowing it — appointment types, insurance IDs, and fields that link users to healthcare providers all qualify
- The technical floor: RLS on every PHI table, TLS everywhere (check your connection strings), encryption at rest (know your host's default), and audit logs
- The BAA gap is real — using Supabase, Vercel, or Resend to store PHI without signed Business Associate Agreements puts you in violation before you even write bad code
- HIPAA requires breach notification within 60 days — you need a process to detect unauthorized access, not just prevent it
- Supabase offers BAAs on Team and Enterprise plans — this isn't a reason to avoid Supabase, it's a reason to pick the right plan before you store patient data
FAQ
Does HIPAA apply to my app if I'm not a hospital?
If your app stores, processes, or transmits Protected Health Information (PHI) and you're a covered entity or business associate under HIPAA, the rules apply regardless of company size. If your app touches appointment data, insurance IDs, diagnoses, or similar fields, consult a HIPAA attorney.
What is PHI?
Protected Health Information is any individually identifiable health information — names, addresses, dates, SSNs, medical record numbers, health plan IDs, diagnoses, treatment history — when linked to a specific person.
Does Supabase support HIPAA compliance?
Supabase offers BAAs (Business Associate Agreements) on their Team and Enterprise plans. Encryption at rest and in transit are enabled by default. RLS and audit logging are your responsibility to configure correctly.
Related Posts
The Qantas Breach: 5.7 Million Records Lost Through a Third-Party Integration
Qantas didn't get hacked — a connected system did. 5.7M customer records were exposed through a Salesforce-integrated third-party. Here's what developers who use integrations need to audit.
16 Billion Passwords Leaked: What Developers Need to Do Right Now
16 billion credentials just hit the dark web. Most login endpoints have no rate limiting. Here's the exact attack chain targeting your /login route — and the fixes to stop it.
BPFDoor: The Linux Backdoor Behind the SK Telecom Breach (And What Your Server Can't See)
BPFDoor hit 27 million SK Telecom users by hiding inside the Linux kernel. No open ports. No suspicious process names. Traditional antivirus sees nothing. Here's what backend devs need to know.