🩺 Your Health Data Is Worth 50x More Than Your Credit Card
The uncomfortable truth about how your medical data is being traded while you're only worried about financial breaches
Have you ever gotten that call from the bank asking if it was really you who made a suspicious purchase? Or had your card blocked after a series of transactions? Probably yes. And you were thankful for that protection (well, sometimes not… these days I mostly get those calls from scammers, but still…).
But when was the last time your health app asked if it was really you checking your glucose levels? Or when your health insurance double-checked if it was actually you sharing your medical history with someone else?
Yeah, never happened, right? Exactly.
There’s an uncomfortable truth no one is talking about: your health data is far more valuable than your financial data. Not to you. To the market.
And right now, your data is leaking through fitness apps, insurance platforms, wearables, health chatbots, telemedicine tools, genetic testing sites, period trackers, and mental health startups that pretend to care about you.
We call it “sensitive data.” But we treat it like it’s public.
The Illegal Market You Don’t Know About
Here’s the absurd reality: a single piece of medical data can be worth 10 to 50 times more than a credit card number on the black market. A stolen card goes for about $5 to $20. But a full medical record? It can sell for up to $500.
I’ll share all the sources and references at the end of the post, but let’s keep going…
Why is that? It’s simple: your card has a limit and can be canceled in minutes. Your health data is permanent.
You can’t “cancel” your history of depression. You can’t issue a new genetic code. You can’t reset your treatment history.
In 2015, nearly 100 million medical records were compromised globally. In 2023, that number only grew. And in just the first four months of 2025, multiple healthcare breaches had already affected millions of patients. One breach at Community Health Center, Inc. in January impacted over one million people, exposing medical records and Social Security numbers. Another, involving LSC, affected approximately 1.6 million individuals, including patients and staff.
While you’re worrying about your American Express, someone out there might be buying data about:
That time you searched for panic attacks
Your anxiety treatment
Your IVF cycles
Your blood pressure medication
Your family history of genetic diseases
Where Your Data Is Leaking (Right Now)
That “free” meditation app? It’s selling your anxiety patterns.
Your smartwatch that tracks your heart rate? It’s sharing anomalies with third parties.
Your genetic test to find out your roots? It’s feeding commercial databases.
Is it really worth it just to find out you’re 1% Danish and 2% distant cousin of a Viking?
Your period tracker? It’s building behavioral profiles that are worth gold.
Privacy laws try to protect this as “sensitive data,” but in reality, some companies just say “we’ve anonymized everything” and keep going. It shouldn’t work that way.

Let’s be honest: anonymization can be a convenient lie, almost a buzzword that many use without understanding it. Once data is truly anonymized, it’s no longer personal data at all, and it shouldn’t even be part of the regulatory conversation. The problem is that most “anonymized” datasets never clear that bar.

Often, when someone says “we’ve anonymized the data,” all they really mean is that they can no longer identify anyone from their own limited view of the data, so they assume no one else can either.
On this topic, I’m sharing two articles I published:
Re-identification is easier than ever, especially when you combine location, behavior, shopping patterns, and device IDs. A Harvard University study found that 87% of Americans can be identified using just their ZIP code, birth date, and gender.
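To make that concrete, here is a minimal, self-contained sketch of a linkage attack. Every record in it is invented for illustration: a supposedly “anonymized” health dataset that still carries ZIP code, date of birth, and sex, joined against a public dataset (think voter rolls) that carries the same fields plus names.

```python
# A toy linkage attack: re-identifying "anonymized" health records
# by joining them with a public dataset on shared quasi-identifiers.
# All records below are fabricated for illustration.

# "Anonymized" health records: names removed, quasi-identifiers kept.
health_records = [
    {"zip": "02139", "dob": "1987-03-14", "sex": "F", "diagnosis": "anxiety disorder"},
    {"zip": "02139", "dob": "1990-11-02", "sex": "M", "diagnosis": "hypertension"},
]

# A public dataset (e.g. a voter roll) that legitimately includes names.
public_records = [
    {"name": "Jane Doe", "zip": "02139", "dob": "1987-03-14", "sex": "F"},
    {"name": "John Roe", "zip": "02139", "dob": "1990-11-02", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "dob", "sex")

def key(record: dict) -> tuple:
    """Project a record onto its quasi-identifiers."""
    return tuple(record[field] for field in QUASI_IDENTIFIERS)

# Index the public data by quasi-identifiers, then join.
names_by_key = {key(person): person["name"] for person in public_records}

for record in health_records:
    name = names_by_key.get(key(record))
    if name:
        print(f"Re-identified: {name} -> {record['diagnosis']}")
```

No name ever appears in the health dataset, and yet every record comes back with one attached. That is the gap between “we removed the names” and actual anonymization.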
Exposure of confidential conditions (HIV) and discrimination: In Barueri (SP, Brazil), an HIV-positive patient had his medical records leaked on the municipal health portal, allowing coworkers to discover his condition with a simple search. He faced awkward situations and workplace discrimination, eventually being fired shortly after the exposure. The case ended up in court: in 2021, the São Paulo State Court ordered the municipality to pay R$20,000 in moral damages. More broadly, unauthorized disclosure of someone's health status (especially in stigmatized conditions like HIV) can lead to job loss or insurance denial, a pattern also seen in the data breach involving thousands of HIV patient records in Singapore.
Blackmail and psychological trauma: In Finland, the 2020 breach of over 22,000 therapy records from the Vastaamo psychotherapy clinic caused irreparable harm. Hackers stole session transcripts and demanded ransom from victims, threatening to publish personal therapy notes. As a direct result, many patients experienced severe anxiety and emotional distress at the thought of their most personal secrets being exposed. There are reports that some victims died by suicide, overwhelmed by the violation of their medical privacy. The Vastaamo case tragically shows how breaking health confidentiality can have direct, devastating consequences on people’s lives.
Commercial harassment and scams: Victims of health data leaks often report harassment after their information is exposed. Once in the wrong hands, medical data can be exploited by businesses and scammers: people begin receiving unwanted calls offering products or services tied to their conditions, or fake contacts pretending to be healthcare professionals. The breach of confidentiality exposes users to fraud and aggressive marketing. In one case, failures in healthcare systems led to a spike in spam and scam attempts targeting patients whose data had leaked. These incidents show that beyond abstract risks, there are very real financial and emotional impacts for those whose medical information ends up on the black market.
What’s at Stake (And It’s Not Just Your Money)
Imagine your health history gets leaked. Who’s at risk?
You, at your next job interview, when the recruiter “just happens” to find out about your burnout treatment.
Your child, trying to get insurance 15 years from now, when the insurer “coincidentally” knows about the genetic risk you found through that “fun” DNA test.
Your friend, going through cancer treatment, and suddenly being bombarded with “alternative cures” because someone sold her profile to supplement companies.
And finally, all of us, when these leaks become training data for the next “revolutionary” AI product, reinforcing medical biases, especially racial, gender-based, and socioeconomic ones.

(The original video is here: https://www.youtube.com/watch?v=I7z3YfUvZHY. I added subtitles, but feel free to watch it directly on YouTube.)
This is no longer just about identity theft. It’s about damage to reputation, social scoring, behavioral manipulation, and medical discrimination.
In Europe, under the GDPR, health data is part of the “special category.” In Brazil, the LGPD also classifies it as sensitive. In the U.S., HIPAA covers part of the healthcare sector. In Japan, the APPI requires clear consent. In South Korea, biometric data has its own set of rules.
All around the world, the message is the same: health is sensitive data, yet it keeps leaking as if it didn’t matter.
Anthem, Inc. (USA): In October 2018, health insurer Anthem agreed to pay a $16 million fine after the largest medical data breach ever recorded in the U.S. Hackers exposed the personal and health information of nearly 79 million people in a cyberattack that occurred in 2015. It was the largest penalty ever imposed for violations of the HIPAA privacy rule at the time, highlighting the severity of the incident.
Premera Blue Cross (USA): In September 2020, health insurance provider Premera Blue Cross was fined $6.85 million in a settlement with U.S. authorities. The company suffered a breach that compromised data from over 10.4 million individuals, stemming from security failures identified after a 2014 attack. This was the second-largest penalty ever imposed for violating U.S. health data protection regulations.
Barreiro Montijo Hospital (Portugal): The first fine under the EU’s GDPR in Portugal came in October 2018 against a public hospital. Centro Hospitalar Barreiro-Montijo was fined €400,000 by the national data protection authority (CNPD) due to inadequate access controls that allowed unauthorized access to patient records. An audit revealed nearly 1,000 active doctor-level user accounts, even though the hospital had only 296 doctors, enabling unauthorized staff to view sensitive medical data.
Haga Hospital (Netherlands): In June 2019, Haga Hospital in The Hague was fined €460,000 following a high-profile privacy breach. Around 85 staff members accessed the medical file of a celebrity patient without authorization, violating access restrictions. The Dutch Data Protection Authority issued the fine due to internal security lapses that enabled the breach. This became the first major GDPR-related penalty for a hospital in the Netherlands and served as a warning to European healthcare providers about access control.
GoodRx (USA): In February 2023, health app company GoodRx (which offers telemedicine services and medication coupons) was fined $1.5 million by the FTC in a landmark case. The company had shared users’ sensitive health information with marketing platforms like Facebook and Google without proper notice or consent. This was the first FTC enforcement under the Health Breach Notification Rule, punishing a failure to safeguard patient data in health apps.
So, what happens when insurance companies buy these so-called “de-identified” datasets to build risk profiles? When mental health apps share user data with third parties without truly informed consent? When AI models trained on health data end up reproducing and amplifying bias? Or worse: when governments use “health data collection” as a cover for surveillance?
These aren't just technical issues. They’re deep ethical and societal questions.
What We Need to Do Now
We don’t need more empty policies. We need real consequences.
For those building products, we need product teams who truly understand what “sensitive” means, and who respect privacy by design.
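To ground that, here is a toy sketch of two privacy-by-design habits: collect only the fields a feature actually needs, and pseudonymize user identifiers before they ever reach analytics storage. The field names and the event are invented for illustration, not taken from any real product.

```python
import hashlib
import hmac
import os

# Fields this hypothetical feature has been explicitly granted.
ALLOWED_FIELDS = {"age_range", "resting_heart_rate"}

# In production this key would live in a secrets manager, not in code.
PSEUDONYM_KEY = os.urandom(32)

def pseudonymize(user_id: str) -> str:
    """Keyed hash so analytics can count users without knowing who they are."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

def minimize(raw_event: dict) -> dict:
    """Drop every field the pipeline was not explicitly granted."""
    return {k: v for k, v in raw_event.items() if k in ALLOWED_FIELDS}

raw = {
    "user_id": "user-42",
    "age_range": "30-39",
    "resting_heart_rate": 61,
    "gps_location": "42.36,-71.09",   # not needed for this feature: dropped
    "diagnosis_history": "redacted",  # not needed either: dropped
}

event = {"user": pseudonymize(raw["user_id"]), **minimize(raw)}
print(event)  # only the pseudonym and the two allowed fields survive
```

It’s a sketch, not a privacy program, but it captures the mindset: the default is to drop data, and anything sensitive that must be kept gets the least identifying form possible.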
We need regulators with technical teams, not just legal departments.
And above all, we need to stop pretending privacy is about hiding something. It’s about control. About power. About the freedom to exist without being profiled in your most vulnerable moments.
Because once you realize your health data is worth more than your financial data, you start asking different questions. And demanding different protections.
Next time an app asks for access to your health data, ask yourself: are they offering the same level of security as your bank?
If not, why are you trusting them with something far more valuable than your money?
Think about it.
References:
World Privacy Forum: report on the price of health data on the illegal market
Harvard University: re-identification of data
Princeton University: research on the limits of anonymization
CISOMAG: Personal Medical Data Is Worth More Than Financial Data
D Magazine: Why Medical Data Is 50 Times More Valuable Than a Credit Card
Safeguarding Virtual Healthcare: A Novel Attacker-Centric Model for Data Security and Privacy