Privacy

You're logging your health. Someone else is keeping the records.

Millions of people track their health daily without knowing their data is readable on company servers. Don't confuse trust with a guarantee.

Just you and the app. Or so it feels.
Photo by Tim Durgan / Unsplash
Anna Adamczyk Nyskra, Founder
Published 19.03.2026

Overview

You’re using a mood, health, or lifestyle app. You’re also filling out someone else’s database. Both things are true at the same time. Let’s take a look.

The Benefit: Why we use personal apps

Why do we even use these apps?

It splits into a few flavors:

  1. Awareness: “I want to understand my patterns” (sleep, mood, energy)
  2. Accountability: “I need something to keep me on track” (habits, water, medication)
  3. Prediction: “I want to anticipate what’s coming” (cycle, symptoms, energy dips)
  4. Validation: “I want to see progress” (weight, fitness, streaks)

However, the core reason is the same: people want to know themselves better and improve their quality of life.

And these apps do seem to provide benefit. At least, people show it with the time, money, and data they invest in them.

Tracking what you eat, how you sleep, how much you move - it used to take real effort with pen, paper, or Excel spreadsheets. Now there’s an app for all of it. The appeal is real, but so is the database.

Behind nearly every health app, there is someone else’s database. Wait, what? Why are we suddenly talking about databases? Let me explain.

The Flip Side: Filling out someone else’s database

Most health apps store your data on their servers in a form they can read. And a ‘server’ here is basically just the company’s computer.

Even worse: Many of these apps don’t even offer sync across devices or sharing. If they did, the trade would at least make some sense - your data for functionality. But even then, it wouldn’t be a good deal for you.

So why is your data readable on their servers at all? There’s no good answer - not for you anyway.

Tracking what you eat, when you eat, how you sleep, when you sleep, how much you move, and how you feel doing all of this, how you feel just living - it lands on a computer you have no control over, readable by anyone with access, and sometimes, by people who shouldn’t have access at all.

This means that once you log a piece of information, what happens to it is out of your hands - tomorrow, in a year, or in ten years.

Your mood log, your sleep score, your medication reminder, your cycle data - it’s not locked away in some vault. It sits in a database table, like a row in a spreadsheet. A column for date, a column for what you logged, a column for who you are.
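To make that concrete, here is a hypothetical sketch (the table and column names are invented for illustration, not taken from any real app) of how such a row might look server-side:

```python
import sqlite3

# A hypothetical schema - a sketch of how a typical health app
# might store your entries on its servers, in plain readable form.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE mood_logs (
        user_id    TEXT,   -- who you are
        logged_at  TEXT,   -- when you logged it
        entry      TEXT    -- what you logged, readable as-is
    )
""")
conn.execute(
    "INSERT INTO mood_logs VALUES (?, ?, ?)",
    ("user-4821", "2026-03-19T00:03:00", "mood: low, slept 4h"),
)

# Anyone with database access can read it back verbatim.
row = conn.execute("SELECT * FROM mood_logs").fetchone()
print(row)  # ('user-4821', '2026-03-19T00:03:00', 'mood: low, slept 4h')
```

No vault, no lock - just a query away for anyone with access.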

But isn’t the data encrypted?

Sure, some will say the data is encrypted. And technically, that’s not wrong. But what the companies behind the apps won’t tell you straight is that they usually also have the encryption keys. Which means they can read it whenever they want. The lock exists. They just also have a key.
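Here is a toy sketch of what “encrypted, but they hold the key” means in practice. This is deliberately not real cryptography (a simple XOR stand-in, not what any actual app uses) - the point is only who holds the key:

```python
import os

# Toy illustration (NOT real cryptography): encryption at rest
# where the company also generates and keeps the key.
server_key = os.urandom(32)  # created and stored by the company

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Repeat the key over the data; the same operation
    # encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

entry = b"mood: anxious, slept 4h"
stored = xor_cipher(entry, server_key)  # what sits in the database

# "Encrypted"? Technically yes. But the company holds server_key:
readable = xor_cipher(stored, server_key)
assert readable == entry  # they can read it whenever they want
```

The lock is real. So is their copy of the key.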

You logged it at midnight, half asleep, phone screen dimmed. Just you and the app. It felt like a private thought. But the moment you hit save, it traveled across the internet and landed on a server in a data center you’ve never heard of, in a city you’ve probably never been to. It felt private. It wasn’t.

And yet, you’ll probably still open the app tomorrow. Because the service is useful enough. And you kinda lack alternatives. That’s the real tension.

But what happens if one day, you want to stop tracking whatever health metric you tracked?

You can delete the app in two seconds. However, that only removes it from your phone.

The data you logged, every entry, every date, every symptom, is still sitting on the company’s computers. Deleting the app doesn’t delete the database about you.

“But I can request they delete my data.” Yes. It’s a reassuring idea. Especially when we start to feel how little control we actually have.

So you formally request deletion. Most privacy policies give companies weeks or months to comply. Ok, fair enough, we think.

Company policy. Legislation. Enough to have your data work in your best interest?

But all of this, when we look at it honestly, is just a promise. Every time a company says “we won’t misuse user data”, it is nice and all. But company policies change fast - it has happened too often for us to believe otherwise. In the end, they are just a pinky promise.

What about the next line of defense? Legislation. Yes, legislation like GDPR in Europe and HIPAA in the US set out to prevent misuse of this data.

And here’s the part worth sitting with. The fact that these laws exist in the first place only confirms what we already sense: our data is valuable and vulnerable. Otherwise, why would governments bother? And why would some companies risk the legal headaches of breaking them?

Let’s be real: companies knowingly violate data protection legislation. Why? Probably not because everyone at the company is stupid and it happens by accident, but because it’s profitable. Sometimes they get penalized: fines, legal costs, bad press, some users leaving. But selling or misusing the data for, say, targeted ads still seems to outweigh the costs.

Can legislation really stand against companies that can afford to pay the fine, swallow the bad press, and keep enough users to make it worthwhile?

Also let’s not forget, laws can change, too.

So company policies are a promise, and legislation is a stronger promise. But both can change. Both can be broken. And both can be ignored when the profit is high enough.

And yet, the appeal remains

But let’s not forget, there is also the other side: the whole reason we started logging in the first place was to better our lives. The data is supposed to serve us. That’s the point. Otherwise why bother?

So if neither company policy nor legislation is a guarantee your data works in your best interest - what is? The only thing that comes close to a guarantee is when “we won’t” becomes “we can’t.” And that’s an engineering problem, not a legal one.
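What would “we can’t” look like as engineering? One common approach is client-side (end-to-end) encryption: the key is derived on your device from something only you know, and the server only ever sees ciphertext. A toy sketch of that idea, reusing an illustrative XOR stand-in rather than a real cipher:

```python
import hashlib
import os

# A sketch of the "we can't" idea: the key is derived on-device
# from a passphrase the server never sees.
# (Toy XOR cipher - a real app would use a vetted AEAD scheme.)

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Standard-library key derivation (PBKDF2-HMAC-SHA256).
    return hashlib.pbkdf2_hmac(
        "sha256", passphrase.encode(), salt, 200_000
    )

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

salt = os.urandom(16)                         # public; can live on the server
key = derive_key("only-you-know-this", salt)  # computed on your device only

ciphertext = xor_cipher(b"mood: low", key)    # all the server ever stores

# Without the passphrase, the server holds salt + ciphertext and
# nothing else. "We won't read it" has become "we can't read it."
assert xor_cipher(ciphertext, key) == b"mood: low"
```

The design choice is the key’s location, not the cipher: as long as the key never leaves your device, no policy change or subpoena of the server can turn your entries back into readable text.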

Overwhelming? Yeah. But not knowing won’t change the reality. Knowing changes what you’ll tolerate. And intolerance, at scale, is what moves markets. And after all, knowledge is power - and so is being aware of what happens to your personal information.

All I am saying is this: When choosing your technology, don’t confuse trust with a guarantee. And be aware of the difference between ‘we won’t’ and ‘we can’t.’ Even if you haven’t found an alternative yet that offers what you need.

Awareness is where change begins.

That’s why you log into your health app after all, no?

Coming Soon

We turn “we won’t” into “we can’t.”

Follow us for early access and the reveal.