Your watch buzzes, your phone lights up, and before you finish your first cup of coffee a score has already told you how you slept, how you should train, and how you are “really” doing. It feels like data-driven self-care. But what if the simple act of checking is changing the state you are trying to measure, and quietly teaching you to trust numbers and feeds more than your own nervous system?
In this episode, we walk through the rooms of the health attention economy: wearables that start talking like judges, experts who cannot all be right at once, feeds that learn your fears, health marketing built on expensive persuasion playbooks, search results that bury nuance under ads and AI summaries, and a growing synthetic crowd of AI-generated content that looks like real people telling real stories.
The uncomfortable through-line is that the system only works because we cooperate with it.
What we cover:
How tiny, repeated overrides of your own signals (“the score says rest” or “the score says push”) train you to doubt your body
Why interpretation multiplies faster than experiments when you stack lab tests, experts, and protocols
How “doing research” online turns your fears and uncertainties into fuel for someone else’s business model
Where AI fits in, both as a useful tool and as an engine for synthetic consensus that is hard to spot
Three simple experiments that begin to return the interpretive layer to you:
The One Sentence Gate
Seven Days Without Health Feeds
One Voice, One Change, Four Weeks
Sources & Links
Derived from Mark’s essay: When Your Health Becomes Someone Else’s Business Model at HealthUnderControl.com
More audio and essays at Mark’s Substack: HealthUnderControl.com
Mark’s practice (the HUC philosophy in action): Unblocked.Health