Who Builds the Roads Decides Where You Go
Infocracy, the architecture of choice, and four lines of digital defence
You reach for your phone in the morning to check the weather. Twenty minutes later you have no idea how you ended up going from 18 degrees Celsius and a 15 km/h northwest wind to elevated blood pressure from the opinions of a politician you never even followed. You didn’t click wrong. The system guided you. Not by force. Not by command. Just by quietly rebuilding the roads you travel.
What infocracy is – and why it’s not what you think
When most people hear the word manipulation, they picture propaganda. Crude, visible, unmistakable. Someone telling us what to think.
Infocracy works differently. More subtly. More effectively.
Its instrument is not the command but the architecture of choice, a concept developed academically by economist Richard Thaler and legal scholar Cass Sunstein in Nudge (2008). Their core thesis: the way options are arranged determines choices just as much as, and often more than, the content of those options themselves. Choice architects don't take the wheel from your hands. They quietly reroute the roads, shift the signs, and move the guardrails.
In the digital environment, this architecture has reached industrial scale.
Recommendation algorithms, search rankings, the mechanics of likes, notification schemes, the autoplay of the next video — these are not neutral tools. They are environmental parameters that someone configured with intent. By changing these parameters, the behavior of millions of people can be altered without any of them feeling any pressure.
Shoshana Zuboff puts it precisely in The Age of Surveillance Capitalism (2019): it’s not about what they know about us. It’s about what they do with it — how our data is turned into an instrument for shaping our future behavior.
Why “disconnecting” is not the answer
When I tell people this, the first reaction is predictable: “So log off. Delete Instagram. Stop reading the news.”
I understand. But this is not a solution.
Our physical environment is now so thoroughly formatted by the digital that disconnecting would mean giving up our place in society. Access to services, ordering, shopping, the political mood of our surroundings, the information shaping the decisions of our colleagues and clients — all of it flows through channels that someone is configuring. Refusing to walk roads because they are poorly marked is not freedom. It’s isolation.
The answer is not escape. It’s conscious movement through an environment we understand.
Four lines of defence
1. Algorithmic well-poisoning. Be unpredictable.
Algorithms need us to be readable. The more precisely they can predict our behavior, the more effectively they can guide us. Readable means manageable.
Judith Donath of MIT Media Lab, who has long studied digital identity and profiling, shows how online systems constantly collect signals: what we read, for how long, what we click on, what we skip – assembling profiles that over time become more accurate than our own self-perception.
The defence: introduce noise into that model. Occasionally search deliberately for a topic completely outside your interests. Read something you would never click on voluntarily. Follow sources with different viewpoints – not to adopt them, but to break the profile the system is building about you. If the algorithm can predict your behavior with 99% accuracy, that remaining 1% is your freedom. Protect it.
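The noise idea can even be scripted. A minimal sketch: the topic list and the search engine here are illustrative assumptions, not a prescription; the point is only that the query is drawn from outside your actual interests, so it pollutes the profile rather than refining it.

```python
import random
from urllib.parse import quote_plus

# Illustrative off-profile topics; substitute things genuinely foreign
# to your own reading habits.
OFF_PROFILE_TOPICS = [
    "history of Ottoman cartography",
    "deep-sea hydrothermal vent ecology",
    "Baroque counterpoint technique",
    "Antarctic resupply logistics",
]

def noise_query(topics=OFF_PROFILE_TOPICS):
    """Return a search URL for one randomly chosen off-profile topic."""
    topic = random.choice(topics)
    # Any engine works here; the unpredictability lives in the query itself.
    return "https://duckduckgo.com/?q=" + quote_plus(topic)

# Usage: print(noise_query()) and open the URL in your browser.
```

Run it once a day and the 99% prediction starts to slip.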
2. Building friction. Refuse convenience.
Infocracy loves frictionless design — design without resistance. The less we have to think at every click, the more smoothly the environment leads us where it needs us to go. Nir Eyal in Hooked (2014) details the mechanisms by which technology companies build habitual behavior. He wrote it as a manual for designers. For us, it reads as a manual for defence.
In practice: introduce deliberate obstacles into your digital habits. Never click “Next video” or “You might also like.” If you want information, go and get it actively. Use search engines that don’t profile you (DuckDuckGo, Brave Search). Turn off like counts where platforms allow it — they are calibrated to trigger conformity through social proof. When you can’t see what’s popular, you have to decide for yourself whether something is good.
And return to RSS. We choose the sources. We set the order. Not the algorithm.
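What "we set the order" means in practice can be shown with a minimal, standard-library-only RSS reader sketch. The feed URL below is a placeholder; titles come back in the order the feed lists them, and your own list, not a recommender, decides which sources appear at all.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder feed list: put your own chosen sources here, in your own order.
FEEDS = [
    "https://example.com/feed.xml",
]

def parse_titles(xml_text, limit=5):
    """Extract up to `limit` item titles from RSS 2.0 XML, in feed order."""
    root = ET.fromstring(xml_text)
    return [item.findtext("title", default="(untitled)")
            for item in root.iter("item")][:limit]

def fetch_titles(url, limit=5):
    """Download one feed and return its latest item titles."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return parse_titles(resp.read(), limit)

# Usage: for url in FEEDS: print(fetch_titles(url))
```

A dedicated RSS reader does this better, of course; the sketch just makes the principle visible: no ranking step anywhere.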
3. De-automating reactions. System 1 as the target.
The mechanics of likes and shares don’t attack our reason. They attack our emotions.
Daniel Kahneman in Thinking, Fast and Slow distinguishes two modes of thinking. System 1 is fast, automatic, emotional — it reacts before we have time to think. System 2 is slow, analytical, effortful. Digital platforms are built on System 1. Anger, outrage, fear, ridicule — these are emotions that drive immediate reaction, sharing, engagement. The algorithm amplifies them deliberately because they generate activity.
The defence is simple, but requires practice: the five-second rule. Before reacting to something, sharing it, or feeling anger at a post, stop and ask: Why did the algorithm show me this right now? What emotion is it trying to provoke? Who does this serve? Recognizing manipulation in real time immediately reduces its effectiveness — not because we become immune, but because we engage System 2 where the system is counting on System 1.
4. Technical hygiene. Diversify your tools.
We cannot allow a single entity to own our entire digital life.
When one browser, one platform, and one ecosystem are used for work, leisure, social interaction, and information consumption, we create a complex, coherent profile that is extraordinarily valuable to infocrats. The solution is not paranoia — it’s hygiene.
Separate your identities: one browser for work, another for leisure, another for social networks. Use extensions to block trackers (uBlock Origin is a good start). The goal is not to become invisible. It’s to stop giving one player a complete map of your life.
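One way to make the separation habitual is to script it. A sketch, assuming Firefox with named profiles already created via its profile manager (`firefox -P` with no argument opens it); the context names are examples, not a standard.

```python
import subprocess

# Example contexts; create a matching profile for each one, once,
# in Firefox's profile manager.
PROFILES = {"work", "leisure", "social"}

def build_command(context):
    """Build the launch command for a dedicated browser instance per context."""
    if context not in PROFILES:
        raise ValueError(f"unknown context: {context}")
    # `-P` selects a named profile; `--new-instance` keeps it separate
    # from any Firefox window that is already running.
    return ["firefox", "-P", context, "--new-instance"]

def launch(context):
    """Open a browser window tied to one identity only."""
    return subprocess.Popen(build_command(context))

# Usage: launch("work")
```

Each profile keeps its own cookies, history, and extensions, which is exactly the point: three partial maps instead of one complete one.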
The analog community as the last line of defence
Infocracy loses its power wherever direct human interaction exists that it cannot measure and parameterize.
Hannah Arendt in The Origins of Totalitarianism showed — through far more brutal examples — that power is maintained primarily by isolating individuals from one another. When people lose direct connection with each other, they become vulnerable to narratives arriving from above or through mediated channels.
Local communities, direct conversations, physical meetings — this is not nostalgia. It’s a rational defensive strategy. Not every conversation needs to happen on a platform that measures, rates, and monetizes it. Sometimes it’s enough to meet and talk.
Conclusion: Being a system error as a form of civic courage
The goal is not to become a digital hermit. That is neither realistic nor necessary.
The goal is to become a system error in the algorithm — someone who cannot be reliably categorized, predicted, and guided. This requires conscious effort: a little less comfort, a little less friction-free flow, five seconds of pause where the system is counting on an immediate reaction.
In an environment where the architecture of choice shapes the behavior of the masses, the ability to move deliberately — rather than merely react — is a form of freedom. And perhaps also a form of civic courage.
Want to go deeper? I recommend starting with these sources:
- Richard Thaler & Cass Sunstein – Nudge (2008)
- Eli Pariser – The Filter Bubble (2011)
- Daniel Kahneman – Thinking, Fast and Slow (2011)
- Shoshana Zuboff – The Age of Surveillance Capitalism (2019)
- Nir Eyal – Hooked (2014) — read it as a warning, not a manual
