Manipulation machine: How a techie turned teen social media experiments into global behavioral control

A Substack post by writer EKO, titled “Rethink Democracy,” shines a spotlight on something both fascinating and deeply concerning.

It tells the story of how one Silicon Valley entrepreneur spent more than a decade perfecting the art of behavioral manipulation, first on teenagers … and now on the world’s largest communication platforms, such as X and Facebook.

It’s a story that begins innocently in Berkeley. In 2011, 22-year-old student Nikita Bier launched Politify, a website designed to help voters understand how government policies might affect them financially. The project went viral, gaining four million users in a month. That was huge back then. It was lauded as a breakthrough in civic education.

But as EKO points out, Bier’s real discovery wasn’t about better informing voters. It was about how easily they could be influenced and manipulated. His takeaway was that Americans, more than any other people in the Western world, will vote against their own financial self-interest, and that the right emotional cues can be deployed to override voters’ logic.

After Politify, Bier turned his attention to a new kind of experiment: psychological engagement. In 2017, he launched tbh (“to be honest”), a social app marketed as an anti-bullying tool for high schoolers. The idea was to encourage teens to send anonymous compliments through polls: “Who’s the most creative?” or “Who’s the best friend?”

But the data collection behind it was the real mission, and it was extensive. The app tracked users’ contacts, locations, and reaction times and measured not happiness, but dependency. Bier and his team studied which questions caused anxiety, which rewards triggered excitement, and which notifications made kids come back for more.

As EKO notes, it wasn’t about kindness. It was about learning how to engineer emotional addiction and dopamine release.

Facebook bought tbh within months, not for the app itself, which was soon shut down, but for the behavioral data and the methods behind it.

Bier’s next creation, Gas, used nearly the same model: anonymous polls and dopamine-based feedback loops, with an added feature called “God Mode” that let users pay for hints about who had voted for them. The app monetized insecurity.

When rumors spread that Gas was linked to human trafficking, the project went sideways. The claims turned out to be false, but parents’ reactions revealed something important: The app had felt predatory even before the hoax.

As EKO writes, “Parents couldn’t articulate why, but they sensed something wrong.”

Gas was eventually acquired by Discord, which, like Facebook, appeared more interested in the behavioral techniques than the app itself.

Today, Nikita Bier is Head of Product at X, formerly known as Twitter. Elon Musk brought him on board in 2024, hoping to make the platform more “engaging” and keep users from drifting to competitors.

But according to EKO’s report, Bier’s influence has taken X in the opposite direction of Musk’s promises of free speech and transparency. Users have begun noticing sudden drops in post visibility, shadowbans without explanation, and algorithms that amplify selfies and memes while suppressing journalism and external links.

The same methods once used to hook high schoolers are now being applied to adults, who are just as prone to dopamine-driven compulsion.

EKO argues that what Bier built isn’t merely a social-media system – it’s a behavioral control infrastructure that shapes how people think, feel, and react, often without their realizing it.

The implications reach beyond politics. If algorithms determine what we see, what we believe, and even what we fear, then free choice becomes an illusion.

EKO warns: “You think you’re participating. You’re being managed. You think you’re being heard. You’re being suppressed.”

It’s a wake-up call for every generation, but especially for parents and grandparents. The younger generation is growing up in digital environments that are not neutral; they are designed to harvest attention, foster addiction, and manipulate behavior.

This matters because informed consent depends on access to truth. When platforms hide or distort that truth, citizens – voters – become subjects of the algorithm.

What began as an experiment to help voters has now evolved into an ecosystem that can subtly guide decisions, emotions, and even elections.

As EKO concludes, Elon Musk promised a digital town square. But the platform risks becoming “a beautiful prison” where data mining is the goal, engagement replaces understanding, and addiction replaces free choice.

For those of us who care about liberty, faith, and the next generation, that’s not just a tech story. It’s a warning about how we are all being manipulated.
