The Algorithm Is Watching: Are You Still You If Your Choices Aren’t Yours?
- Urvee Nikam
- Aug 4
- 4 min read
There’s a strange kind of déjà vu that comes with scrolling.
You open your phone and suddenly you’re being served ads for shoes you only thought about. A video shows up that describes your exact mood. The music? Your vibe. The comment section? Your brain, basically.
It’s not really magic, just the algorithm. And it knows you.
But here’s the part that feels like a quiet horror story:
If your feed is shaping what you see, feel, and want—are you still the one choosing?

The Personalised Prison
Algorithms don’t just reflect your preferences—they shape them. They predict what will keep you engaged and feed you more of it. This is called algorithmic nudging: subtle pushes toward specific content, behaviors, or even beliefs.
On TikTok or Instagram, it might be a style, a celebrity, a joke format. On Spotify, it's a genre you didn’t know you liked—until it rewired your taste. On Netflix, it's dystopian documentaries right after your late-night spiral.
Every interaction—clicks, likes, comments—feeds back into a system that learns you, curates you, and then gently steers you.
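To make that loop concrete, here is a toy sketch in Python—not any platform’s real system. The topics, click probabilities, and weight-update rule are all invented for illustration; the point is only that when every click is fed back as a stronger recommendation weight, a feed that starts out balanced converges on one topic.

```python
import random
from collections import Counter

random.seed(42)

TOPICS = ["music", "sports", "politics", "cooking", "travel"]

# The system starts knowing nothing: every topic gets equal weight.
weights = {t: 1.0 for t in TOPICS}

def recommend(weights, k=10):
    """Sample k feed items, favoring topics with higher engagement weights."""
    topics, w = zip(*weights.items())
    return random.choices(topics, weights=w, k=k)

def simulate_user(feed, preferred="music", p_click=0.8):
    """A user who usually clicks one favorite topic and rarely clicks the rest."""
    return [t for t in feed
            if (t == preferred and random.random() < p_click)
            or (t != preferred and random.random() < 0.1)]

for _ in range(20):
    feed = recommend(weights)
    clicks = simulate_user(feed)
    for t in clicks:          # every click feeds back into the model...
        weights[t] += 1.0     # ...reinforcing what was already shown

# After 20 rounds, sampling a fresh feed shows it skewing heavily
# toward the one topic the user engaged with.
print(Counter(recommend(weights, k=100)))
```

Nothing here is malicious: each step just “optimizes for engagement.” The narrowing is an emergent property of the loop itself.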
Welcome to the Echo Chamber
At first, personalization feels like comfort. Why wouldn’t it? Endless content curated just for you, chosen because the system predicts you’ll like it and want more. But soon it can trap you in what’s called a filter bubble—a digital space where you only see versions of your own opinions, beliefs, and biases reflected back. Our feeds filter out anything unexpected, challenging, or new. The more the algorithm “understands” you, the fewer chances you get to surprise yourself.
This curated reality creates what’s known as an echo chamber, where your beliefs, preferences, and behaviors are continuously reinforced by similar content. It’s not really that the algorithm is trying to mislead you—it’s just doing its job: optimizing for your engagement. But in doing so, it limits your exposure to diverse viewpoints, reducing your digital world into a narrow reflection of what you’ve already shown interest in. The changes are gradual, nudging you toward content that feels right—even when it slowly rewrites what “right” means to you.
Are We Choosing, or Just Reacting?
Psychologists say we like to believe we have free will, but behavioral data shows we’re more easily influenced than we’d like to admit.
This brings us to a philosophical crossroads: if algorithms are subtly guiding your decisions, influencing your taste, and feeding your thoughts—how much agency do you really have?
Social platforms hire behavioral scientists to design features that keep you hooked. They study cognitive biases, emotional triggers, and habit loops. These systems aren’t neutral—they’re optimized to keep you scrolling, liking, and consuming. The tension here lies not in the existence of algorithms, but in the lack of transparency. Without knowing how these systems work, we may find ourselves shaped by forces we don’t even see.
The Fight for Agency
That being said, the solution isn’t to reject technology entirely—it’s to engage with it just a little more consciously. Understanding that the algorithm is not neutral empowers us to push back. Follow accounts outside your comfort zone. Seek out different perspectives. Question why something showed up in your feed. Small actions like these disrupt the loop and give you back a sense of control.
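Extending the earlier toy simulation (again, an invented sketch rather than any real platform’s code), those small deliberate detours have a direct algorithmic analogue: injecting a fraction of out-of-profile picks—what recommender research calls exploration—measurably re-widens a narrowed feed.

```python
import random
from collections import Counter

random.seed(7)

TOPICS = ["music", "sports", "politics", "cooking", "travel"]

# A profile the feedback loop has already narrowed around one topic.
weights = {"music": 50.0, "sports": 1.0, "politics": 1.0,
           "cooking": 1.0, "travel": 1.0}

def recommend(weights, k=100, epsilon=0.0):
    """With probability epsilon, ignore the profile and pick a topic uniformly."""
    items = []
    topics, w = zip(*weights.items())
    for _ in range(k):
        if random.random() < epsilon:
            items.append(random.choice(topics))            # deliberate detour
        else:
            items.append(random.choices(topics, weights=w)[0])
    return items

narrow = Counter(recommend(weights, epsilon=0.0))   # pure profile-following
mixed = Counter(recommend(weights, epsilon=0.3))    # 30% out-of-profile picks
print(narrow)
print(mixed)
```

Following an account outside your comfort zone is, in effect, setting your own epsilon above zero: you feed the system signals it would never have predicted.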
Algorithms are powerful tools. They can help us discover art, ideas, communities, and joy. But left unchecked, they can also subtly steer us into a digital version of ourselves that’s less about who we are—and more about what we’ve been taught to want.
So… Who’s Really in Control?
We live in a world where algorithms shape more than just our feeds—they shape our routines, our tastes, and even how we see ourselves. But they don’t get the final say. Our choices, even the small ones—what we click, what we scroll past, what we question—still matter.
It helps to remember that we’re not fixed data points. We’re constantly learning, shifting, and growing in ways no system can fully predict. And that unpredictability? It’s not a flaw—it’s what makes us human.
Understanding how algorithms work isn’t just about tech literacy. It’s about staying present in our own lives. When we recognize the patterns, we get to decide when to follow them—and when to break them.
Because at the end of the day, the algorithm might know what we liked yesterday. But only we get to decide who we want to be tomorrow.
Reference List
Harris, T. (2020). Tristan Harris. [online] Tristanharris.com. Available at: https://www.tristanharris.com/#podcast
Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. [online] New York: Penguin Press. Available at: https://hci.stanford.edu/courses/cs047n/readings/The_Filter_Bubble.pdf.
Vogels, E.A., Gelles-Watnick, R. and Massarat, N. (2022). Teens, social media and technology 2022. [online] Pew Research Center. Available at: https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/.
Wikipedia Contributors (2019). Filter bubble. [online] Wikipedia. Available at: https://en.wikipedia.org/wiki/Filter_bubble.
Tufekci, Z. (2015). Algorithmic Harms Beyond Facebook and Google: Emergent Challenges of Computational Agency. [online] Colorado Law Scholarly Commons. Available at: https://scholar.law.colorado.edu/ctlj/vol13/iss2/4/.
Zuboff, S. (2019). The Age of Surveillance Capitalism: the Fight for a Human Future at the New Frontier of Power. New York: Public Affairs.