
My kids have been deep into a 90s-movie phase lately — which is surreal because these were my high-school movies and now apparently they’re “vintage.”
So we’ve been watching The Truman Show, Dark City, and The Matrix together. The themes in these movies always felt familiar — identity, perception, reality, control. None of that is new.
But watching them again today, after decades of living inside social media algorithms, and with a head full of Jung from a college philosophy class I once took (plus a recent binge of Jungian podcasts), those same themes land differently now. Not as some big revelation, more like:
“Oh right, these stories were always about reality being shaped for us.
We just finally have the language — algorithms, feeds, filter bubbles — to see how literal that is.”
The movies didn’t change.
The backdrop did.
The Truman Show: We Always Knew the World Was Curated

The premise of The Truman Show is obvious: Truman lives in a world built for him, curated to keep him predictable, edited so he never questions the setup too deeply. We’ve always understood that as a critique of TV and media culture.
But now it reads almost like a prototype for algorithmic reality.
Jung’s concept of the Persona — “the mask designed on the one hand to make a definite impression upon others, and on the other to conceal the true nature of the individual” — is basically Truman’s whole life. Seahaven is the mask.
And just like our lives online, Truman’s world runs on:
- rewarded behaviors
- suppressed deviations
- invisible curation
- “reality” delivered as a storyline
Christof isn’t that different from an engagement-optimized feed.
He just has better control over the weather.
Modern research backs this up:
- Annette Markham’s The Algorithmic Self describes how social platforms “co-construct identity and relational meaning” through their interfaces and algorithms.
- Work on social-media self-presentation shows we gradually perform the versions of ourselves that get algorithmically rewarded, and let the rest atrophy.
But it goes deeper than self-presentation. The algorithms don’t just reflect what we want — they actively shape what we want through operant conditioning at scale:
- Variable reward schedules: Pull-to-refresh mimics slot machine mechanics. You never know when the next dopamine hit is coming, so you keep checking. Research from Stanford’s Persuasive Technology Lab shows this is designed to create compulsive checking behavior. (A toy sketch of this schedule follows the list.)
- Social validation loops: Likes, hearts, and engagement metrics trigger the same reward pathways as gambling. Studies by psychologist Adam Alter show that these feedback loops can be just as addictive as substance use.
- Personalized content optimization: Each interaction teaches the algorithm more about what keeps you specifically engaged. It’s not curating for what you need — it’s optimizing for what keeps you scrolling.
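To make the variable-ratio mechanic concrete, here’s a minimal Python sketch. The reward probability and function names are invented for illustration; no platform’s actual numbers are being claimed:

```python
import random

# Toy model of a variable-ratio reward schedule, the slot-machine
# mechanic described above. All numbers here are invented.

REWARD_PROBABILITY = 0.3  # roughly 1 pull in 3 pays off, but never predictably

def refresh_feed() -> bool:
    """One pull-to-refresh. True means the feed served something
    novel or validating (the 'dopamine hit')."""
    return random.random() < REWARD_PROBABILITY

def session(max_pulls: int = 50) -> int:
    """Simulate a user who keeps refreshing until rewarded."""
    for pull in range(1, max_pulls + 1):
        if refresh_feed():
            return pull  # rewarded on this pull; the checking habit is reinforced
    return max_pulls

pulls = [session() for _ in range(10_000)]
print(f"average refreshes before a payoff: {sum(pulls) / len(pulls):.2f}")
```

Because the payoff is unpredictable rather than scheduled, the behavior is far more resistant to extinction than a fixed reward would be. That’s the slot-machine insight, applied to your thumb.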
The movie always knew reality could be curated.
We just didn’t have Instagram, TikTok, and a For You page yet.
Dark City: Reality Rewritten, Same as It Ever Was

Dark City always felt like a fever dream: a city that rearranges itself at midnight, strangers in long coats, people waking up with new memories and identities someone else has assigned to them.
The themes were always there:
- memory isn’t fixed
- environment shapes identity
- narratives can be swapped in and out
- you might be living inside someone else’s experiment
From a Jungian angle, it’s thick with:
- The Shadow — forces shaping you from the dark, beyond your awareness
- The Collective Unconscious — the Strangers pulling archetypal fragments from a shared human pool
- Identity as fluid — built from interchangeable stories
One Jungian reading literally describes Dark City as “a myth about the construction of identity through unseen forces” — which is exactly how modern recommendation systems feel when you zoom out.
Today, we don’t have Strangers tuning our buildings, but:
- our informational “city” gets rewritten constantly
- what’s “normal” shifts based on trending topics
- our memories resurface via “On This Day” and platform reminders
- identity gets shaped by what content we’re shown and what gets attention
Every feed refresh is a tiny retuning of reality.
And these adjustments are happening at unprecedented scale and speed. Consider:
- Predictive targeting: Algorithms don’t just respond to your behavior — they anticipate it. Research later leveraged by Cambridge Analytica, Kosinski and colleagues’ work on Facebook likes (the 2013 study and its 2015 follow-up), showed that with just 70 likes a model could predict your personality traits better than your friends could. With 300 likes, better than your spouse. (A minimal sketch of this kind of model follows this list.)
- Micro-targeting emotions: A 2014 Facebook study on massive-scale emotional contagion showed they could manipulate users’ emotional states by selectively showing them positive or negative content. Nearly 700,000 users had their feeds altered without consent. The effect was measurable and real.
- Behavior modification at scale: B.J. Fogg’s work on “captology” (computers as persuasive technology), especially his book Persuasive Technology: Using Computers to Change What We Think and Do, shows that digital systems can systematically change attitudes and behaviors through careful design of triggers, abilities, and motivations.
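To show how little machinery this kind of prediction needs, here’s a minimal sketch in the spirit of that research line. The like matrix and the trait are randomly fabricated, so only the shape of the pipeline is meaningful:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical data: 500 users x 70 binary "did they like this page?" features.
rng = np.random.default_rng(42)
likes = rng.integers(0, 2, size=(500, 70))

# Fabricated ground truth: a trait loosely driven by the first five likes.
trait = (likes[:, :5].sum(axis=1) + rng.normal(0, 1, size=500) > 2.5).astype(int)

# Train on 400 users, test on the held-out 100.
model = LogisticRegression(max_iter=1000).fit(likes[:400], trait[:400])
print("held-out accuracy:", model.score(likes[400:], trait[400:]))
```

A plain logistic regression over a like matrix. Nothing exotic, which is sort of the point: the predictive power comes from the data, not from sophisticated modeling.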
The Strangers in Dark City had to manually inject memories one person at a time.
Modern algorithms do it automatically, to billions, in real-time, learning and adapting as they go.
These movies always understood that memory and identity were malleable.
Algorithms just made the mechanism explicit — and continuous.
The Matrix: The Simulation Metaphor That Only Got More Accurate

When The Matrix dropped, the core metaphor landed instantly: what if the world you take as “real” is actually a system designed to keep you docile?
That resonated long before TikTok existed.
Through a Jungian lens:
- Neo’s awakening is individuation — the ego waking up to the deeper Self.
- Agent Smith is the system’s Shadow — pure repression and rage, no soul.
- The Architect is the dry, over-rational Senex archetype, obsessed with order.
- The Matrix itself is a kind of Persona projected over humanity.
The line:
“The world that has been pulled over your eyes to blind you from the truth”
hits differently when you’ve watched people inhabit completely different realities depending on what their feeds serve them.
And here’s where modern research clicks in:
- Shoshana Zuboff’s book The Age of Surveillance Capitalism describes how platforms collect “behavioral surplus” — excess data from our online lives — and use it to predict and shape our future behavior.
- Writers like Urbano Reviglio argue for “algorithmic sovereignty” — the idea that whoever controls personalization algorithms effectively controls perception.
The parallels to today’s algorithmic systems are striking:
- Reinforcement learning loops: Modern recommendation engines use the same principles as the Matrix’s control system — reward behaviors that keep you plugged in, suppress behaviors that threaten disengagement. YouTube’s algorithm, for instance, has been shown to progressively recommend more extreme content because it drives longer watch times.
- A/B testing reality: Platforms constantly run experiments on users without explicit consent. Instagram tests different UI patterns, feed algorithms, and notification triggers across user segments to see what maximizes “engagement” (read: time spent). You’re not just in an experiment — you’re in hundreds simultaneously.
- The illusion of choice: Just as everyone inside the Matrix effectively swallows the blue pill every day without knowing it, we “choose” to scroll, click, and engage — but those choices are architected by systems designed to make certain behaviors feel frictionless and inevitable while others feel difficult or invisible.
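Here’s a toy epsilon-greedy bandit that shows the shape of that loop. The categories and watch times are invented, and real recommenders are vastly more complex, but the incentive structure is the same: whatever maximizes watch time wins.

```python
import random

# Invented content categories and the average watch time (minutes) each earns.
TRUE_WATCH_TIME = {"news": 2.0, "fitness": 3.0, "outrage": 8.0}
EPSILON = 0.1  # explore 10% of the time, exploit the current best otherwise

estimates = {c: 0.0 for c in TRUE_WATCH_TIME}
counts = {c: 0 for c in TRUE_WATCH_TIME}

for _ in range(10_000):
    if random.random() < EPSILON:
        choice = random.choice(list(TRUE_WATCH_TIME))
    else:
        choice = max(estimates, key=estimates.get)
    # Observe a noisy watch time and update the running average for that category.
    reward = random.gauss(TRUE_WATCH_TIME[choice], 1.0)
    counts[choice] += 1
    estimates[choice] += (reward - estimates[choice]) / counts[choice]

# The bandit converges on whatever keeps people watching longest.
print({c: round(v, 2) for c, v in estimates.items()})
```

Swap “watch time” for any engagement metric and the dynamic is identical. Nothing in the loop ever asks whether the winning category is good for you.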
Research by Tristan Harris and the Center for Humane Technology documents how design choices aren’t neutral — they’re persuasive by intent. The infinite scroll, autoplay, and notification patterns are all engineered to bypass conscious decision-making and create habitual, automated behavior.
Back in 1999, the Matrix was a metaphor.
In 2025, it’s uncomfortably close to a product spec for engagement optimization.
The 90s Weren’t Predicting Algorithms — They Were Wrestling With Reality Itself

Here’s the key point for me:
These movies didn’t “predict the algorithmic future.”
They were already wrestling with how reality gets constructed.
The late 90s were a weird moment:
- the internet was arriving
- media went 24/7
- reality TV normalized being watched
- personalization quietly began
- identity started to move online
- people felt more and more like life was mediated through screens
Jung said the collective unconscious tends to cough up new myths when a culture is at a crossroads. That’s what these films were:
- The Truman Show: the curated world, the mask you live inside
- Dark City: reality as a remixable construct, identity as assigned
- The Matrix: waking up from the system’s story into something more real
Those themes always resonated because they were never just about technology.
They were about us.
We just didn’t yet have clear language for “algorithmic feed as reality engine.”
Algorithmic Reality Is Just the New Skin on the Same Old Themes

Fast-forward to now, and we live inside realities shaped by:
- recommendation systems
- filter bubbles and echo chambers
- outrage and virality loops
- AI-generated content
- memetic identity and online tribes
Not because there’s a single villain in a control room, but because:
- there’s money in keeping us scrolling
- there’s power in shaping what we see
- there’s little transparency in how any of it works
Research like Jane Clapp’s Social Media and the Collective Unconscious makes the connection explicit, arguing that social media algorithms “colonize the psyche” and affect individuation — Jung’s term for becoming a whole, integrated person.
But let’s get specific about how algorithms drive behavior:
The Mechanics of Algorithmic Behavior Modification
1. Attention Hijacking
Sean Parker, Facebook’s first president, admitted the platform was designed to exploit “a vulnerability in human psychology” by giving users “a little dopamine hit” to keep them coming back. The goal wasn’t to add value to your life — it was to “consume as much of your time and conscious attention as possible.”
2. Polarization by Design
Research from MIT shows that false news spreads six times faster than true news on Twitter. Why? Because algorithms optimize for engagement, and outrage drives engagement. Content that makes you angry, fearful, or tribal gets amplified. Moderate, nuanced takes get buried. The algorithm doesn’t care about truth — it cares about clicks.
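A deliberately tiny sketch of that incentive, with invented posts and scores: the feed sorts purely on predicted engagement, and truthfulness never enters the sort key.

```python
# Invented posts; 'predicted_engagement' is the only thing the ranker sees.
posts = [
    {"title": "Nuanced policy explainer", "truthful": True,  "predicted_engagement": 0.02},
    {"title": "Outrage-bait hot take",    "truthful": False, "predicted_engagement": 0.11},
    {"title": "Calm fact-check",          "truthful": True,  "predicted_engagement": 0.03},
    {"title": "Tribal us-vs-them rant",   "truthful": False, "predicted_engagement": 0.09},
]

# Note what's missing from the key: 'truthful' plays no role in the ranking.
feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

for post in feed:
    print(f'{post["predicted_engagement"]:.2f}  {post["title"]}')
```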
3. The Feedback Loop of Self
Algorithms create what recommender-systems researchers call “algorithmic confounding” — the platform shows you content based on past behavior, which shapes your future behavior, which trains the algorithm to show you more of that content. You become trapped in an increasingly narrow representation of yourself. The algorithm doesn’t show you who you are — it shows you who you’re becoming under its influence.
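A minimal simulation of that loop, with every parameter invented: the feed samples topics in proportion to past clicks, the user clicks most of what they’re shown, and their apparent interests collapse toward a single topic.

```python
import random
from collections import Counter

TOPICS = ["politics", "cooking", "sports", "music"]
profile = Counter({topic: 1 for topic in TOPICS})  # start perfectly balanced

for _ in range(1000):
    # The feed shows a topic in proportion to the user's click history...
    shown = random.choices(TOPICS, weights=[profile[t] for t in TOPICS])[0]
    # ...and the user clicks most of what they're shown, reinforcing it.
    if random.random() < 0.8:
        profile[shown] += 1

# One topic ends up dominating: the loop has closed.
print(profile.most_common())
```

Run it a few times and a different topic wins each time. The narrowing isn’t about who you are; it’s about which early signal the loop happened to amplify.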
4. Manufactured Urgency
Red notification badges, “streaks,” limited-time stories, “X people are looking at this” — these are all engineered to create artificial scarcity and FOMO (fear of missing out). Research by Dr. Larry Rosen shows these tactics trigger the same stress responses as actual emergencies.
5. Dark Patterns and Consent Theater
Ever notice how it’s one click to accept cookies but requires navigating three menus to reject them? Or how it’s easy to sign up but deliberately hard to delete your account? These aren’t accidents — they’re “dark patterns,” intentionally deceptive design choices that manipulate users into behaviors that benefit the platform.
Truman had Christof.
Murdoch had the Strangers.
Neo had the Matrix.
We have the feed.
The stories aren’t more relevant now because we found some hidden meaning.
They’re more relevant because the world finally caught up to their metaphors.
Real-World Examples: When the Algorithm Rewrites Reality

The abstract becomes concrete when you see specific cases:
The YouTube Radicalization Pipeline
Researchers at Data & Society documented how YouTube’s recommendation algorithm systematically guides users from mainstream content toward increasingly extreme viewpoints. Someone watching a video about fitness might get recommended Jordan Peterson, then anti-feminist content, then white nationalist material — not because they sought it out, but because the algorithm learned this pathway keeps people watching longer. The Guardian reported that YouTube’s own internal research confirmed this pattern but the company continued using the engagement-optimizing algorithm.
TikTok’s Manufactured Realities
In 2021, The Wall Street Journal created bot accounts with different interests and tracked how quickly TikTok’s algorithm created completely different realities. A “sad teen” bot was fed content about depression and self-harm within 30 minutes. A “fitness” bot saw only workout culture and body optimization. Each lived in a completely different world, algorithmically tuned. The app wasn’t showing them reality — it was creating their reality based on early signals.
Facebook’s Emotional Contagion
In their 2012 study (published in 2014), Facebook deliberately manipulated 689,003 users’ news feeds to show either more positive or more negative content. The result? Users’ own posts shifted to match the emotional tone they’d been fed. The algorithm didn’t just predict emotions — it caused them. The study’s conclusion was chilling: “Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.”
The Filter Bubble Election
Research from the 2016 and 2020 US elections showed that people on opposite sides of the political spectrum weren’t just disagreeing about interpretation — they were seeing fundamentally different facts because algorithmic feeds served them completely different information ecosystems. You weren’t in a debate with your uncle at Thanksgiving — you were in different simulations that happened to use the same words.
These aren’t edge cases or bugs.
This is the system working as designed.
Staying Awake Inside the Construct

The goal here isn’t to run off to a cabin, smash your phone, and live off the grid (though, you know, it sounds nice some days). It’s more like applying Jung’s advice to a digital context:
Know your Persona
Notice what version of yourself the platforms reward. Don’t confuse that with your Self.

Watch your Shadow
Algorithms love your projections and your outrage. That’s high-engagement fuel.

Diversify your input
Curate against the algorithm sometimes. Seek out voices and ideas it won’t naturally show you.

Choose boredom on purpose
Your inner life needs space that isn’t constantly being filled by the next thing.

Stay aware, not paranoid
You don’t have to see puppeteers everywhere. Just remember: someone — or something — is always deciding what you see first.
These 90s movies were never just about cool sci-fi twists.
They were about a deeper question:
When so much of what we see, remember, and believe is shaped for us,
how do we stay in touch with what’s actually real — and who we actually are?
We’ve always felt that tension.
We just finally have the technology — and the vocabulary — to see how high the stakes really are.