Sora Isn't the Problem: It's the Mirror
The Real Problem
I finally got access to Sora, OpenAI's TikTok-style video app, and something clicked. This isn't annoying because Sora is bad. It's annoying because it's honest.
The Core Problem: Friction Is Gone
Here's the insight: lying used to have a cost. If you wanted to fabricate video evidence, you had to work. Film it. Edit it. Make it convincing. That work was friction. Friction meant there was a penalty for lying—time, effort, risk of being caught in the production.
Sora removes that penalty entirely.
Now you prompt an AI. You want a fake historical event? Seconds. Celebrity deepfake? Done. False testimony on video? Trivial. The cost of lying just collapsed to zero. It's now easier to fabricate than to capture reality.
This changes everything.
Why This Breaks the System
Social media has always optimized for engagement over truth. But engagement-over-truth only works while lying is expensive enough to stay selective. You can't put everything on the feed. You have to choose.
With friction, lying is selective. You lie when it matters, when the payoff justifies the work. The system survives because most content is still just... regular stuff. People sharing their lives. Creators doing legitimate work.
Sora removes that selection. Now lying is free. So the system optimizes the only way it knows how. If truth and lies cost the same to produce, and engagement cares about neither, the algorithm picks based on one thing: what keeps you scrolling?
The answer isn't truth. It's novelty. Spectacle. Uncanniness. Deepfakes of celebrities doing weird things are more engaging than someone filming their actual day.
Once friction is gone, the system must fill the feed with fabrication. Not because creators are evil. But because the incentives make fabrication cheaper than reality.
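The logic is simple enough to sketch. Here's a toy simulation in Python; every number in it is invented (the fabrication probability, the engagement distributions, the function names like build_pool are all my assumptions, not data). It models only two things: production cost filters what creators bother to make, and the ranker sorts purely on engagement.

```python
# Toy model of the friction argument. All numbers are made up;
# what matters is the shape of the outcome, not the magnitudes.
import random

random.seed(0)

def build_pool(fab_cost, n_creators=1000):
    """One post per creator. As fabrication gets cheaper, more creators
    fabricate; cost acts as a filter on the supply of fakes."""
    pool = []
    for _ in range(n_creators):
        if random.random() < 0.8 * (1.0 - fab_cost):
            # Assumption: fabricated spectacle engages a bit more on average.
            pool.append(("fabricated", random.gauss(0.55, 0.15)))
        else:
            pool.append(("real", random.gauss(0.50, 0.15)))
    return pool

def fabricated_share_of_feed(fab_cost, top_n=100):
    """Rank purely by engagement -- the ranker never asks "is it true?" --
    and measure how much of the top of the feed is fabricated."""
    pool = build_pool(fab_cost)
    feed = sorted(pool, key=lambda post: post[1], reverse=True)[:top_n]
    return sum(kind == "fabricated" for kind, _ in feed) / top_n

for cost in (0.9, 0.5, 0.0):
    share = fabricated_share_of_feed(cost)
    print(f"fabrication cost {cost:.1f} -> {share:.0%} of the feed is fake")
```

Run it and the fabricated share of the top 100 climbs from a small minority at high cost to a large majority at zero. Nothing about the ranker changed; only the cost did. It's a cartoon, but it's the whole argument.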
What This Looks Like
Echo chambers accelerate. TikTok already concentrates algorithmic curation into bubbles. Sora makes those bubbles self-reinforcing: same IP, same memes, same likenesses, generated infinitely. Homogeneity at scale. The machine feeding itself.
Truth becomes optional. Audiences stop sorting by "real vs. fake" and start sorting by "entertaining vs. dull." The Verge reported getting trapped in scroll loops of deepfaked celebrities and fabricated moments—even knowing they were AI. The uncanniness is the feature. Plausibility doesn't require proof anymore.
Persuasion decouples from evidence. Short-form video already rewards punchy narratives over grounded reporting. Sora makes that worse: you can visually assert anything now. Any scene. Any dialogue. Any context. Spectacle beats verification. Virality beats truth.
The Epiphany
Here's what hit me: Sora isn't breaking social media. It's just making visible what was always there.
We built platforms optimized for engagement. Then we added algorithms designed to concentrate attention. Then we added infinite scroll and habit-forming UI. Then we added unlimited content generation via AI.
At each step, we told ourselves it was fine. "Engagement metrics are just incentives." "Echo chambers are just efficiency." "Disinformation is a problem we'll solve later."
But watching Sora—watching people trapped in loops of fabricated celebrity cameos—you can't pretend anymore. The problem isn't that AI is too good at generating video. The problem is that friction is gone. When lying costs nothing and engagement doesn't care about truth, the system must fill itself with slop.
That's not a bug. That's the machine working exactly as designed.
What Friction Did For Us
We didn't realize it, but friction was a natural check on this. Video production was slow. Editing took time. Deepfakes required skill. There was a price for lying on camera, and most people, rather than pay it, simply told the truth.
Now there is no price. Lying is as effortless as telling the truth. The system was never equipped to handle that. It was designed assuming some friction. Some cost to fabrication. Some incentive to be selective about deception.
Sora breaks that assumption.
The result is what you're seeing: infinite scroll of fabricated content, algorithmically sorted for engagement, presented in a UI designed for habit formation, watched by people who stopped caring if it's real.
That's not a problem with Sora. That's what happens when you remove friction from a system designed to exploit engagement above all else.
The machine didn't break. It just stopped pretending.
Related posts:
- AI: The New Backdoor Layoff - How companies use AI as cover for not hiring
- On Technological Stratification - How AI widens inequality and access gaps
- Perplexity replaces Google Search and Apple News for me - How AI is changing information consumption
- Google AI Overviews are cutting web traffic in half - The impact of AI on traditional web patterns