Smithereens: Black Mirror, Can It Happen?

Before we dive into the events of Smithereens, let’s flash back to when this episode first aired: June 5, 2019.

In 2019, guided meditation apps like Headspace and Calm surged in popularity. Tech giants like Google and Salesforce began integrating meditation into their wellness programs. By the end of the year, the top 10 meditation apps had pulled in nearly $195 million in revenue—a 52% increase from the year before.

That same year, Uber made headlines with one of the decade’s biggest IPOs, debuting at $45 a share and securing a valuation north of $80 billion. But the milestone was messy. Regulators, drivers, and safety advocates pushed back after a fatal 2018 crash in Tempe, Arizona, where one of the company’s self-driving cars struck and killed a pedestrian during testing.

Inside tech companies, the culture was shifting. While perks like catered meals and gym memberships remained, a wave of employee activism surged. Workers staged walkouts at Google and other firms, and in 2019, the illusion of the perfect tech workplace began to crack.

Meanwhile, 2019 set the stage for the global rollout of 5G, promising faster, smarter connectivity. But it also sparked geopolitical tensions, as the U.S. banned Chinese company Huawei from its networks, citing national security threats. 

Over it all loomed a small circle of tech billionaires. In 2019, Jeff Bezos held the title of the richest man alive with a net worth of $131 billion. Bill Gates followed, hovering between $96 and $106 billion. Mark Zuckerberg’s wealth was estimated between $62 and $64 billion, while Elon Musk, still years away from topping the charts, sat at around $25 to $30 billion.

And that brings us to this episode of Black Mirror, Season 5, Episode 2: Smithereens.

This episode pulls us into the high-stakes negotiation between personal grief and corporate power, where a rideshare driver takes an intern hostage—not for ransom, but for answers.

What happens when the tools meant to connect us become the things that break us?

It forces us to consider: Do tech CEOs hold too much power, enough to override governments, manipulate systems, and play god?

And are we all just one buzz, one glance, one distracted moment away from irreversible change?

In this video, we’ll unpack the episode’s key themes and examine whether these events have happened in the real world—and if not, whether they plausibly could. Let’s go!

Addicted by Design

In Smithereens, we follow Chris, a man tormented by the loss of his fiancée, who died in a car crash caused by a single glance at his phone. The episode unfolds in a world flooded by noise: the pings of updates, the endless scroll, the constant itch to check in. And at the center of it all is Smithereen, a fictional social media giant clearly modeled after Twitter.

Like Twitter, Smithereen was built to connect. But as CEO Billy Bauer admits, “It was supposed to be different.” It speaks to how platforms born from good intentions become hijacked by business models that reward outrage, obsession, and engagement at all costs.

A 2024 study featured by TechPolicy Press followed 252 Twitter users in the U.S., gathering over 6,000 responses—and the findings were clear: the platform consistently made people feel worse, no matter their background or personality. By 2025, 65% of users aged 18 to 34 say they feel addicted to its real-time feeds and dopamine-fueled design.

Elon Musk’s $44 billion takeover of Twitter in 2022 was framed as a free speech mission. Musk gutted safety teams, reinstated banned accounts, and renamed the platform “X.” What was once a digital town square transformed into a volatile personal experiment.

This accelerated the emergence of alternatives. Bluesky, a decentralized platform that began as a Twitter research initiative championed by then-CEO Jack Dorsey, aims to avoid the mistakes of its predecessor. With over 35 million users as of mid-2025, it promises transparency and ethical design—but it still faces the same existential challenge: can a social app grow without exploiting attention?

In 2025, whistleblower Sarah Wynn-Williams testified before the U.S. Senate that Meta—Facebook’s parent company—had systems capable of detecting when teens felt anxious or insecure, then targeted them with beauty and weight-loss ads at their most vulnerable moments. Meta knew the risks. They chose profit anyway.

Meanwhile, a brain imaging study at China’s Tianjin Normal University found that short-video apps like TikTok activate the same brain regions linked to gambling. Infinite scroll. Viral loops. Micro-rewards. The science of addiction is now product strategy.

To help users take control of their app use, Instagram, TikTok, and Facebook offer screen-time dashboards and limit-setting features. But despite these tools, most people aren’t logging off. The average user still spends about 2 hours and 21 minutes a day on social media, with Gen Z clocking in at nearly 4 hours. It appears that self-monitoring features alone aren’t enough to break the cycle of compulsive scrolling.

What about regulations? 

A 2024 BBC Future article explores this question through the lens of New York’s SAFE for Kids Act, set to take effect in 2025. The law will require parental consent for algorithmic feeds, limit late-night notifications to minors, and tighten age verification. But experts warn that without a global, systemic shift, these measures are just patches on a sinking ship.

Of all the Black Mirror episodes, Smithereens may feel the most real—because it already is. These platforms don’t just consume our time—they consume our attention, our emotions, even our grief. Like Chris holding Jaden, the intern, at gunpoint, we’ve become hostages to the very systems that promised connection.

Billionaire God Mode

When the situation escalates in the episode, Billy Bauer activates God Mode, bypassing his own team to monitor events in real time and speak directly with Chris.

In doing so, he reveals the often hidden power tech CEOs wield behind the scenes, along with the heavy ethical burden that comes with it. It hints at the master key built into their creations and the control embedded deep within the design of modern technology.

No one seems to wield “God Mode” in the real world quite like Elon Musk—able to bend markets, sway public discourse, and even shape government policy with a single tweet or private meeting.

The reason is simple: Musk has built an empire.

In 2025, the U.S. State Department’s procurement forecast listed a $400 million line item for armored electric vehicles, initially attributed to Tesla and the largest planned purchase of its kind that year.

Additionally, through SpaceX’s satellite network Starlink, Musk played an outsized role in Ukraine’s war against Russia, enabling drone strikes, real-time battlefield intelligence, and communication under siege. 

Starlink also provided emergency internet access to tens of thousands of users during blackouts in Iran and Israel, becoming an uncensored digital lifeline—one that only Musk could switch on or off.

But with that power comes scrutiny. Musk’s involvement in the Department of Government Efficiency—ironically dubbed “Doge”—was meant to streamline bureaucracy. Instead, it sowed dysfunction. Critics argue he treated government like another startup to be disrupted. Within months—after failing to deliver the promised $2 trillion in savings and amid mounting chaos—Donald Trump publicly distanced himself from Elon Musk and ultimately removed him from the post, temporarily ending the alliance between the world’s most powerful man and its richest.

It’s not just Musk. Other tech CEOs like Mark Zuckerberg have also shaped public discourse in quiet, powerful ways. In 2021, whistleblower Frances Haugen exposed Facebook’s secret “XCheck” system—a program that allowed approximately 6 million high-profile users to bypass the platform’s own rules. Celebrities and politicians—including Donald Trump—were able to post harmful content without facing the same moderation as regular users, a failure that ultimately contributed to the January 6 Capitol riots.

Amid the hostage standoff and the heavy hand of tech surveillance, one moment stands out: Chris begs Billy to help a grieving mother, Hayley. And Billy listens. He uses his “God Mode” to offer her closure by giving her access to her late daughter’s Persona account. 

In Germany, a landmark case began in 2015 when the parents of a 15-year-old girl who died in a 2012 train accident sought access to her Facebook messages to determine whether her death was accidental or suicide. A lower court initially ruled in their favor, stating that digital data, like a diary, could be inherited. The case saw multiple appeals, but in 2018, Germany’s highest court issued a final ruling: the parents had the right to access their daughter’s Facebook account.

In response to growing legal battles and emotional appeals from grieving families, platforms like Meta, Apple, and Google have since introduced “Digital Legacy” policies. These allow users to designate someone to manage or access their data after death, acknowledging that our digital lives don’t simply disappear when we do.

In real life, “God Mode” tools exist at many tech companies. Facebook engineers have used internal dashboards to track misinformation in real time. Leaked documents from Twitter revealed an actual “God Mode” that allowed employees to tweet from any account. These systems are designed for testing or security—but they also represent concentrated power with little external oversight.

And so we scroll.

We scroll through curated feeds built by teams we’ll never meet and governed by CEOs who rarely answer to anyone. These platforms know what we watch, where we go, and how we feel. They don’t just reflect the world—we live in the one they’ve built.

And if someone holds the key to everything—who’s watching the one who holds the key?

Deadly Distractions

In Smithereens, Chris loses his fiancée to a single glance at his phone. A notification. An urge. A reminder that in a world wired for attention, even a moment of distraction can cost everything.

In 2024, distracted driving killed around 3,000 people in the U.S.—about eight lives lost every single day—and injured over 400,000 more.

Of these deaths, cellphone use is a major factor: NHTSA data shows that cellphones were involved in about 12% of fatal distraction-affected crashes. In recent years, that has meant roughly 300 to 400 lives lost annually in the U.S. to phone-related distracted driving.

While drunk driving still causes more total deaths, texting while driving is now one of the most dangerous behaviors behind the wheel—raising the risk of a crash by 23 times.

In April 2014, Courtney Ann Sanford’s final Facebook post read: “The Happy song makes me so HAPPY!” Moments later, her car veered across the median and slammed into a truck. She died instantly. Investigators found she had been taking a selfie and updating her status while driving.

Around the world, laws are evolving to address the dangers of distracted driving. In the United States, nearly every state, along with Washington D.C. and several territories, now prohibits text messaging for all drivers, and hands-free laws continue to expand to more jurisdictions.

In Europe, the UK issues £200 fines and six penalty points for distracted driving. Spain and Italy have fines starting around €200—and in Italy, proposed hikes could push that up to €1,697. The current highest fine is in Queensland, Australia, where drivers caught texting or scrolling can face fines of up to $1,250.

To combat phone use behind the wheel, law enforcement in Australia and Europe now deploys AI-powered cameras that scan drivers in real time. Mounted on roadsides or mobile units, these high-res systems catch everything from texting to video calls. If AI flags a violation, a human officer reviews it before issuing a fine.

As for the role of tech companies? While features like Apple’s “Do Not Disturb While Driving” mode exist, they’re voluntary. No country has yet held a tech firm legally accountable for designing apps that lure users into dangerous distractions. Public pressure is building, but regulation lags behind reality.

In Smithereens, the crash wasn’t just a twist of fate—it was the inevitable outcome of a system designed to capture and hold our attention: algorithms crafted to hijack our minds, interfaces engineered for compulsion, and a culture that prizes being always-on, always-engaged, always reachable. And in the end, it’s not just Chris’s life that’s blown to smithereens—it’s our fragile illusion of control, shattered by the very tech we once believed we could master.

We tap, scroll, and swipe—chasing tiny bursts of dopamine, one notification at a time. Chris’s story may be fiction, but the danger it exposes is all too real. It’s in the car beside you. It’s in your own hands as you fall asleep. We can’t even go to the bathroom without it anymore. No hostage situation is needed to reveal the cost—we’re living it every day.

Join my YouTube community for insights on writing, the creative process, and the endurance needed to tackle big projects. Subscribe Now!

For more writing ideas and original stories, please sign up for my mailing list. You won’t receive emails from me often, but when you do, they’ll only include my proudest works.

Striking Vipers: Black Mirror, Can It Happen?

Before we discuss the events of Striking Vipers, let’s flash back to when this episode first aired: June 5, 2019.

In 2019, the Movember Foundation ran a global campaign for men’s health with celebrities like Stephen Fry, Bear Grylls, Stephen Merchant, and Nicole Scherzinger using humorous videos and social media to encourage men to talk about their health. 

Back in 2019, consumer VR was caught between promise and practicality. Premium headsets like the Oculus Rift demanded expensive, high-powered PCs, pushing total setup costs over $1,500. Meanwhile, budget-friendly options like Samsung Gear VR delivered underwhelming performance. With few blockbuster games to drive demand, mainstream adoption stalled. As a result, companies like IMAX closed their VR divisions.

Still, VR found new life in enterprise applications. Walmart used VR training modules to boost employee retention and immerse staff in real-world scenarios, while sectors like healthcare and manufacturing also adopted VR for training simulations.

At the same time, 2019 marked significant milestones for LGBTQ+ visibility. Elliot Page (then known publicly as Ellen Page) was a vocal LGBTQ+ advocate, Lil Nas X came out as gay during the peak of “Old Town Road,” and Pete Buttigieg launched his historic campaign as the first openly gay major U.S. presidential candidate.

And that brings us to this episode of Black Mirror. Episode 1 of Season 5: Striking Vipers.

This episode welcomes us into a digital world where friendship, desire, and identity collide. Through the lens of a VR fighting game turned emotional crucible, the episode explores how immersive tech can both reveal and distort our deepest needs, leaving us with some unsettling questions: 

What happens when technology offers a more fulfilling life than reality? Can a digital body expose truths we’re too afraid to face in the physical world? And as virtual experiences grow more vivid, are we prepared for the emotional and ethical consequences they bring?

In this video, we’ll unpack the episode’s key themes and examine whether these events have happened in the real world—and if not, whether they plausibly could. Let’s go!

Blurred Realities

When Karl gives Danny a birthday gift—Striking Vipers X, a hyper-realistic VR fighting game—their casual nostalgia takes an unexpected turn. In this game, players don’t just control avatars; they fully inhabit them, experiencing every physical sensation their characters feel. 

As their in-game battles escalate into a sexual relationship, the emotional intensity of their connection begins to strain Danny’s marriage and forces both men to confront their desires, identities, and the blurry lines between reality and fantasy.

While today’s VR systems don’t yet plug directly into our brains, the separation between real and virtual intimacy is growing increasingly thin. New technology like haptic suits and internet-connected sex toys, a field known as teledildonics, lets people feel touch and physical sensation from far away. Companies like Kiiroo offer Bluetooth-enabled devices that sync with a partner’s movements or online media, making remote intimacy physically real.

In a 2023 survey, a staggering 41% of users said they had fallen in love in virtual reality—and it’s not about looks. In fact, two-thirds of those who have say their partner’s physical sex doesn’t even matter.

However, the darker side of immersive technology is getting harder to overlook. Many VR platforms quietly collect personal data—like your heart rate, facial expressions, and even brain activity—often without users fully understanding or consenting. 

According to the American Association for Marriage and Family Therapy, up to a third of internet users go online for sexual reasons—and nearly 1 in 5 can become addicted to it. As internet use becomes more common, more couples are running into serious issues like trust problems, emotional distance, and even breakups because of online infidelity.

A 2017 Deseret News survey revealed striking gender and generational divides in what people consider cheating. Women were significantly more likely than men to label both online and offline behaviors as “always cheating”—59% of women, compared to just 42% of men, said that sending flirty messages crosses the line, while 70% of women said simply having an online dating profile counts as infidelity.

In a survey of 91 women and 3 men affected by a partner’s cybersex addiction, 68% described sexual problems in their relationship directly related to the addiction. About 22% said the addiction was a major reason for separation or divorce. 

Age also played a role in how people view cheating. Surprisingly, millennials were more likely than Gen Xers to say that watching porn alone is cheating. These changing opinions show how modern technology is making the line between loyalty and betrayal harder to define. 

For Danny, the escape wasn’t just into a game. It was into a version of himself he couldn’t find in daylight. And maybe that’s the real question Striking Vipers leaves us with: when the fantasy fits better than the life we’ve built—what do we choose to come home to?

As the truth comes to light, Danny and Theo strike an agreement: once a year, he returns to the virtual world, and she explores real-life connections of her own. It’s not the first time they’ve played pretend—earlier in the episode, they flirted with role-play to revive their spark. But this time, the game is real. Their compromise isn’t a happy ending so much as a new set of rules.

In the United States, polygamy is extremely rare (less than 0.5% of households), but public acceptance is growing. Approval of polygamy as morally acceptable has risen from 7% in 2003 to 23% in 2024, especially among younger, unmarried, and less religious Americans. Interestingly, men are six times more likely than women to be open to polygynous relationships, according to recent UK research.

We already live at the edges of intimacy—crafting curated selves, clinging to parasocial ties, chasing comfort in the glow of a screen. VR, AI, and immersive worlds only pull us deeper, fusing intimacy and illusion into something hard to untangle.

Bodies in the Mirror

In the game, Karl chooses to play as a female fighter named Roxette, not just as a disguise—but as a truth he hasn’t yet admitted. What unfolds is less about sex and more about the fluidity of self in a world where identity can be downloaded and worn like clothing.

The episode reflects the real-world experience of exploring names, pronouns, and appearances in digital spaces before coming out in everyday life. It captures the emotional challenges that many LGBTQ+ individuals face during their coming-out journeys.

In 2023 alone, more than 30 new laws targeting LGBTQ-related education were enacted, reshaping the 2023–24 school year. These measures include bans on discussing sexual orientation and gender identity in classrooms, limits on pronoun use, and mandates for parental notification or opt-in before students can access LGBTQ-inclusive curricula.

Simply put, the physical world is not a welcome one for exploration, which is why so many turn to digital spaces to discover who they are.

A 2025 study on ZEPETO—a social app where people interact through avatars—found that female users who took on male avatars felt more connected to their virtual characters and more confident in their real-life gender identity. 

Inclusive design has been shown to boost mental health and promote a sense of empowerment. A 2024 study of 79 trans and gender-diverse adults found that customizable avatars in games were associated with increased enjoyment, empowerment, and authentic self-representation, while restricted customization reduced engagement and could trigger distress or dysphoria. 

Trans and gender-diverse youth face far higher rates of rejection, discrimination, and violence than their cisgender peers. As a result, around 61% experience suicidal thoughts, and nearly one in three have attempted suicide—more than four times the rate of cisgender youth.

In this context, the digital world becomes a lifeline. Research shows that having just one online space where LGBTQ+ youth feel safe and understood is linked to a 20% lower risk of suicide attempts and a 15% drop in recent anxiety. 

Virtual bodies aren’t just avatars—they’re mirrors of inner truth. And for those navigating the margins of society’s acceptance, they can become windows into a more authentic future.

But here’s a deeper question: when does a safe space become a place to hide? 

The Digital High

It starts with two old friends staying up all night playing the game they loved in their twenties—laughing, trash-talking, reliving the past. But what begins as nostalgia slowly shifts. The game becomes a secret habit, a nightly escape that feels more thrilling and alive than the routine of Danny’s real life.

Soon, he’s forgetting his anniversary and growing distant from his wife. Striking Vipers isn’t just about sex or fantasy; it’s about how addiction can sneak in under the cover of comfort, and how escaping reality too often can leave the real world behind.

Between 2% and 20% of frequent VR users display compulsive behaviors, with addiction risk linked to the immersive feeling of embodiment inside an avatar.

Our attention spans have dropped to just 45 seconds on average—and video games are a major driver. Many of the most addictive titles keep us hooked with competitive and social features (like Fortnite or League of Legends), immersive escapism (Skyrim, Stardew Valley), and personalized role-play (World of Warcraft, The Sims). These experiences trigger dopamine hits, making everyday life feel dull, chaotic, or unrewarding in contrast.

Video game addiction affects an estimated 3–4% of gamers worldwide, with higher rates among adolescents and young adults, especially males. Addicted gamers can spend up to 100 hours a week immersed in play, sacrificing relationships, hobbies, and responsibilities along the way.

In Striking Vipers, the title itself becomes a metaphor: just like a viper’s deadly strike, addiction can sneak up unexpectedly, striking again and again as players hunt for that elusive thrill.


When 2016 could have been like 1984


Apple standing up against FBI pressure is a critical moment for our future

By Elliot Chan, Opinions Editor
Formerly published in The Other Press. Mar 2, 2016

Flash back to New Year’s Eve 1983, when Apple released one of the most monumental and memorable commercials to date. The ad depicted a heroine charging at a Big Brother-like figure, an homage to George Orwell’s dystopian novel Nineteen Eighty-Four, with a hammer. The heroine hurls the hammer at the figure, the figure erupts, and words appear on the screen: “On January 24th, Apple Computer will introduce the Macintosh. And you’ll see why 1984 won’t be like Nineteen Eighty-Four.”

We all celebrated.

On February 16, 32 years after the commercial aired, Apple CEO Tim Cook wrote a letter to his customers, raising serious concerns and showing how close we were to losing our privacy, just like the characters in Orwell’s fiction. The technology company was receiving external pressure from the US government to build a backdoor into customers’ personal devices. This backdoor would enable government officials—under specific protocols and significant safeguards—to access data by bypassing security. In other words, the FBI would have been able to access your iPhone if they were “suspicious” of you.

Cook wrote in his letter that such a backdoor does not currently exist, and that they don’t intend to build one, despite the government’s pressure—and pressure from many fearful citizens. The risk is far too great. The slope is far too slippery. One thing will lead to the next and before you know it, the government will have access to all the data we keep in our devices. We keep a lot of data in our devices.

Creating this backdoor is undoubtedly a knee-jerk reaction to the countless terrorist attacks that have taken place on US soil recently, because terrorists use the same technology we do and need to communicate with each other to orchestrate attacks. However, to simply give up our rights to privacy within our personal communication channels would be a victory for the terrorists. They want us to take extreme measures. They want us to turn the lens upon ourselves. The world does not become safer because of heightened monitoring. It becomes more paranoid.

I remember years ago when cameras in public places were a big controversy. Now, they are the norm. But those cameras are stationary. They don’t travel with us. They are not an extension of who we are. We don’t share our intimate moments with those cameras. Our devices, on the other hand, are in a sense our other hand, and to have the government forcibly hold it wherever we go is a scary thought. It’s what Apple vowed not to do when it aired that commercial. Apple vowed not to turn our world into a dystopian place ruled by a mistrustful administration, and it is holding true to its word.

While the answer is not to build a backdoor, I do believe there is a solution, one that requires thought and careful calculation, and one that does not compromise the security and privacy of law-abiding citizens. We just need to think about it differently.