Loch Henry: Black Mirror, Can It Happen?

Before we dive into Loch Henry, let’s flash back to when this episode first aired: June 15, 2023.

In 2023, policing and law enforcement were under intense scrutiny across the United States, and small towns in particular made headlines in ways that echo the tensions explored in Loch Henry. In January, Tyre Nichols — a 29-year-old Black man — died after being beaten by five Memphis police officers from “Scorpion”, the city’s specialized crime-suppression unit.

That same month, a violent and racially charged incident unfolded in Rankin County, Mississippi, when six white law enforcement officers entered a home without a warrant and tortured two Black men.

The officers, calling themselves the “Goon Squad,” were later convicted and sentenced to prison terms ranging from 10 to 40 years. 

Meanwhile, across the U.S., countless small towns have been forced to downsize—or even disband. In Minnesota, the town of Goodhue made headlines in 2023 when its entire police force resigned over low pay and staffing challenges, leaving the community entirely in the hands of the county sheriff. 

In 2023, Netflix released Murdaugh Murders: A Southern Scandal, a documentary that peeled back the façade of one of South Carolina’s most influential families. What started as a story about a tragic boating accident unraveled into a web of corruption, financial fraud, and generational violence orchestrated by attorney Alex Murdaugh. The series revealed how a powerful patriarch could operate unchecked for decades, because an entire community learned to look away.

All of this brings us to Black Mirror—Season 6, Episode 2: Loch Henry. The episode dives into personal trauma, systemic injustice, and buried community secrets. Davis returns to his small hometown, only to uncover a dark history of violence and corruption, including his own father’s hidden crimes. 

In this video, we’ll break down the episode’s themes, explore real-world parallels, and ask whether these events have already happened—and if not, whether it is still plausible. 

1. Still Recording

In Loch Henry, Davis brings his girlfriend Pia home to Scotland, expecting nothing more than a quiet visit and a small documentary project. But once they start digging into a local cold case, they uncover a truth no one ever intended to see.

Digitizing his mom’s forgotten videotapes seemed like an innocent act, a way to preserve fading memories, until those memories revealed truths Davis had spent a lifetime unknowingly living beside.

What makes Loch Henry so unsettling is how familiar that unraveling feels. In real life, technology can expose the truth just as suddenly and brutally, often through the very tools we use to record memories. Old camcorders, security footage, and archived videos don’t just preserve the past; they can capture crimes, lies, and hidden actions that were never meant to be seen.

Take police body cameras. They were introduced as tools of transparency, but instead they’ve documented some of the most devastating failures in modern policing. 

On March 31, 2021, 22‑year-old Anthony Alvarez was fatally shot by a police officer in Chicago. Body‑cam footage showed Alvarez being shot in the back while fleeing, despite the police’s claim that he posed a threat.

When Pia first meets Davis’s mother, she asks if Pia grew up in America, noting her accent. What seems like an innocent question lands as a probing first impression, part of a series of subtle microaggressions that highlight her outsider status. The tension is heightened by the knowledge that Davis’s father was a police officer. These everyday slights reflect a harsher reality: around the world, visible minorities are often scrutinized and disproportionately targeted by law enforcement.

In late 2025, the Edmonton Police Service launched a pilot program equipping body-worn cameras with real-time facial recognition, scanning a “watch list” of more than 7,000 people. Privacy experts immediately raised red flags. Facial-recognition tech is still wildly unreliable, especially on marginalized groups, and rolling it out without broad public consultation risks turning entire communities into living databases.

A 2025 academic study showed that the blurrier the footage, the more facial recognition breaks down. The systems disproportionately misidentify Black people and women, creating a feedback loop of digital injustice.

Even the U.K. Home Office had to admit that the facial-recognition tools used by police generated significantly higher false positives for Black and Asian individuals—sometimes hundreds of times higher than for white subjects.

While converting old tapes to digital, Pia discovers footage that reveals Davis’s mother as an active participant in the murders.

Recording adds another layer of power to the abuse. For some perpetrators, the camera is a tool of control. Capturing the act makes it permanent, something they can own, revisit, and dominate long after the moment itself has passed.

In 2025, VICE reported on a video known online as “The Vietnamese Butcher”—a piece of footage circulated as entertainment despite documenting a real killing, shot from multiple angles and edited like a production.

Online investigators later identified the apparent victim as a Vietnamese man who had previously discussed fantasies about being killed and sought out others willing to act on them. At the same time, clips and still images from the video were reportedly sold in bundled packs on dark-web forums and Telegram channels.

Recording has always been sold as protection, a way to preserve facts. But from the moment video existed, it also created new problems: new forms of evidence to interpret and new arguments over who controls the narrative. History has shown how cameras can document injustice—or turn violence into spectacle, from sensationalized true crime to the disturbing legacy of snuff imagery. 

2. The Code of Silence

Once authorities uncovered Iain Adair’s torture chamber in Loch Henry, the quiet village was thrust into the national spotlight. But beneath the spectacle lies a familiar story: small communities often close ranks, protecting their own even in the face of terrible crimes.

Real-world parallels make this even more unsettling. Take Thunder Bay. For years, this Canadian city faced national scrutiny over the unexplained deaths of Indigenous youth, many of whom were dismissed as accidents or misadventures.

It wasn’t until external pressure mounted that a deeper investigation revealed systemic failures. Yet the truth only emerged fully when journalists and the Thunder Bay podcast revisited the cases, re-examining timelines and bringing long-ignored inconsistencies to light. 

Small towns often rely on overlapping social networks—police, officials, and longtime families—which makes whistleblowing socially costly. This is starkly illustrated by Skidmore, Missouri, a town of fewer than 300. In 1981, local bully Ken Rex McElroy was shot in broad daylight on the main street in front of dozens of townspeople. McElroy had terrorized residents for years and repeatedly evaded legal consequences. When he was finally killed, no one identified the shooters. The town’s near-unanimous silence was a deliberate, community-wide decision to shield those responsible.

Loch Henry captures this same dynamic: towns, families, and neighbors often band together to hide uncomfortable truths. And just like in Thunder Bay or Skidmore, it’s only when outsiders dig through old footage and forgotten records that the cracks in the community’s façade are exposed.

3. The Documentary Effect

In Loch Henry, the act of making a documentary rips open old wounds. Davis and Pia set out to film something they assume will barely get traction. But once they pitch the idea to their production contacts and unexpectedly secure funding, the project grows teeth. With real backing behind them, they push deeper into the town’s past.

In the real world, some of the biggest shifts in criminal justice have come from courageous filmmakers who were supposed to be observers — yet became participants.

In 2015, Netflix’s Making a Murderer thrust salvage-yard owner Steven Avery—convicted of the murder of photographer Teresa Halbach—into the global spotlight. The series re-examined the crime itself alongside allegations of mishandled evidence, coercive interrogations, and the institutional forces that shaped his conviction.

The ground-breaking podcast Serial did something similar for Adnan Syed’s case in 2014, drawing millions into a meticulous re-examination of timelines, phone records, and investigative shortcuts—pressure that eventually led to the overturning of his conviction after more than two decades in prison.

Sometimes, the act of documentation itself becomes the turning point. The Jinx, released in 2015, began as a documentary profile of Robert Durst—a wealthy New York real estate heir long suspected in multiple murders but never successfully prosecuted. During post-production, filmmakers uncovered a chilling moment recorded after an interview, when Durst, still wearing a live microphone, muttered to himself: “Killed them all, of course.” That accidental recording became pivotal evidence, helping reopen the case and leading to Durst’s arrest and eventual conviction. 

Modern documentaries often succeed where police files have gone cold. Digitizing old tapes, enhancing degraded footage, re-analyzing audio, and applying new forensic tools can expose details investigators once missed. 

Series like The Staircase, The Keepers, and Don’t F**k With Cats show how returning to old evidence can fundamentally change our understanding of a crime. 

Taken together, these cases underline what Loch Henry captures so well: cameras, recordings, and storytelling don’t simply preserve the past—they dig it back up, pulling buried truths, forgotten evidence, and long-suppressed crimes into the present.

The more we unearth old recordings and forgotten technology, the more the cracks beneath the surface appear. Media can bring accountability, but it also turns private trauma into public reckoning. That tension is what makes the episode feel like a warning, because nothing truly stays buried once someone presses record.

Join my YouTube community for insights on writing, the creative process, and the endurance needed to tackle big projects. Subscribe Now!

For more writing ideas and original stories, please sign up for my mailing list. You won’t receive emails from me often, but when you do, they’ll only include my proudest works.

Smithereens: Black Mirror, Can It Happen?

Before we dive into the events of Smithereens, let’s flash back to when this episode first aired: June 5, 2019.

In 2019, guided meditation apps like Headspace and Calm surged in popularity. Tech giants like Google and Salesforce began integrating meditation into their wellness programs. By the end of the year, the top 10 meditation apps had pulled in nearly $195 million in revenue—a 52% increase from the year before.

That same year, Uber made headlines with one of the decade’s biggest IPOs, debuting at $45 a share and securing a valuation north of $80 billion. But the milestone was messy. Regulators, drivers, and safety advocates pushed back after a fatal 2018 crash in Tempe, Arizona, where one of the company’s self-driving cars struck and killed a pedestrian during testing.

Inside tech companies, the culture was shifting. While perks like catered meals and gym memberships remained, a wave of employee activism surged. Workers staged walkouts at Google and other firms, and in 2019, the illusion of the perfect tech workplace began to crack.

Meanwhile, 2019 set the stage for the global rollout of 5G, promising faster, smarter connectivity. But it also sparked geopolitical tensions, as the U.S. banned Chinese company Huawei from its networks, citing national security threats. 

Over it all loomed a small circle of tech billionaires. In 2019, Jeff Bezos held the title of the richest man alive with a net worth of $131 billion. Bill Gates followed, hovering between $96 and $106 billion. Mark Zuckerberg’s wealth was estimated between $62 and $64 billion, while Elon Musk, still years away from topping the charts, sat at around $25 to $30 billion.

And that brings us to this episode of Black Mirror, Season 5, Episode 2: Smithereens.

This episode pulls us into the high-stakes negotiation between personal grief and corporate power, where a rideshare driver takes an intern hostage—not for ransom, but for answers.

What happens when the tools meant to connect us become the things that break us?

It forces us to consider: Do tech CEOs hold too much power, enough to override governments, manipulate systems, and play god?

And are we all just one buzz, one glance, one distracted moment away from irreversible change?

In this video, we’ll unpack the episode’s key themes and examine whether these events have happened in the real world—and if not, whether they’re plausible. Let’s go!

Addicted by Design

In Smithereens, we follow Chris, a man tormented by the loss of his fiancée, who died in a car crash caused by a single glance at his phone. The episode unfolds in a world flooded by noise: the pings of updates, the endless scroll, the constant itch to check in. And at the center of it all is Smithereen, a fictional social media giant clearly modeled after Twitter.

Like Twitter, Smithereen was built to connect. But as CEO Billy Bauer admits, “It was supposed to be different.” It speaks to how platforms born from good intentions become hijacked by business models that reward outrage, obsession, and engagement at all costs.

A 2024 study featured by Tech Policy Press followed 252 Twitter users in the U.S., gathering over 6,000 responses—and the findings were clear: the platform consistently made people feel worse, no matter their background or personality. By 2025, 65% of users aged 18 to 34 reported feeling addicted to its real-time feeds and dopamine-fueled design.

Elon Musk’s $44 billion takeover of Twitter in 2022 was framed as a free speech mission. Musk gutted safety teams, reinstated banned accounts, and renamed the platform “X.” What was once a digital town square transformed into a volatile personal experiment.

This accelerated the emergence of alternatives. Bluesky, a decentralized platform that began as a Twitter research project under then-CEO Jack Dorsey, aims to avoid the mistakes of its predecessor. With over 35 million users as of mid-2025, it promises transparency and ethical design—but still faces the same existential challenge: can a social app grow without exploiting attention?

In 2025, whistleblower Sarah Wynn-Williams testified before the U.S. Senate that Meta—Facebook’s parent company— had systems capable of detecting when teens felt anxious or insecure, then targeted them with beauty and weight-loss ads at their most vulnerable moments. Meta knew the risks. They chose profit anyway.

Meanwhile, a brain imaging study at Tianjin Normal University in China found that short video apps like TikTok activate the same brain regions linked to gambling. Infinite scroll. Viral loops. Micro-rewards. The science behind addiction is now product strategy.

To help users take control of their app use, Instagram, TikTok, and Facebook offer screen-time dashboards and limit-setting features. But despite these tools, most people aren’t logging off. The average user still spends more than 2 hours and 21 minutes a day on social media, with Gen Z clocking in at nearly 4 hours. It appears that self-monitoring features alone aren’t enough to break the cycle of compulsive scrolling.

What about regulations? 

A 2024 BBC Future article explores this question through the lens of New York’s SAFE for Kids Act, set to take effect in 2025. The law requires parental consent for algorithmic feeds, limits late-night notifications to minors, and tightens age verification. But experts warn: without a global, systemic shift, these measures are just patches on a sinking ship.

Of all the Black Mirror episodes, Smithereens may feel the most real—because it already is. These platforms don’t just consume our time—they consume our attention, our emotions, even our grief. Like Chris holding Jaden, the intern, at gunpoint, we’ve become hostages to the very systems that promised connection.

Billionaire God Mode

When the situation escalates in the episode, Billy Bauer activates God Mode, bypassing his own team to monitor events in real time and speak directly with Chris.

In doing so, he reveals the often hidden power tech CEOs wield behind the scenes, along with the heavy ethical burden that comes with it. It hints at the master key built into their creations and the control embedded deep within the design of modern technology.

No one seems to wield “God Mode” in the real world quite like Elon Musk—able to bend markets, sway public discourse, and even shape government policy with a single tweet or private meeting.

The reason is simple: Musk had built an empire. 

In 2025, a U.S. State Department procurement forecast listed a planned $400 million purchase of armored electric vehicles, an entry that initially named Tesla before being revised amid public scrutiny.

Additionally, through SpaceX’s satellite network Starlink, Musk played an outsized role in Ukraine’s war against Russia, enabling drone strikes, real-time battlefield intelligence, and communication under siege. 

Starlink also provided emergency internet access to tens of thousands of users during blackouts in Iran and Israel, becoming an uncensored digital lifeline—one that only Musk could switch on or off.

But with that power comes scrutiny. Musk’s involvement in the Department of Government Efficiency—ironically dubbed “Doge”—was meant to streamline bureaucracy. Instead, it sowed dysfunction. Critics argue he treated government like another startup to be disrupted. Within months—after failing to deliver the promised $2 trillion in savings and amid mounting chaos—Donald Trump publicly distanced himself from Elon Musk and ultimately removed him from the post, temporarily ending the alliance between the world’s most powerful man and its richest.

It’s not just Musk. Other tech CEOs like Mark Zuckerberg have also shaped public discourse in quiet, powerful ways. In 2021, whistleblower Frances Haugen exposed Facebook’s secret “XCheck” system—a program that allowed approximately 6 million high-profile users to bypass the platform’s own rules. Celebrities and politicians—including Donald Trump—were able to post harmful content without facing the same moderation as regular users, a failure that, Haugen argued, contributed to the January 6 Capitol riot.

Amid the hostage standoff and the heavy hand of tech surveillance, one moment stands out: Chris begs Billy to help a grieving mother, Hayley. And Billy listens. He uses his “God Mode” to offer her closure by giving her access to her late daughter’s Persona account. 

In Germany, a landmark case began in 2015 when the parents of a 15-year-old girl who died in a 2012 train accident sought access to her Facebook messages to determine whether her death was accidental or suicide. A lower court initially ruled in their favor, stating that digital data, like a diary, could be inherited. The case saw multiple appeals, but in 2018, Germany’s highest court issued a final ruling: the parents had the right to access their daughter’s Facebook account.

In response to growing legal battles and emotional appeals from grieving families, platforms like Meta, Apple, and Google have since introduced “Digital Legacy” policies. These allow users to designate someone to manage or access their data after death, acknowledging that our digital lives don’t simply disappear when we do.

In real life, “God Mode” tools exist at many tech companies. Facebook engineers have used internal dashboards to track misinformation in real time. Leaked documents from Twitter revealed an actual “God Mode” that allowed employees to tweet from any account. These systems are designed for testing or security—but they also represent concentrated power with little external oversight.

And so we scroll.

We scroll through curated feeds built by teams we’ll never meet and governed by CEOs who rarely answer to anyone. These platforms know what we watch, where we go, and how we feel. They don’t just reflect the world—we live in the one they’ve built.

And if someone holds the key to everything—who’s watching the one who holds the key?

Deadly Distractions

In Smithereens, Chris loses his fiancée to a single glance at his phone. A notification. An urge. A reminder that in a world wired for attention, even a moment of distraction can cost everything.

In 2024, distracted driving killed around 3,000 people in the U.S.—about eight lives lost every single day—and injured over 400,000 more.

Of these, cellphone use is a major factor: NHTSA data shows that cellphones were involved in about 12% of fatal distraction-affected crashes. That means roughly 300 to 400 lives are lost annually in the U.S. specifically to phone-related distracted driving.

While drunk driving still causes more total deaths, texting while driving is now one of the most dangerous behaviors behind the wheel—raising the risk of a crash by 23 times.

In April 2014, Courtney Ann Sanford’s final Facebook post read: “The Happy song makes me so HAPPY!” Moments later, her car veered across the median and slammed into a truck. She died instantly. Investigators found she had been taking a selfie and updating her status while driving.

Around the world, laws are evolving to address the dangers of distracted driving. In the United States, 49 states plus Washington, D.C. prohibit text messaging for all drivers, and hands-free laws continue to expand to more jurisdictions.

In Europe, the UK issues £200 fines and six penalty points for distracted driving. Spain and Italy have fines starting around €200, and in Italy, proposed hikes could push that up to €1,697. The current highest fine is in Queensland, Australia, where drivers caught texting or scrolling can face fines of up to $1,250.

To combat phone use behind the wheel, law enforcement in Australia and Europe now deploys AI-powered cameras that scan drivers in real time. Mounted on roadsides or mobile units, these high-res systems catch everything from texting to video calls. If AI flags a violation, a human officer reviews it before issuing a fine.

As for the role of tech companies? While features like Apple’s “Do Not Disturb While Driving” mode exist, they’re voluntary. No country has yet held a tech firm legally accountable for designing apps that lure users into dangerous distractions. Public pressure is building, but regulation lags behind reality.

In Smithereens, the crash wasn’t just a twist of fate—it was the inevitable outcome of a system designed to capture and hold our attention: algorithms crafted to hijack our minds, interfaces engineered for compulsion, and a culture that prizes being always-on, always-engaged, always reachable. And in the end, it’s not just Chris’s life that’s blown to smithereens—it’s our fragile illusion of control, shattered by the very tech we once believed we could master.

We tap, scroll, and swipe—chasing tiny bursts of dopamine, one notification at a time. Chris’s story may be fiction, but the danger it exposes is all too real. It’s in the car beside you. It’s in your own hands as you fall asleep. We can’t even go to the bathroom without it anymore. No hostage situation is needed to reveal the cost—we’re living it every day.

Join my YouTube community for insights on writing, the creative process, and the endurance needed to tackle big projects. Subscribe Now!

For more writing ideas and original stories, please sign up for my mailing list. You won’t receive emails from me often, but when you do, they’ll only include my proudest works.

We don’t elect governments; we elect scapegoats

Image via Thinkstock

How we love to blame one entity for everything

By Elliot Chan, Opinions Editor
Originally published in The Other Press. October 28, 2015

Isn’t it great when we can point a finger at someone and say it’s all his or her fault? It’s his fault our economy is in the sink. It’s her fault the ecosystem is dying. It’s his fault I can’t find love, a job, or a place to live. Yes, it’s always easier to blame someone rather than a group of people.

For the past few months, I’ve watched everyone on my predominantly left-wing Facebook news feed blame Stephen Harper for everything wrong with our country. What’s wrong with our country, or what’s wrong with them? So often, the government becomes the scapegoat for all our problems: our failing business, our disobedient child, and our inability to find work, romance, and better health. Instead of taking onus for our problems, we blame the government.

Guess what? The government is not looking out for you, regardless of what you think. The politician doesn’t give a hoot about all the crap you have going on in your life. If you think Justin Trudeau is going to solve all your problems—or even one of your problems—you need to face reality. If you are not taxed for this, you’ll be taxed for that. Nobody can fix what’s wrong with your life but you.

Blaming one sole entity, whether it’s your employer for holding you back or your instructor for giving you poor grades, is a self-destructive way of thinking. What you are really saying is: “I’m perfect, I don’t need to change; it’s the world around me that needs to change. It’s that one person over there who needs to change.” You’ll grow old a bitter, resentful person if this is your way of thinking.

Pointing fingers and placing blame is a defense mechanism designed to make someone else look worse than you. This is especially effective if the person is of higher rank or prestige. Remember how much Canucks fans enjoyed blaming Roberto Luongo for every hockey game lost? Whoever is at the top, we expect perfection from them, or else give them the noose. But ask yourself: can you stop more shots and win more games? Nope, but you’d like to think you can.

Don’t think. Do. Stop identifying problems as something manifested by other people. Stop investing your emotional energy on things you cannot control. Can you be a better worker? Yes. Can you force your boss to increase your wage? No. Can you vote for the candidate you like? Yes. Are you able to force others to do the same? No. People will vote or not vote for whomever they want. They’ll cheer for whatever they want. They’ll fail, succeed, and live their life without a care for you. You can blame them, but why do they care? They don’t even know you.