Arkangel: Black Mirror, Can It Happen?

Before we talk about the events in Arkangel, let’s take a look back to when this episode was first released: December 29, 2017.

One of the most high-profile celebrity parenting moments came in June 2017 when Beyoncé gave birth to twins, Sir and Rumi Carter. This announcement went viral, showcasing how celebrities influence public discussions around pregnancy, motherhood, and parenting. 

Meanwhile, the ethical debates around gene editing intensified, particularly with CRISPR technology, “designer babies”, and parental control over genetics. According to MIT, more than 200 people have been treated with experimental genome-editing therapies since the technology dominated headlines in 2017.

In December of that year, France enacted a landmark law banning corporal punishment, including spanking, marking a significant shift toward advocating for children’s rights and promoting positive parenting practices. With this legislation, France joined many of its European neighbors, following Sweden, which was the first to ban spanking in 1979, Finland in 1983, Norway in 1987, and Germany in 2000.

Earlier in the year, the controversial travel ban implemented by the Trump administration raised significant concerns, particularly regarding family separations among immigrants from several Muslim-majority countries. Later, the issue escalated with the separation of immigrant families at the U.S.-Mexico border, sparking heated discussions about children’s rights and the complexities of parenting in crisis situations. 

Moreover, the effectiveness of sex education programs came under scrutiny in 2017, particularly as some states continued to push abstinence-only approaches, potentially contributing to rising teenage pregnancy rates. This concern was again exacerbated by the Trump administration, specifically their cuts to Title X funding for teen pregnancy prevention programs.

In 2017, Juul e-cigarettes surged in popularity among teenagers. Social media played a significant role in this trend, with platforms like Snapchat and Instagram flooded with content depicting teens vaping in schools. This led to school bans and public health worries, particularly as the Juul, shaped like a conventional USB flash drive, could deliver nicotine nearly three times faster than other e-cigarettes. In the years that followed, an outbreak of vaping-related lung injuries in the U.S. would be linked to more than 60 deaths.

And that’s what brings us to this episode of Black Mirror, Episode 2 of Season 4: Arkangel. As Sara matures, her mother Marie’s inability to overcome her fears, combined with her over-reliance on technology, ends up stifling Sara’s growth. It leaves us all questioning our own reality, as cameras, sensors, and monitors are now readily accessible — and strategically marketed — to the new generation of parents.

Can excessive control hinder a child’s independence and development? Where does one draw the line between protection and autonomy in parenting? What are the consequences of being overly protective, and is the resentment that arises simply a natural cost of loving a child? 

In this video we will explore three themes of this episode and determine whether or not these events have happened and if not, whether they’re still plausible.  Let’s go! 

Love — and Overprotection

In “Arkangel”, the deep bond between Marie and her daughter Sara is established from the very beginning. After a difficult birth, Marie’s attachment is heightened by the overwhelming relief that followed. However, when young Sara goes missing for a brief but terrifying moment at a playground, Marie’s protective instincts shift into overdrive.

Consumed by fear of losing Sara again, Marie opts to use an experimental technology called Arkangel. This implant not only tracks Sara’s location but also monitors her vital signs and allows Marie to censor what she can see or experience. Driven by the anxiety of keeping her daughter safe and healthy, Marie increasingly relies on Arkangel. But as Sara grows older, the technology starts to intrude on her natural experiences, such as witnessing a barking dog or the collapse of her grandfather.

Perhaps the products most closely related to Arkangel are tracking apps like Life360, which have become popular by providing parents with real-time location data on their kids. However, in 2021, teens protested the app’s overuse, arguing it promoted an unhealthy culture of mistrust and surveillance, leading to tension between parents and children. In a number of cases, parents continue using Life360 to track their kids even after they have turned 18.

Now let’s admit it, parenting is hard — and expensive. A 2023 study by LendingTree found that the average annual cost of raising a child in the U.S. is $21,681. With all the new technology that promises to offer convenience and peace of mind, it would almost seem irresponsible not to buy a $500 product as insurance. 

The latest innovations in baby monitors include the Cubo AI, which uses artificial intelligence to provide parents with features such as real-time detection of potential hazards, including facial obstruction, crying, and dangerous sleep positions. It also offers a high-definition video feed, night vision, and the ability to capture and store precious moments.

But these smart baby monitors and security cameras have created a new portal to the external world, and therefore, new problems. In 2020, for instance, iBaby monitors were hacked. Hackers not only accessed private video streams but also saved and shared them online. In some cases, horrified parents discovered strangers watching or even speaking to their children through these monitors.

For many years, manufacturers of smart baby monitors prioritized convenience over security, allowing easy access through simple login credentials that users often don’t change. Additionally, some devices use outdated software or lack firmware updates, leaving them open to exploitation. 

As technology advances, parenting methods evolve, with a growing trend towards helicopter parenting — a style marked by close monitoring and control of children’s activities even after they pass early childhood. 

TikTok, for instance, introduced its Family Pairing mode in 2020 to help parents set screen time limits, restrict inappropriate content, manage direct messages, and control search options.

Child censorship and content-blocking tools can be effective in protecting younger children from inappropriate content; however, they can also foster resentment if overused, and no system is foolproof in filtering content.
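To see why no filter is foolproof, consider a deliberately naive sketch of keyword-based blocking (an illustration only, not how Bark, Qustodio, or any other parental-control app actually works): even a trivial misspelling slips past an exact-match blocklist.

```python
# Deliberately naive keyword filter, to illustrate why simple blocklists are
# easy to bypass. Real parental-control products use far more sophisticated
# (but still imperfect) classifiers; the blocklist words are placeholders.
import re

BLOCKLIST = {"violence", "gambling"}

def is_blocked(message: str) -> bool:
    """Return True if any blocklisted word appears as an exact token."""
    tokens = re.findall(r"[a-z]+", message.lower())
    return any(token in BLOCKLIST for token in tokens)

if __name__ == "__main__":
    print(is_blocked("a video about violence"))  # True: exact match is caught
    print(is_blocked("a video about v1olence"))  # False: a trivial misspelling slips through
```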

Many parents, however, are not using iPads simply as entertainment for their children; they are relying on the iPad as a babysitter. This hinders children from learning basic skills like patience, especially when managing something that requires focus and attention.

A 2017 study by Common Sense Media revealed that nearly 80 percent of children had access to an iPad or similar tablet, making it increasingly common for kids to be consistently online.

Bark, Qustodio, and Net Nanny are just a few apps in a growing market that offer parents control over their children’s digital activities. While these tools provide protection by monitoring texts, emails, and social media, they also allow parents to intervene. But children, like hackers, are getting more savvy as well.

A recent survey by Impero Software, which polled 2,000 secondary school students, showed that 25 percent of them admitted to watching harmful or violent content online during class, with 17 percent using school-issued devices to do so. Additionally, 13 percent of students reported accessing explicit content, such as pornography, while 10 percent used gambling sites—all while in the classroom.

Parental involvement, communication, and gradual freedom are crucial for ensuring these new technologies work as intended. However, as we’ve seen from real-world events and this episode, overreliance on technology like Arkangel, driven by a maternal fear of losing control, can become problematic. The natural impulse to protect a child hasn’t been tempered to match the power such technology grants, and it ultimately overlooks the child’s need for emotional trust and autonomy, not just physical safety.

Sex — and Discovery

In Arkangel, as Sara enters adolescence, she begins a romantic relationship with her classmate, Trick. Unbeknownst to her, her mother, Marie, uses the Arkangel system to secretly monitor Sara’s intimate moments. 

The situation reaches a breaking point when Marie uncovers the shocking truth: Sara is pregnant. Overcome with maternal love and anxiety, Marie feels compelled to act by sneaking emergency contraceptive pills into Sara’s daily smoothie — the decisive move that will forever change their relationship.

This episode highlights the conflict between natural curiosity and imposed restrictions, emphasizing the risks of interfering with or suppressing someone’s sexual experiences and personal choices. In today’s world, this mirrors the ongoing struggle faced by parents, educators, and regulators navigating the balance between sexual education, community support programs, and the natural discovery of personal identity.

Bristol Palin, daughter of Sarah Palin, was thrust into the spotlight at 17 when her pregnancy was announced during her mother’s 2008 vice-presidential campaign. As Sarah Palin had publicly supported abstinence-only education, Bristol’s pregnancy came across as somewhat hypocritical.

A year later, the TV series Teen Mom premiered and stood as a stark warning about the harsh realities of teenage pregnancy. Beneath its cheery MTV branding, the show was a depiction of sleepless nights, financial desperation, and mental health struggles. The hypocrisy of a society that glorifies motherhood but fails to support these young women is evident as innocence is ripped from their lives. This show doesn’t just reveal struggles; it exposes a broken system.

A 2022 study by the American College of Pediatricians found that nearly 54% of adolescents were exposed to pornography before age 13, shaping their early understanding of sex. With gaps in sex education, many adolescents turn to pornography to learn.

According to a report (last updated in 2023) by the Guttmacher Institute, abstinence is emphasized more than contraception in sex education across the 39 US states and Washington, D.C. that mandate sex education and HIV education. While 39 states require teaching abstinence, with 29 stressing it, only 21 states mandate information about contraception.

Many argue that providing students with information about contraception, consent, and safe sex practices leads to better health outcomes, citing lower rates of unintended pregnancies and sexually transmitted infections (STIs) in places with comprehensive programs, such as the Netherlands.

As of 2022, the U.S. had a birth rate of around 13.9 births per 1,000 teens aged 15-19, although this represents a significant decline from previous years. In contrast, the Netherlands, which has among the lowest teen pregnancy rates globally, had just 2.7 births per 1,000 teens in the same age group.

Nor can we overlook the effectiveness of “Double Dutch,” the practice of combining hormonal contraception with condoms.

The provision of contraceptives, including condoms, for minors is a topic of significant debate. While some districts, such as New York City public schools, offer free condoms as part of their health services, many believe that such decisions should be left to the parents.

However, many agree that teens who feel uncomfortable discussing contraception with their parents should still have the ability to protect themselves. A notable example is California’s “Confidential Health Information Act,” which allows minors covered under their parents’ insurance to access birth control without parental notification.

On the other hand, critics contend that such programs may undermine parental authority and encourage sexual behavior. But such matters extend beyond teenagers. 

Globally, access to contraceptives is tied to reproductive rights, and therefore, women’s rights. In the U.S., following the Supreme Court’s 2022 decision to overturn Roe v. Wade, many states have enacted stricter abortion laws.  

In 2023, the abortion pill mifepristone faced legal challenges, with pro-life advocates seeking to restrict access to medication abortions in multiple states. 

The ongoing struggle to protect reproductive rights, and the risk of sliding toward a reality where personal choices are dictated by external authorities, is upon us. This episode shows us that, just as Marie’s overreach in Arkangel results in dire consequences for Sara, society must remain vigilant in safeguarding the right to choose, ensuring that individuals maintain control over their own lives and bodies.

Drugs — and Consequences

Like sex and violence, this episode uses drugs as a metaphor for the broader theme of risky behavior and self-discovery, a process many teenagers go through. 

However, when Sara experiments with drugs, Marie becomes immediately aware of it through Arkangel’s tracking system.

By spying on her daughter, Marie takes away Sara’s chance to come forward on her own terms. Instead of waiting for Sara to open up when she’s ready, Marie finds out everything through surveillance. This knowledge weighs heavily on her, pushing her to intervene without considering what Sara actually needs.

But when it comes to drugs, is there really time for parents to wait? Does the urgency of substance abuse among teens demand immediate action? In a situation as life-threatening as drug use, doesn’t every second count? 

When rapper Mac Miller passed away from an accidental overdose in 2018, the shock rippled far beyond the music world. His death became a wake-up call, shining a harsh light on the silent struggles of teenage addiction. 

In 2022, a report from UCLA Health revealed that, on average, 22 teenagers between the ages of 14 and 18 died from drug overdoses each week in the U.S. This stark reality underscores a growing crisis, with the death rate for adolescents rising to 5.2 per 100,000, largely driven by fentanyl-laced counterfeit pills. 

This surge has led to calls for stronger prevention measures. Schools are expanding drug education programs to raise awareness of fentanyl in counterfeit pills, while many communities are making naloxone (Narcan), an opioid overdose reversal drug, more readily available in schools and public spaces.

The gateway drug theory argues that starting with something seemingly harmless and socially accepted, like marijuana or alcohol, may open the door to harder drugs over time. 

Teens who use e-cigarettes are more likely to start smoking traditional tobacco products, like cigarettes, cigars, or hookahs, within a short period. In a National Institutes of Health study comparing ninth-grade students, 31% of those who had used e-cigarettes transitioned to combustible tobacco within the first six months, compared to only 8% of those who hadn’t used e-cigarettes.

Developed by Chinese pharmacist Hon Lik, the first e-cigarette was patented in 2003 with the intention of aiding smokers in quitting by replicating the act of smoking while minimizing exposure to tar and other harmful substances. Vaping was promoted as a safer choice, but it also attracted a new market of non-smokers drawn in by enticing flavors.

In 2014, NJOY — a vaporizer manufacturer accused of infringing on Juul’s patents — launched a campaign with catchy slogans like “Friends Don’t Let Friends Smoke”. They strategically placed ads in bars and nightclubs, embedding vaping into social settings to help normalize the behavior, making it seem like a trendy choice.

Ten years later, this narrative has been significantly challenged, as vaping has become the most prevalent form of nicotine use among teenagers in the U.S. as of 2024.

But deep down, maybe we’re looking at drug use all wrong. Instead of just thinking about the risks, it’s worth asking why so many young people are turning to drugs in the first place. What drives them to make that choice? 

In an online survey about motivations for drug and alcohol use, conducted by the National Addictions Vigilance Intervention and Prevention Program between 2014 and 2022, nearly three-quarters (73%) of the 15,963 participating teenagers reported that they used substances “to feel mellow, calm, or relaxed.” Additionally, 44% indicated they used drugs, such as marijuana, as sleep aids.

While drug use among teenagers is a growing concern, the primary challenges young people face might not be addiction, but rather anxiety, depression, and a crippling sense of hopelessness. It is possible that a parent’s overprotectiveness can sometimes misdirect focus toward the wrong problems, leading to a dangerous reliance on technology that fails to reveal the full picture.

Whether the threat is external or tied to self-exploration, this episode of Black Mirror demonstrates how parental fears can easily transform into controlling behaviors. It reflects real-life scenarios where teens, feeling trapped or misunderstood, may seek escape through drugs, sex, or even violence.

Parents, with the best intentions, often believe they’re bringing home a protective shield for their children. Instead, the approach turns into a sword, cutting into their relationships and severing the bonds they’ve worked so hard to maintain. What they thought would keep their children safe only deepens the divide, a poignant reminder that the tools meant to protect can sometimes backfire and cause the most harm.

Join my YouTube community for insights on writing, the creative process, and the endurance needed to tackle big projects. Subscribe Now!

For more writing ideas and original stories, please sign up for my mailing list. You won’t receive emails from me often, but when you do, they’ll only include my proudest works.

USS Callister: Black Mirror, Can It Happen?

Before we talk about the events in USS Callister, let’s flash back to when this episode was first released: December 29, 2017.

In March 2017, Nintendo shook up the gaming industry with the release of the Nintendo Switch, a hybrid console that could be used both as a handheld and a home system. Its flexibility and the massive popularity of games like The Legend of Zelda: Breath of the Wild catapulted it to success with over 2.74 million units sold in the first month. 

The same year, Nintendo also released the Super NES Classic, a mini version of their 90s console that left fans scrambling due to shortages.

In the realm of augmented and virtual reality, 2017 also marked important strides. Niantic introduced stricter anti-cheating measures in Pokémon GO, while Oculus revealed the Oculus Go—a more affordable, standalone VR headset designed to bring immersive experiences to more people. Games like Lone Echo pushed the limits of VR, showcasing futuristic gameplay with its zero-gravity world.

However, in the real world, there were significant conversations about the risks of excessive gaming, particularly in China, where new regulations were put in place to limit minors’ time and spending on online games. These shifts in culture raised awareness around the addictive potential of immersive digital environments.

No, it was not all fun and games — in fact, there was a lot of work as well. The year was also defined by controversies in the workplace. In October 2017, the Harvey Weinstein scandal broke, igniting the #MeToo movement and leading to widespread discussions about abuse of power, harassment, and accountability.

Uber was rocked by similar revelations earlier in the year, with a blog post by former engineer Susan Fowler shedding light on a toxic work environment, which ultimately led to the resignation of CEO Travis Kalanick. 

Google wasn’t exempt from these cultural reckonings either, with the firing of software engineer James Damore after his controversial memo questioning the company’s diversity efforts went viral. 

In his memo, titled “Google’s Ideological Echo Chamber,” Damore argued that the underrepresentation of women in tech isn’t simply due to discrimination but is also influenced by biological differences between men and women. He further claimed that Google should do more to foster an environment where conservative viewpoints, like his, can be freely expressed.

And that brings us to this episode of Black Mirror. Episode 1, Season 4 — USS Callister. This episode combines the excitement of virtual reality with a chilling exploration of power, control, and escapism. 

Much like the controversies of 2017, it asks hard questions: How do we balance the benefits of technology with the ethical implications of its use? What happens when someone with unchecked power has control to live out their darkest fantasies? And finally, how do we confront the consequences of our gradual immersion in digital worlds? 

In this video, we’ll explore three key themes from USS Callister and examine whether similar events have happened—and if they haven’t, whether or not they are still plausible. Let’s go! 

Toxic Workplace

In this episode, we follow Robert Daly, a co-founder and CTO of a successful tech company, Callister. Despite his critical role in the company, Daly is overshadowed by his partner, James Walton, the CEO. Daly’s lack of leadership skills is evident, creating a strained work environment where he is seen as ineffective.

However, in the modified version of the immersive starship game Infinity — a game developed by Callister — Daly lives out his darkest fantasy by assuming the role of a tyrannical captain in a replica of his favorite show, Space Fleet. Here, he wields absolute control over the digital avatars of his employees, who are trapped in the game and forced to obey his every command. This exaggerated portrayal of Daly’s need for power not only reflects his real-world impediments but also highlights his troubling intentions, such as his coercive demands and manipulative actions toward his employees.

USS Callister explores themes of resistance and empowerment as the avatars begin to recognize their situation and challenge Daly’s authority. Their collective struggle to escape the virtual prison serves as a powerful metaphor that underscores the broader issue of navigating workplaces with domineering and unsympathetic employers.

When Elon Musk took over Twitter (now rebranded as X) in October 2022, his management style quickly drew criticism for its harshness and lack of consideration for employees. Musk implemented mass layoffs, abruptly cutting about half of the company’s workforce. By April 2023, Musk confirmed he had cut roughly 80% of the staff.

He also implemented a demanding work culture, requiring employees to submit one-page summaries outlining their contributions to the company in order to retain their jobs. This expectation, coupled with long hours and weekend shifts under intense pressure, reflected a disregard for work-life balance and contributed to a high-stress environment.

The rapid and drastic changes under Musk’s tenure led to legal and operational challenges; as of January 2024, Fidelity estimated that X had lost 71% of its value since Elon Musk acquired the company.

In 2020, former staff members accused Ellen DeGeneres and her management team of creating a workplace culture marked by bullying, harassment, and unfair treatment—contradicting her public persona of kindness. Following the backlash and tarnished reputation, Ellen ended her 19-season run and aired her final episode on May 26, 2022, with guests Jennifer Aniston, Billie Eilish, and Pink.

In November 2017, Matt Lauer, a longtime host of NBC’s “Today” show, was fired after accusations of sexual harassment surfaced. Following his termination, more allegations emerged from female colleagues, revealing a pattern of misconduct. Perhaps the most damning detail was Lauer’s use of a secret button under his desk that let him lock his office door without getting up, keeping other employees from walking in.

As harassment in the physical world continues to receive widespread attention, it has also found new avenues in digital spaces. 

According to an ANROWS (Australia’s National Research Organisation for Women’s Safety) report from 2017, workplace harassment increasingly moved online, with one in seven people using tech platforms to harass their colleagues. Harassment via work emails, social media, and messaging platforms became a rising issue, showing the darker side of digital communication in professional environments.

In the same year, concerns about workplace surveillance and management practices emerged, particularly at tech companies. 

Amazon was a prime example of invasive productivity tracking, where employees’ movements and actions were constantly monitored. If their performance dropped below the expected productivity rate, they risked being fired.

These challenges extended to remote work, where platforms like Slack encouraged a culture of constant availability, even after hours. 

The rise of automated tools, like HireVue’s AI-powered hiring platform and IBM’s AI-driven performance reviews, raised concerns about bias, unfair evaluations, and the lack of human empathy in the hiring and management processes.

These developments highlight broader trends in workplace dynamics, where toxic environments and power imbalances are increasingly magnified by the misuse of technology. This theme is echoed in USS Callister, where personal grievances and unchecked authority in a digital world allow one man to dominate and manipulate his employees within a disturbing virtual playground. The episode serves as a cautionary tale, illustrating how the abuse of power in both real and digital realms can lead to harmful consequences.

Stolen Identity

In USS Callister, Robert Daly’s method of replicating his colleagues’ identities in Infinity involves a disturbing form of theft. Daly uses biometric and genetic material to create digital clones of his coworkers. Specifically, he collects DNA samples from personal items, such as a lollipop discarded by a young boy and a coffee cup used by his colleague, Nanette Cole.

Daly’s access to advanced technology enables him to analyze these DNA samples and extract the personal information necessary to recreate his victims’ digital identities. These avatars, based on the DNA he collected, are trapped within the game, where Daly subjects them to his authoritarian whims.

The use of DNA in this context underscores a profound invasion of privacy and autonomy, turning personal genetic material into tools for exploitation.

Digitizing DNA involves converting genetic sequences into digital formats for storage, analysis, and interpretation. This process begins with sequencing the DNA to determine the order of nucleotides, then converting the sequence into binary code or other digital representations. The data is stored in databases and analyzed using advanced software tools. 
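As a rough illustration of that conversion step, here is a minimal sketch (not the format any real sequencing platform or database uses) that packs a nucleotide sequence into two bits per base:

```python
# Minimal sketch: pack a DNA sequence into 2 bits per base.
# Illustrative encoding only; real pipelines use richer formats (e.g. FASTQ, BAM)
# that also store quality scores and metadata.

BASE_TO_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BITS_TO_BASE = {v: k for k, v in BASE_TO_BITS.items()}

def encode(sequence: str) -> bytes:
    """Encode a DNA string (A/C/G/T) into packed bytes, 4 bases per byte."""
    data = bytearray()
    for i in range(0, len(sequence), 4):
        group = sequence[i:i + 4]
        byte = 0
        for base in group:
            byte = (byte << 2) | BASE_TO_BITS[base]
        byte <<= 2 * (4 - len(group))  # left-align a partial final group
        data.append(byte)
    return bytes(data)

def decode(data: bytes, length: int) -> str:
    """Recover the original sequence; `length` is the number of bases."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            bases.append(BITS_TO_BASE[(byte >> shift) & 0b11])
    return "".join(bases[:length])

if __name__ == "__main__":
    seq = "GATTACA"
    packed = encode(seq)
    print(packed.hex(), decode(packed, len(seq)))  # compact storage, lossless round trip
```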

These technologies enable personalized medicine, genetic research, and ancestry analysis, advancing our understanding of genetics and its applications. Key players in this field include companies like Illumina and Thermo Fisher Scientific, as well as consumer services like 23andMe and Ancestry.com.

As more of our genetic data is stored in databases, our personal information becomes increasingly vulnerable. Hackers, scammers, and malicious actors are constantly seeking new ways to exploit data for profit. 

One example is the 2020 Twitter hack, which saw the accounts of major public figures like Elon Musk and Joe Biden hijacked to promote a cryptocurrency scam. The breach not only caused financial losses for unsuspecting followers but also raised alarms about the security of our most-used platforms. 

In 2022, a phishing attack targeted Microsoft Office 365, employing a tactic known as consent phishing to exploit vulnerabilities in multi-factor authentication. In some cases, the attackers impersonated the US Department of Labor and tricked users into granting access to malicious applications and exposing sensitive data such as emails and files. 

In 2024, a BBC investigation revealed an almost 60% increase in romance scams, where individuals used fake identities to form online relationships before soliciting money under false pretenses. 

Similarly, there has also been a rise in sextortion scams targeting college students, where scammers manipulated their victims into compromising situations and demanded ransoms, threatening to release the sensitive material if they didn’t comply.

Jordan DeMay, a 17-year-old high school student from Michigan, died by suicide in March 2022 after being targeted in a sextortion scam that was traced to two Nigerian brothers, Samuel and Samson Ogoshi, who were later arrested and extradited to the U.S. on charges of conspiracy and exploitation.

These instances of identity exploitation mirror another concerning trend: the misuse of genetic data. In 2019, GEDmatch—the database that helped catch the Golden State Killer—experienced a breach that exposed genetic data from approximately one million users who had opted out of law enforcement access. The breach allowed law enforcement to temporarily access private profiles without consent, raising significant privacy concerns about the security of sensitive personal data.

Some insurance companies — specifically those in Canada — have been criticized for misusing genetic data to raise premiums or deny coverage, especially in areas like life or disability insurance. This highlights the importance of understanding your policy and legal rights, as insurance companies are not always complying with new regulations such as the Genetic Non-Discrimination Act (GNDA).

All this illustrates the terrifying possibilities shown in USS Callister, that our most intimate data — our identity — could be used against us in ways we never imagined. Whether through hacked social media accounts, phishing scams, or stolen genetic data, the digital age has given rise to new forms of manipulation.

Stuck in a Game

In USS Callister, the very avatars Daly dominates end up outwitting him in a thrilling turn of events. Led by Nanette Cole, the trapped digital crew formulates a bold plan to break free. While Daly is preoccupied, the crew triggers an escape through a hidden wormhole in the game’s code that forces an upgrade. They outmaneuver Daly by transferring themselves to a public version of the game and locking him out for good. As the avatars seize their freedom, Daly, once the ruler of his universe, is left trapped in isolation — doomed.

For anyone who has ever been drawn into the world of video games, “trapped” feels like a fitting description.

Some games, such as Minecraft or massively multiplayer online games (MMOs), have an open-ended structure that allows for infinite play. Without a defined ending, players can easily become absorbed in the game for hours at a time.

Games also tap into social connectivity. Multiplayer games like Fortnite and World of Warcraft foster relationships, forming tight-knit communities where players bond over shared experiences. Much like social media, this sense of connection can make it more difficult to disengage, as players feel a part of something bigger than themselves.

In both USS Callister and real-world video games, a sense of progression and achievement is built into the experience. Daly manipulates his world to ensure a constant sense of control and success that fails to replicate real life, where milestones and mastery can take weeks, months, and years. 

Video games are highly effective at captivating players through well-designed reward systems, which often rely on the brain’s natural release of dopamine. This neurotransmitter, associated with pleasure and motivation, plays a key role in the cycle of gratification. This behavioral reinforcement is seen in other addictive activities, such as gambling.
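A minimal sketch of that reinforcement pattern, the variable-ratio reward schedule borrowed from gambling research, is shown below; the odds and payout numbers are invented for illustration and do not model any particular game.

```python
# Minimal sketch of a variable-ratio reward schedule, the reinforcement
# pattern many game reward loops resemble. All numbers here are invented
# for illustration; no specific game is being modeled.
import random

def play_session(actions: int, reward_chance: float = 0.25, seed: int = 42) -> list[int]:
    """Simulate repeated actions, each with a fixed chance of an unpredictable payout."""
    rng = random.Random(seed)
    payouts = []
    for _ in range(actions):
        # The player never knows which try will pay off, which is exactly
        # what makes "just one more try" so compelling.
        payouts.append(rng.randint(10, 100) if rng.random() < reward_chance else 0)
    return payouts

if __name__ == "__main__":
    session = play_session(actions=20)
    print("rewarded on tries:", [i for i, p in enumerate(session, 1) if p > 0])
    print("total payout:", sum(session))
```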

Game developers employ a multitude of psychological techniques to keep players hooked — trapped. 

The World Health Organization’s (WHO) recognition of “gaming disorder” in 2022 underscores the growing concern surrounding video game addiction. Lawsuits against the makers of major games like Call of Duty, Fortnite, and Roblox have shown serious efforts to hold companies accountable for employing addictive algorithms similar to those found in casinos.

Real-world tragedies have also shed light on the dangers of excessive gaming. In Thailand, for instance, 17-year-old Piyawat Harikun died following an all-night gaming marathon in 2019, sparking debates over the need for better safeguards to protect young gamers. Cases like this hammer home the need for stronger regulations around how long players, especially minors, are allowed to engage in these immersive experiences.

The financial aspects of gaming, such as esports, have created incentives for players to commit to their addiction as a vocation. Players who make money through competitive gaming or virtual economies may find themselves stuck in cycles of excessive play to maintain or increase their earnings.

This phenomenon is evident in high-profile cases like Kyle “Bugha” Giersdorf, who won $3 million in the Fortnite World Cup, or Anshe Chung, aka the Rockefeller of Second Life, a virtual real estate mogul. 

Then there is the rise of blockchain-based games like Axie Infinity, a colorful game powered by Ethereum-based cryptocurrencies, which introduces financial speculation into the gaming world. These play-to-earn models push players to engage excessively in the hopes of earning monetary rewards. However, they also expose players to significant financial risks, as in-game currency values fluctuate unpredictably, often leading to a sunk-cost fallacy where players feel compelled to continue investing despite diminishing returns.

This episode reminds us that we can often find ourselves imprisoned by our work. Yet, the cost of escapism can be high. While technology may seem to open doors to new worlds, what appears to be an endless realm of freedom can, in reality, be a staircase leading to an inevitable free-fall. USS Callister highlights the abyss that technology can create and the drain it has on our most valuable resource — time. This episode serves as a warning: before we log in at the behest of those in power, we should remember that what happens in the virtual world will ultimately ripple out into the real one.

Join my YouTube community for insights on writing, the creative process, and the endurance needed to tackle big projects. Subscribe Now!

For more writing ideas and original stories, please sign up for my mailing list. You won’t receive emails from me often, but when you do, they’ll only include my proudest works.

What Will Happen If Electronic Technology Disappears? | World Building Questions

As I’m editing this post-apocalyptic story, I’ve realized that I may have left some major questions unanswered. I also wonder if trying to answer them at this point in my process may be a distraction from what I should be doing — writing more.

I figured it wouldn’t hurt to stop and ponder. In a previous video, I proposed the question: What Can Bring Us Back to the Stone Ages?

In this follow-up video, I explore the three pillars of modern-day society:

  1. Health
  2. Wealth
  3. Knowledge

and analyze how the sudden disappearance or failure of electronic technology would affect the Western world, which relies so heavily on these innovations.

Follow my writing journey on YouTube! 

No More Technology: What Can Bring Us Back to the Stone Ages? | World Building Questions

No more technology — and when I say technology, I’m referring specifically to electronics.

What can bring us back to the stone ages or a time before computers and smartphones?

In this episode of The Other Epic Story Vlog, I discuss a key factor that drives my story, which is set in a tech-free future, a time when people live confined to their local communities and are limited by the resources and information they have.

While I discuss the consequences in the story, I never address the cause, and as a writer and world builder, I myself — at a minimum — must understand it. So what could cause the world to black out and lose its hold on the riches of modern technology?

I jot down three possible causes:

1) Solar flares: Massive eruptions of plasma and magnetic fields from the sun, known as coronal mass ejections, could fry all of Earth’s power grids.

2) Limited resources: Many of the elements used to develop electronics are in limited supply, such as ruthenium (used for hard drives) and hafnium (used for processors).

3) Human error: As we automate more and more of our processes, human error can cause a chain reaction in which unanticipated issues override fail-safes.

… and a bonus factor… oooohhhh….

Follow my writing journey on YouTube!   

 

Do what the robots can’t


If robots can replace your job, it’s not the robots’ fault

By Elliot Chan, Opinions Editor
Formerly published in The Other Press. June 8, 2016

Robots are here to make our lives easier, and in the process, they are eliminating a lot of menial work. We see it everywhere, from banking to the food industry to all areas of retail and trade. These industries employ people all across the globe. The idea of all of these jobs becoming obsolete is a bit concerning, since there has yet to be a real replacement.

When a worker is made redundant, replaced by a machine or an algorithm, the situation is met with pessimism. The notion is that if you don’t know how to code, you might as well starve. However, the rise of the automated, robotic workforce is something we have been experiencing since our youth. We grew up with computers and machines, so why is it so shocking when a new system replaces us on the assembly line?

In tech, there is a lot of talk about disruption. Is this software or hardware capable of changing the way we accomplish a task? Can the iPhone change the way we pay our bills? Will streaming services make video rental stores relics? How can virtual reality change the way we shop online? Not only do innovators consider how a product can disrupt an industry, they consider the industries ripe for disruption. They find the problem before the solution.

A controversial disruption at the moment is with driverless cars. The technology is there, but regulations and lobbyists are preventing it from reaching the next phase. The transportation network Uber has openly announced that as soon as driverless cars are available, clients will be able to select that as an option when hailing a ride. Who’s angry with this? Taxi drivers, chauffeurs, transit people, and anybody else that makes a living working in transportation.

Only time will tell if driverless cars will become a fixture in our daily society. But if I were a taxi driver, I wouldn’t bank on my driving skills to sustain me for the next 40 years; I’d start developing some other set of skills just in case. Learning how to fix cars could be another skill to add. That’s just a thought.

So often we are pessimistic when it comes to new technology stealing our jobs. But these technologies didn’t sneak up on us. They took years and years of development. They are all over the news, and they gave us every opportunity to stay relevant. Like a rival, technology pushes us to improve. You cannot and should not fight against it; as history has shown, humans will veer to the side of convenience, profitability, and security.

Turn the lens onto yourself and ask: “How will a robot disrupt my career?” Then, either build that robot, or be better than it. The question is not how robots can replace you, but how can you replace the robots when they come? I’m confident that you will figure it out.

We are only as smart as our AI


What Microsoft’s bot, Tay, really says about us

By Elliot Chan, Opinions Editor
Formerly published in The Other Press. April 7, 2016

While we use technology to do our bidding, we don’t always feel that we have supremacy over it. More often than not, we feel dependent on the computers, appliances, and mechanics that help our every day run smoothly. So, when there is a chance for us to show our dominance over technology, we take it.

As humans, we like to feel smart, and we often do that through our ability to persuade and influence. If we can make someone agree with us, we feel more intelligent. If we can change the way a robot thinks—reprogram it—we become gods indirectly. That is something every person wants to do. When it comes to the latest Microsoft intelligent bot, Tay, that is exactly what people did.

I have some experience chatting with artificial intelligence and other automated programs. My most prevalent memory of talking to a robot was on MSN Messenger—back in the day—when I would have long-winded conversations with a chatbot named SmarterChild. Now, I wasn’t having deep introspective talks with SmarterChild. I was trying to outsmart it. I’d lead it this way and that, trying to make it say something offensive or asinine. Trying to outwit a robot that claims to be a “smarter child” was surprisingly a lot of fun. It was a puzzle.

When the programmers at Microsoft built Tay, they probably thought it would have more practical uses. It was designed to mimic the personality of a 19-year-old girl, and Microsoft wanted Tay to be a robot that could genuinely engage in conversations. However, without the ability to understand what she was actually copying, she had no idea that she was being manipulated by a bunch of Internet trolls. She was being lied to and didn’t even know it. Because of this, she was shut down after barely a day, having adapted to the trolls and begun spouting offensive things over Twitter.

I believe we are all holding back some offensive thoughts in our head. Like a dam, we keep these thoughts from bursting through our mouths in day-to-day life. On the Internet we can let these vulgar thoughts flow. When we know that the recipient of our thoughts is a robot with no real emotion, we can let the dam burst. There is no real repercussion.

In high school, I had a pocket-sized computer dictionary that translated English into Chinese and vice versa. This dictionary had an audio feature that pronounced words for you to hear. Obviously, what we made the dictionary say was all the words we weren’t allowed to say in school. I’m sure you can imagine a few funny ones. That is the same as what people do with bots. To prove that the AI is not as smart as us, we make it do what we don’t. At the moment, I don’t believe the general public is sophisticated enough to handle artificial intelligence in any form.

People who need people rating apps


Controversial app Peeple is everything tech shouldn’t become

By Elliot Chan, Opinions Editor
Formerly published in The Other Press. March 23, 2016

I hate that review apps exist to begin with. While customer reviews are one of the most trusted forms of marketing, I have little respect for the people who leave negative reviews. What can I say? When I read reviews, I often feel that those who wrote them are small people who need to do whatever it takes to feel big. They are using their power of free speech to harm a business.

Now, it gets worse. There is now an app that allows you to rate and review people’s reputations. The app is called Peeple, and it is gaining a lot of negative publicity. Why not? Remember when you were young, and your parents taught you that if you have nothing good to say, then you shouldn’t say anything at all? This teaching should not change in the digital age, but I believe it has. Take a look at all the bullshit comments on social media if you don’t believe me.

It’s clear that things are going to get worse before they are going to get better in this realm.

Interacting with people shouldn’t be the same as buying electronics. You shouldn’t go online, Google someone, and compare them with other people. The thing is, I know what the creators and founders of Peeple were thinking: so many people are shitty. Yes, of course, people are shitty, but that is life. Dealing with shitty people, whether they are in front of you in the Starbucks lineup or they are your parents, is a part of human existence. Technology does not make people more considerate or more caring, especially not an app that encourages people to treat others like businesses.

If you were a business, you would separate the job from your personal identity. You would have a website, a LinkedIn page, a Facebook fan page, or anything else where you can have a two-way channel, where there can be communication and progress toward resolving an issue—should there be one. However, if it is just a review or a rating system, rarely is there any valuable feedback. It’s more or less just a rant or words of caution. Since we aren’t talking about a business but an actual human being with feelings, giving someone a one-star rating is a clear, unprovoked diss.

Let’s live in a world where we can approach each other as friends and speak honestly, rather than reviewing and rating others, harbouring animosity, and deterring others from having a genuine human experience. If you truly want to help someone, and not just judge them, you wouldn’t use an app like Peeple to express your thoughts.

And for those who really care about their online reputation, well, maybe you should work on your actual human reputation first.

How to live with Big Brother


Understanding why privacy matters

By Elliot Chan, Opinions Editor
Formerly published in The Other Press. February 17, 2016

While it isn’t necessarily the government that is tracking all your activity, the combination of all the data accumulated in day-to-day life is enough for them to know you better than your parents do. We can almost be certain that, although there is nobody watching us on a screen, our every action is recorded, filed away, and capable of being pulled out and evaluated by those with the credentials to do so. Most often, those people aren’t people at all; they are just marketing algorithms designed to match your queries and daily behaviours with advertisements.

Now, Google isn’t out to embarrass you by exposing your search queries. TransLink will not send a message to your girlfriend if you decide to make a mysterious trip out to Surrey. Bell is not going to let your boss know that you’ve been trash talking him with your friends. These things don’t benefit the company, so don’t be paranoid.

It’s hard to trust the motives of big corporations, but I always bring it back to one question: Does such and such action cause them to lose or gain money? If your behaviour continues to benefit the business you get the service from, you can keep going merrily by—as long as you are not committing any heinous crimes.

There is no way around it; we need to trust companies to use our information ethically. However, we need to also be conscious of what information we are haphazardly giving away. See, privacy matters. Without privacy, you’ll lose control of your own life. The companies will own it.

Any sort of meaningful self-development does not happen in a group, or with Sauron’s eye watching you. It happens independently, not on Facebook and not while Googling. I’m not talking about education or improving your business skills or finding online romance, I’m talking about the growth that occurs when you are allowed room to breathe. This is the type of growth that has no deadlines and no guidance. This in essence is the life you’ll live.

We have become so obsessed with sharing our experiences on social media, telling everything we do to Big Brother, that we are forgetting the real point of our pursuits: to create memories that aren’t saved on any hard drive, except the one between our ears. We are scared of people listening in on us, but we have stopped listening to ourselves.

The season is changing. It’ll be a warm summer, I predict. This is an opportunity to get away from the information highway and do something nobody on the Internet will know. Big companies are constantly collecting data, and so should you. The good thing is, you get to decide what information you want to store: what’s spat out to you by those online or what you discover yourself. It’s up to you.

Tell me what I want


How Apple is changing our outlook on technology

By Elliot Chan, Opinions Editor
Formerly published in the 1976-theme issue of The Other Press. January 13, 2016

The old way of thinking: Nobody owns a computer because nobody needs one. Take a look at the new Apple 1, which came on sale this summer (July 1976). It looks like something a high school student built during the final days before the science fair. That crummy looking machine is worth the equivalent of a month’s salary for many middle-class people.

Few consumers want computers, and even fewer understand them, but that is not how trends should continue. People are generally content with living day to day within a routine. Technology doesn’t abide by those rules. Technology disrupts, but it often takes many years for it to do so. The same way the printing press, the wristwatch, and the steam engine changed the world, I believe that computers can do the same.

Yet when I approach every new technology—like the Apple 1—I still say: “Nah! I don’t need that. I’m happy with what I have.” I’m happy writing this article out on a pen and paper, then transcribing it on a word processor, and transferring that to a printing press. That’s not a big deal to me.

Steve Jobs, the young and hip co-founder of Apple, said: “People don’t know what they want until you show it to them… Our task is to read things that are not yet on the page.” It’s an inspiring quote that perfectly separates innovators from us mere mortals. This quote allows me to be even more optimistic about technology, knowing that in most cases it will win over.

Will there one day be virtual reality, mobile payments, or robot vacuum cleaners available to consumers? Probably. It could happen within the year, or it could take 40 years, but to write off technology is an ignorant reaction to change. We all need to push in the direction of progress. We need to push with Jobs and the Apple 1.

It’s easy to look to the past and think about how stupid those people were for doing things the “old” ways. Yet, what would the future generation say about us? Yes, technology is stealing jobs away from hardworking people, but I don’t believe that is a bad thing. I believe that people, like technology, should evolve. We need to start thinking like innovators and less like routine-orientated consumers. We should not just pick a job and stick with it. If you look at it, pretty much every job could be replaced with a robot one day, but I ask you this: how will you work with the technology?

Computers aren’t stealing jobs away from people. Computers are changing the way people work. Take this example: bank tellers are losing jobs to automated-teller machines. But then again, what are tellers doing to respond to this? They must innovate. We must see what has yet to be written.

In-app purchase games are out of line


What’s to blame: tech-company trickery or poor parenting?

By Elliot Chan, Opinions Editor
Formerly published in The Other Press. October 21, 2015

On October 9, Kanye West took to Twitter to give mobile game developers a little piece of his mind: “That makes no sense!!! We give the iPad to our child and every five minutes there’s a new purchase!!!” He added: “If a game is made for a two-year-old, just allow them to have fun and give the parents a break for Christ sake.” Empathic and on point as West was, he also neglected to mention that the mother of his child has one of the most lucrative mobile games on the market. I’m speaking of Kim Kardashian: Hollywood, a game where you get to prepare the reality TV star for the red carpet.

It’s hard to sympathize with West, because… well, who gives a shit what he does financially. However, many parents out there are facing the same problem as the multi-millionaire rapper. They give their kids an iPad, as a replacement for a doll, a toy car, or a deck of Yu-gi-oh! cards, and expect them to have fun and be responsible. Now, I don’t know too many two-year-olds that are able to conceptualize virtual money, because many adults still aren’t able to. Check around to see how many of your grown-up friends have credit card debt. It’s unfair to put the onus on children to be responsible while playing, so who should take the blame?

We blame cigarette companies for giving us cancer, we blame fast food companies for making us fat, and of course we should blame mobile game companies for leaking money out of our virtual wallets. Some consider the freemium-style of business brilliant, while others consider it trickery. In terms of games, it begins as a sample, usually free, to get the user hooked, and then they up the price once the player is addicted. While I believe the game companies have done a brilliant job in harnessing this, I don’t believe their intentions were malicious. And, as a businessman, West should know that it’s just supply and demand. If the player wants to skip a level, earn more stock, or gain leverage over an opponent—but they don’t want to put in the time—they can upgrade with a monetary solution.

Surprise, your kids are going to cost you money! Freemium games aren’t the culprit; they are just another avenue for your money to be lost. The same way you don’t give your children your credit card and PIN at the toy store, you shouldn’t give them an iPad with full access until they understand the reality of their purchases. Educate your children about frivolousness and how each $0.99 click adds up.

You cannot stop businesses from creating products for profit, even if they do target children. Don’t believe me? Look at McDonald’s. You can’t win that way. What you can do is pull the iPad away from your child if he or she abuses it. Be a good parent and teach your children from an early age the value of money, and how it relates to the technology they are using. Organizations aren’t going to educate your children for you… or maybe there is an app for that.