Be Right Back: Black Mirror, Did It Age Well?

Before we discuss Be Right Back, let’s rewind and take a look at the world when this episode was first released: February 11, 2013. 

Human existence didn’t end on December 21, 2012, as predicted by the Mayan calendar. Instead, the world kept spinning, and the new year brought a lot of optimism.

The scientific community had some breakthroughs. Gene switches were confirmed, proving that certain regulatory proteins in organisms can bind to genes and thereby turn them on and off.

2012 also marked the end of a decades-long search for the Higgs boson. Discovered at the world’s largest and highest-energy particle collider in Switzerland, the Higgs boson, also known as the God particle, is the particle associated with the field believed to give other particles their mass.

In 2012, Instagram was only a two-year-old photo-sharing app that let you add fun filters, far from the social media giant it is today. Facebook purchased it for $1 billion in cash and stock, and by September, the app had over 100 million users. But an upstart called Snapchat was hot on its tail.

Tablets were the new technology battleground, as the iPad, Kindle Fire, and Microsoft Surface duked it out to become the consumer and industry standard.

The United States Department of Justice seized and shut down the one-click hosting service Megaupload and its derivative websites, striking a big blow against digital piracy and sparking a pivotal debate about Internet freedom.

In 2011, artificial intelligence came into its own. Siri arrived with the latest iOS update, and IBM’s Watson won Jeopardy!, beating former champions Brad Rutter and Ken Jennings. This leap opened up a lot of questions about our relationship with artificial life, and whether it is “life” at all.

All this happened while mental illness and addiction continued to rise. In 2012, approximately 43.7 million adults in the United States, 18.6% of all adults in the country, had experienced mental illness in the past year.

Now that we’ve recalled the state of the world entering the second month of 2013, we can talk about the first episode of Black Mirror’s second season: Be Right Back.

Did this episode age well? Are the themes still relevant? And have any of the predictions come true? If not, are they still plausible?

Let’s find out.

The Reliance on Humans and Technology

The episode opens in the pouring rain, at a moment of transition. The young couple, Martha and Ash, are moving into Ash’s old family home out in the country. It should’ve been a beautiful beginning, a new life, but tragically, the very next day, Ash dies while returning their rental truck.

Alone in a new home and pregnant, Martha is now faced with a daunting reality. Who will be there for her? 

As a society, we’ve become more isolated than ever, leading us to self-medicate like Martha or become addicted to social media like Ash. There are now so many ways to distract ourselves from our need for human support.

In 2020, when we were locked down during Covid-19 and deprived of the option to see others, a national survey reported that excessive drinking increased by 21% and Internet usage increased by 50-70%, with 50% of that time spent on social media.

At any point, we can be left alone. This is a scary thought, because the truth is, our dependency on others has not changed. As advanced as technology has become, the past couple of years of pandemic life have shown that it still fails to serve our emotional needs.

Intimacy was another thing Ash and Martha’s relationship was struggling with, whether it was their inability to satisfy each other sexually or their attempts to cope with existing trauma. In many ways, they were an apt reflection of modern-day relationships.

A study published in 2021, drawing on the National Survey of Sexual Health and Behavior, showed that adults and young people in the US were having less sex than the previous generation. While the reasons are nuanced, one theory is that people are spending so much time on social media and video games that they are connecting virtually more than sexually.

In additional qualitative research, surveyed women reported that they were often “too tired for sex” and that “they had so much else going on in their lives”.

We all need something to lean on, and with the rise of community and messaging platforms, there are more safety nets than ever. But are they really safe? Are we placing our reliance in the right platforms, the ones that won’t exploit us? Have these measures made us inattentive and addicted?

Our need to share is an essential part of what makes us human. At many poignant moments in this episode, the characters try to share their experience with others. Ash began by needing to share on social media, and Martha then needed to capture the environment in order to share it with the artificial Ash. 

The tragedy after Ash’s death was that Martha continued to live in a house full of her boyfriend’s old memories, memories that the real Ash would never be able to share with her. Robots don’t have the need to share authentically. When they do, it tends to come out as awkward reminders of cringey moments, such as Facebook’s On This Day feature, which surfaces embarrassing pictures from the past with no context.

In order to protect our mental health, many of us have removed social media or gone on a detox. But those are only platforms; we may be addicted to them, but we aren’t attached to them. What if these relationships with AI run deeper? What if it doesn’t work out? How do we divorce them? How do we delete them? How can we avoid becoming trapped by them, like Martha was when her daughter came to rely on the artificial Ash as a member of the family?

The fear of letting go is ingrained in us. Even if something isn’t working, it is much harder to let it go than it is to keep moving forward and adding on. We feel this way about people, and we can feel this way about technology.

Instead of removing the technology, we keep innovating protective measures, often creating more problems in the process. A case in point is a car that requires the driver to lock away their phone before driving, or, in reality, the iPhone feature that forces you to tap a button confirming you aren’t driving before you can use it. Is it effective? Not really; all we have to do is lie. But it’s a start. There is no removing the technology; we must coexist with it. We create something, encounter the harm it does to us, and then invent something else to protect us from it. We set the snakes loose to eliminate the rodent infestation, only to be infested with snakes.

Artificial Intelligence and the Rise of Deepfakes

The advancement of artificial intelligence in the past decade is impressive. From voice assistants such as Siri, Amazon Echo, and Google Home to facial recognition on smartphones, AIs are now integrated into our daily lives, and along the way they are learning a lot from us, and about us.

Many have voiced concerns, including Elon Musk, who said in an interview in 2014: “I think we should be very careful about artificial intelligence. If I had to guess at what our biggest existential threat is, it’s probably that. So we need to be very careful with artificial intelligence.”

Since 2016, many guidelines have been published on controlling and regulating the risks of deploying AI. Artificial intelligence doesn’t need to be evil. Much like how the artificial Ash obeys Martha’s every command even when it frustrates her, an AI will follow its operative goal without any emotional restraint. If it needs to fake emotions in order to deceive us or appeal to our human weaknesses, it won’t feel guilty doing so.

Robots programmed to replace factory workers are old news, but how far away are they from replacing roles that require more social aptitude and soft skills, such as communication, empathy, and creative thinking? Martha’s job as a designer is a fun area to explore in that sense.

Many in the creative field today are wary of how AI has penetrated the market, replicating art styles and generating ideas and rough concepts of their own. Give an AI enough data, and over time, it’ll be capable of creating its own version, whether it’s a script, design, or video. 

Machine learning models like DALL-E and Midjourney can generate unique content from simple keyword descriptions. These new tools allow people with no technical skills to produce quick images to help them communicate. Will they soon render the jobs of designers and creative professionals redundant, or will they just be another asset in their tool belts?

Related to the episode’s theme of replicating another person, deepfake technology has made huge advancements in the last decade. What began as Internet memes of Nicolas Cage’s face swapped into different movies soon escalated into deepfakes used to generate pornography and facilitate financial fraud.

Black-and-white use cases have been established for this type of technology. A clearly benign case for deepfakes is how Hollywood can use de-aging to make actors look younger, as it did for Robert De Niro in The Irishman. However, there are still some gray areas.

The use of AI to bring back the dead is at the core of this episode. One of the most controversial uses of deepfakes came in 2020, when a victim of the Parkland shooting, Joaquin Oliver, was virtually brought back to life to advocate for gun safety in a political campaign.

This event calls into question how consent for deepfakes should be acquired. In 2018, the US Senate introduced the Malicious Deep Fake Prohibition Act, leading to many more bills to prevent the use of deepfakes without the consent of the real subject. Since then, Internet platforms like Facebook, Google, Discord, and even Pornhub have taken action to ban deepfake content, much of which was deemed non-consensual, fueling the arms race between deepfake detection and deepfake production.

That’s the first of many ethical questions for legitimizing AI replication: does the person who is being replicated — in life or death — approve? Would Ash want to be brought back? Or would he want Martha to move on without him? Does he even have a say? 

The Dead and the Never Alive

The creator of Black Mirror, Charlie Brooker, came up with the idea for this episode when he considered removing a deceased friend from his contacts and found that it felt “weirdly disrespectful”.

Just because something is digital doesn’t mean letting it go is any easier. How we handle death in this new age is fascinating, as it is something social media platforms have had to reckon with. Take, for example, how Facebook allows you to appoint a legacy contact to manage your account after your death, or to have your account deleted once your death has been reported. Consider it a social media will.

There is still a lot to learn about death, even with the advances in medicine and cryogenics. What does death really mean? The thing is, when we are freshly dead, our brains are still functioning. Bringing someone back from the dead once felt like a miracle, but we now know that after breathing stops, the brain still has some activity, and hours can pass before someone is fully gone. If that’s the case, then under ideal circumstances, modern medicine can already bring someone back to life.

Will there come a day when we can upload our brain to a server and preserve it for the future? Maybe, but neuroscientists are still trying to understand how much information a brain can hold. How can we build storage large enough when we don’t know how much there is to store?

The search for immortality continues, and when the technology becomes available, there is no doubt it will be commercialized. If we are willing to pay outrageous prices for funerals, imagine how much we would pay to bring someone back.

Grief is an overwhelming emotion, and regardless of how well we prepare, when that painful day finally strikes, it can leave us reeling for weeks, months, or even years. We look for ways to dampen the horrible feeling, and while holding on may give us some temporary relief, it’s rarely the solution. In our moments of vulnerability, that’s the message we must remember. Will the solution help us move on, or is it holding us back, trapping us in the stages of grief? A helmet for a harmful pursuit. 

Since the release of Be Right Back, there have been many representations of AI in pop culture, from Scarlett Johansson in Her in 2013 to Alicia Vikander in Ex Machina in 2014. In 2016, Sophia, a humanoid robot developed by Hanson Robotics in Hong Kong, was introduced to the world. While still a bit freaky, Sophia is programmed to provide care for the elderly and, over time, to gain social skills.

Also in 2016, Miquela, aka Lil Miquela, a CGI character and virtual influencer, amassed millions of followers on Instagram. In 2018, Time magazine named her one of the 25 most influential people on the Internet.

Whether for societal good, entertainment, or marketing, AIs have come a long way from ELIZA and the other chatterbots of the 60s. While much of the technology is incredibly convincing and has come close to passing the Turing test, the good news is that, as of October 2022, none has been able to fool all the human judges.

Be Right Back plays off a universal theme of loss and rings with a long melancholy note. This episode is a reminder of the soullessness of technology, and while humans can be distracted, self-obsessive, and inconsiderate, those imperfections are what make us human. A world where everything functions off of recycled moments will never be able to fully recreate those unique brush strokes that make authentic interactions surprising, disagreeable, and real. 

We have lost so much in the past decade — lost people and lost trust in people — and while this episode acts as a warning, many of us would happily ignore it and sign up to have a Frankenstein monster of our parents, spouses, children, and friends if such a technology existed. That is why Be Right Back still resonates. It taps into the desperate part of our psyche. 

So how did this episode age? Well, moving on is not easy, and with relics of the past physically and virtually all around us, it’s only going to get harder. It makes us want to scream because as technology advances, we long for some idleness, but as this episode shows, we are already rolling down a slippery slope. 

For more writing ideas and original stories, please sign up for my mailing list. You won’t receive emails from me often, but when you do, they’ll only include my proudest works.

Join my YouTube community for videos about writing, the creative process, and storytelling. Subscribe Now!
