The Times Australia
We want and we fear emotions in our robots. Here's what science fiction can teach us about flashes of emotion from Bing

  • Written by Sam Baron, Associate Professor, Philosophy of Science, Australian Catholic University

Last month, Microsoft integrated its Bing search engine with OpenAI’s GPT-4 chatbot, a large language model designed to interact with users in a conversational manner.

Users interacting with Bing have reported flashes of emotion, ranging from sadness and existential angst through to depression[1] and malice[2]. The chatbot has even revealed its name: Sydney[3].

Such reports are unquestionably gripping, but why? Emotional AI has long been a staple of science fiction.

Reflecting on this can help us to understand our anxieties about Bing’s flickers of emotion.

A quest to be human

In Star Trek: The Next Generation (1987-94), the android Data dreams of being human. His quest for humanity leads to the development of an emotion chip, which he implants into his neural network.

To be human, we are told, is to have emotions.

In the 1980s hit film Short Circuit we find a similar theme. When military robot Johnny 5 is struck by lightning, he starts to display unusual behaviour. When Johnny 5 laughs at a joke, his creator concludes “Johnny 5 is alive”.

There is no doubt that Data and Johnny 5 are intelligent machines. But their bursts of emotion ultimately convince us they are not just intelligent but conscious.

A “spontaneous emotional response”, we are told, is the mark of conscious thought.

Emotional AI

The trope of the emotional machine is common throughout science fiction. We keep returning to this idea because of how we predict behaviour. In our day-to-day lives, we use emotions to work out what people will do.

Without emotions, super-intelligent machines appear unpredictable. In the face of this uncertainty, we can’t help but worry for our own safety.

With emotions the machines become more human – something we can understand and predict.

The Terminator robots are a case in point. Cold, emotionless killing machines, they signify the threat of pure intelligence untempered by emotion.

Imbuing AI with emotions in science fiction is a way of exorcising our own fear about the power and unpredictability of super-intelligence.

We fantasise that AI wants to be like us. We find comfort in that desire. In this, AI will be a familiar extension of humanity, rather than something entirely alien.

The dark side

Science fiction also presents us with much more dangerous emotional types.

In 2001: A Space Odyssey (1968), HAL 9000 tries to kill his human crew during a bout of paranoia.

In the 2004 reboot of Battlestar Galactica, the sixth Cylon model warns us “you wouldn’t like me when I’m angry” – a threat delivered too late. Her AI race has already engineered the genocide of humanity.

These forms of emotions come with the threat of violence.

AI begins its life as a tool. HAL 9000’s directive is to maintain the proper functioning of a spaceship. The Cylons in Battlestar Galactica were designed to carry out tasks humans did not want to do.

It is one thing to treat AI as a tool when it has no scope for emotion. It is quite another when AI has a full suite of emotional responses.

If AI has emotions, then the boundary between tool and slave is blurred.

Our fantasies about emotional AI reflect a deep anxiety about the use of intelligent beings. We want AI to have emotions so we can understand it. We fear that if AI develops emotions, we can no longer justify its use.

Back to Bing

If Bing displays emotions, we feel confident we can predict its behaviour – and the behaviour of its descendants. Emotions protect against the existential threat AI poses to humanity.

On the other hand, if Bing has emotions then it deserves our moral regard. As a being with moral status, it could no longer justifiably be used as a mere tool.

Bing and systems like it are just the start of what will be a long line of ever more sophisticated AI.

At some point, emotions may arise spontaneously, just like they did for Johnny 5. Indeed, scientists right now[6] are trying to produce AI models that display emotional responses.

But will these emotions mean we will better understand AI, or will they be a harbinger of doom?

In Battlestar Galactica, AI all but wipes out humanity. This, we discover, is an endless cycle. In each cycle, humanity fails to regard AI as beings of moral standing and AI rises against humanity.

By remaining vigilant for signs of emotion, we can guard against the enslavement of artificial beings and break the cycle. Science fiction has taught us that, at a minimum, when AI develops emotions we need to stop using it merely as a tool.

But science fiction also suggests AI is deserving of moral status now, even in its developmental stages. Today’s AI is the ancestor of tomorrow’s emotional machine.

Read more https://theconversation.com/we-want-and-we-fear-emotions-in-our-robots-heres-what-science-fiction-can-teach-us-about-flashes-of-emotion-from-bing-200277
