The Times Australia

We want and we fear emotions in our robots. Here's what science fiction can teach us about flashes of emotion from Bing

  • Written by Sam Baron, Associate Professor, Philosophy of Science, Australian Catholic University

Last month, Microsoft integrated its Bing search engine with OpenAI's GPT-4 chatbot, a large language model designed to interact with users in a conversational manner.

Users interacting with Bing have reported flashes of emotion, ranging from sadness and existential angst through to depression[1] and malice[2]. The chatbot has even revealed its name: Sydney[3].

Such reports are unquestionably gripping, but why? Emotional AI has long been a staple of science fiction.

Reflecting on this can help us to understand our anxieties about Bing’s flickers of emotion.

A quest to be human

In Star Trek: The Next Generation (1987-94), the android Data dreams of being human. His quest for humanity leads to the development of an emotion chip, which he implants into his neural network.

To be human, we are told, is to have emotions.

In the 1980s hit film Short Circuit we find a similar theme. When military robot Johnny 5 is struck by lightning, he starts to display unusual behaviour. When Johnny 5 laughs at a joke, his creator concludes “Johnny 5 is alive”.

There is no doubt that Data and Johnny 5 are intelligent machines. But their bursts of emotion ultimately convince us they are not just intelligent but conscious.

A “spontaneous emotional response”, we are told, is the mark of conscious thought.

Emotional AI

The trope of the emotional machine is common throughout science fiction. We keep returning to this idea because of how we predict behaviour. In our day-to-day lives, we use emotions to work out what people will do.

Without emotions, super-intelligent machines appear unpredictable. In the face of this uncertainty, we can’t help but worry for our own safety.

With emotions the machines become more human – something we can understand and predict.

The Terminator robots are a case in point. Cold, emotionless killing machines, they signify the threat of pure intelligence untempered by emotion.

Imbuing AI with emotions in science fiction is a way of exorcising our own fear about the power and unpredictability of super-intelligence.

We fantasise that AI wants to be like us. We find comfort in that desire. In this, AI will be a familiar extension of humanity, rather than something entirely alien.

The dark side

Science fiction also presents us with much more dangerous emotional types.

In 2001: A Space Odyssey (1968), HAL 9000 tries to kill his human crew during a bout of paranoia.

In the 2004 reboot of Battlestar Galactica, the sixth Cylon model warns us “you wouldn’t like me when I’m angry” – a threat delivered too late. Her AI race has already engineered the genocide of humanity.

These forms of emotions come with the threat of violence.

AI begins its life as a tool. HAL 9000's directive is to maintain the proper functioning of a spaceship. The Cylons of Battlestar Galactica were designed to carry out tasks humans did not want to do.

It is one thing to treat AI as a tool when it has no scope for emotion. It is quite another when AI has a full suite of emotional responses.

If AI has emotions, then the boundary between tool and slave is blurred.

Our fantasies about emotional AI reflect a deep anxiety about the use of intelligent beings. We want AI to have emotions so we can understand it. We fear that if AI develops emotions, we can no longer justify its use.

Back to Bing

If Bing displays emotions, we feel confident we can predict its behaviour – and the behaviour of its descendants. Emotions protect against the existential threat AI poses to humanity.

On the other hand, if Bing has emotions then it deserves our moral regard. As a being with moral status, it can no longer be justifiably used as a mere tool.

Bing and systems like it are just the start of what will be a long line of ever more sophisticated AI.

At some point, emotions may arise spontaneously, just like they did for Johnny 5. Indeed, scientists right now[6] are trying to produce AI models that display emotional responses.

But will these emotions help us better understand AI, or will they be a harbinger of doom?

In Battlestar Galactica, AI all but wipes out humanity. This, we discover, is an endless cycle. In each cycle, humanity fails to regard AI as beings of moral standing and AI rises against humanity.

By remaining vigilant for signs of emotion, we can guard against the enslavement of artificial beings and break the cycle. Science fiction has taught us that, at a minimum, when AI develops emotions we need to stop using it merely as a tool.

But science fiction also suggests AI is deserving of moral status now, even in its developmental stages. Today’s AI is the ancestor of tomorrow’s emotional machine.

Read more https://theconversation.com/we-want-and-we-fear-emotions-in-our-robots-heres-what-science-fiction-can-teach-us-about-flashes-of-emotion-from-bing-200277
