The Times Australia
The Times World News


The latest version of ChatGPT has a feature you’ll fall in love with. And that’s a worry

  • Written by Rob Brooks, Scientia Professor of Evolutionary Ecology; Academic Lead of UNSW's Grand Challenges Program, UNSW Sydney

If you’re a paid subscriber to ChatGPT, you may have noticed the artificial intelligence (AI) large language model has recently started to sound more human when you are having audio interactions with it.

That’s because the company behind the language model-cum-chatbot, OpenAI, is currently running a limited pilot of a new feature known as “advanced voice mode”.

OpenAI says this new mode[1] “features more natural, real-time conversations that pick up on and respond with emotion and non-verbal cues”. It plans[2] for all paid ChatGPT subscribers to have access to the advanced voice mode in coming months.

Advanced voice mode sounds strikingly human. Gone are the awkward gaps we are used to with voice assistants; instead, it seems to take breaths as a human would. It is also unfazed by interruption, conveys appropriate emotional cues and seems to infer the user’s emotional state from voice cues.

But at the same time as making ChatGPT seem more human, OpenAI has expressed concern[3] that users might respond to the chatbot as if it were human – by developing an intimate relationship with it.

This is not a hypothetical. For example, a social media influencer named Lisa Li has coded ChatGPT to be her “boyfriend”[4]. But why exactly do some people develop intimate relationships with a chatbot?

The evolution of intimacy

Humans have a remarkable capacity for friendship and intimacy. This is an extension of the way primates physically groom one another[5] to build alliances that can be called upon in times of strife.

But our ancestors also evolved a remarkable capacity to “groom” one another verbally[6]. This drove the evolutionary cycle in which the language centres in our brains became larger and what we did with language became more complex.

More complex language in turn enabled more complex socialising with larger networks of relatives, friends and allies. It also enlarged the social parts of our brains.

Language evolved alongside human social behaviour. The way we draw an acquaintance into friendship or a friend into intimacy is largely through conversation.

Experiments in the 1990s[7] revealed that conversational back-and-forth, especially when it involves disclosing personal details, builds the intimate sense our conversation partner is somehow part of us.

So I’m not surprised that attempts to replicate this process of “escalating self-disclosure” between humans and chatbots[8] result in humans feeling intimate with the chatbots[9].

And that’s just with text input. When the main sensory experience of conversation – voice – gets involved, the effect is amplified. Even voice-based assistants that don’t sound human, such as Siri and Alexa, still get an avalanche of marriage proposals[10].

The writing was on the lab chalkboard

If OpenAI were to ask me how to ensure users don’t form social relationships with ChatGPT, I would have a few simple recommendations.

First, don’t give it a voice. Second, don’t make it capable of holding up one end of an apparent conversation. Basically, don’t make the product you made.

The product is so powerful precisely because it does such an excellent job of mimicking the traits we use to form social relationships.

[Image: close-up of GPT-4o displayed on a smartphone screen.] OpenAI should have known the risks of creating a human-like chatbot. QubixStudio/Shutterstock[11]

The writing has been on the laboratory chalkboard since the first chatbots flickered to life nearly 60 years ago[12]. Computers have been recognised as social actors[13] for at least 30 years. The advanced voice mode of ChatGPT is merely the next impressive increment, not what the tech industry would gushingly call a “game changer”.

That users not only form relationships with chatbots but also develop deep personal feelings for them became clear early last year, when users of the virtual friend platform Replika AI[14] found themselves unexpectedly cut off from the most advanced functions of their chatbots.

Replika was less advanced than the new version of ChatGPT. And yet the interactions were of such a quality that users formed surprisingly deep attachments.

The risks are real

Many people, starved[15] for the kind of company that listens in a non-judgmental way, will get a lot out of this new generation of chatbots. They may feel less lonely and isolated[16]. These benefits of the technology should not be overlooked.

But the potential dangers of ChatGPT’s advanced voice mode are also very real.

Time spent chatting with any bot is time that can’t be spent interacting with friends and family. And people who spend a lot of time with technology[17] are at greatest risk[18] of displacing relationships with other humans.

As OpenAI identifies, chatting with bots can also contaminate existing relationships people have with other people. They may come to expect their partners or friends to behave like polite, submissive, deferential chatbots.

These bigger effects of machines on culture[19] are going to become more prominent. On the upside, they may also provide deep insights into how culture works.

References

  1. ^ OpenAI says this new mode (help.openai.com)
  2. ^ It plans (help.openai.com)
  3. ^ has expressed concern (openai.com)
  4. ^ has coded ChatGPT to be her “boyfriend” (edition.cnn.com)
  5. ^ groom one another (link.springer.com)
  6. ^ to “groom” one another verbally (www.hup.harvard.edu)
  7. ^ Experiments in the 1990s (journals.sagepub.com)
  8. ^ between humans and chatbots (academic.oup.com)
  9. ^ intimate with the chatbots (dl.acm.org)
  10. ^ an avalanche of marriage proposals (www.yahoo.com)
  11. ^ QubixStudio/Shutterstock (www.shutterstock.com)
  12. ^ nearly 60 years ago (dl.acm.org)
  13. ^ recognised as social actors (dl.acm.org)
  14. ^ users of the virtual friend platform Replika AI (theconversation.com)
  15. ^ starved (www.thelancet.com)
  16. ^ less lonely and isolated (psyche.co)
  17. ^ time with technology (www.usu.edu)
  18. ^ risk (www.sciencedirect.com)
  19. ^ effects of machines on culture (www.nature.com)

Read more https://theconversation.com/the-latest-version-of-chatgpt-has-a-feature-youll-fall-in-love-with-and-thats-a-worry-238073
