The Times Australia
We're told AI neural networks 'learn' the way humans do. A neuroscientist explains why that's not the case

  • Written by James Fodor, PhD Candidate in Cognitive Neuroscience, The University of Melbourne

Recently developed artificial intelligence (AI) models are capable of many impressive feats, including recognising images and producing human-like language. But just because AI can perform human-like behaviours doesn’t mean it can think or understand like humans.

As a researcher studying how humans understand and reason about the world, I think it’s important to emphasise the way AI systems “think” and learn is fundamentally different to how humans do – and we have a long way to go before AI can truly think like us.

Read more: Robots are creating images and telling jokes. 5 things to know about foundation models and the next generation of AI[1]

A widespread misconception

Developments in AI have produced systems that can perform very human-like behaviours. The language model GPT-3[2] can produce text that’s often indistinguishable from human speech. Another model, PaLM[3], can produce explanations for jokes it has never seen before[4].

Most recently, a general-purpose AI known as Gato has been developed which can perform hundreds of tasks[5], including captioning images, answering questions, playing Atari video games, and even controlling a robot arm to stack blocks. And DALL-E is a system which has been trained to produce modified images and artwork from a text description.

These breakthroughs have led to some bold claims about the capability of such AI, and what it can tell us about human intelligence.

For example, Nando de Freitas, a researcher at Google’s AI company DeepMind, argues that scaling up existing models will be enough to produce human-level artificial intelligence. Others have echoed[6] this view.

In all the excitement, it’s easy to assume human-like behaviour means human-like understanding. But there are several key differences between how AI and humans think and learn.

Neural nets vs the human brain

Most recent AI is built from artificial neural networks[7], or “neural nets” for short. The term “neural” is used because these networks are inspired by the human brain, in which billions of cells called neurons form complex webs of connections with one another, processing information as they fire signals back and forth.

Neural nets are a highly simplified version of the biology. A real neuron is replaced with a simple node, and the strength of the connection between nodes is represented by a single number called a “weight”.
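To make this concrete, here is a minimal sketch of a single artificial node, assuming a simple threshold activation (real networks use smooth activations and stack many such nodes into layers):

```python
# A minimal sketch of one artificial "neuron" (node). Each connection's
# strength is a single number (a weight); the node sums its weighted
# inputs and "fires" if the total crosses a threshold.

def node(inputs, weights, bias):
    """Weighted sum of inputs, passed through a simple threshold."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0

# Two inputs, with connection weights of 0.6 and -0.4.
print(node([1.0, 1.0], [0.6, -0.4], 0.0))  # fires: 0.6 - 0.4 = 0.2 > 0
```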

With enough connected nodes stacked into enough layers, neural nets can be trained to recognise patterns and even “generalise[8]” to stimuli that are similar (but not identical) to what they’ve seen before. Simply, generalisation refers to an AI system’s ability to take what it has learnt from certain data and apply it to new data.

Being able to identify features, recognise patterns, and generalise from results lies at the heart of the success of neural nets – and mimics techniques humans use for such tasks. Yet there are important differences.

Neural nets are typically trained by “supervised learning[9]”: they’re presented with many examples of an input and the desired output, and the connection weights are gradually adjusted until the network “learns” to produce the desired output.

To learn a language task, a neural net may be presented with a sentence one word at a time, and slowly learns to predict the next word in the sequence.
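The weight-adjustment idea can be illustrated with a toy example. This sketch trains a single weight to match labelled examples by repeatedly nudging it in the direction that reduces the error (real networks adjust millions of weights at once):

```python
# A toy illustration of supervised learning: nudge one weight until the
# output matches the desired target for each labelled example.

def train(examples, weight=0.0, lr=0.1, epochs=50):
    for _ in range(epochs):
        for x, target in examples:
            prediction = weight * x
            error = target - prediction   # how far off we are
            weight += lr * error * x      # adjust toward the target
    return weight

# Learn the mapping y = 2x from labelled (input, desired output) pairs.
w = train([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])
print(round(w, 2))  # converges close to 2.0
```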

This is very different from how humans typically learn. Most human learning is “unsupervised”, which means we’re not explicitly told what the “right” response is for a given stimulus. We have to work this out ourselves.

For instance, children aren’t given instructions on how to speak, but learn this through a complex process[10] of exposure to adult speech, imitation, and feedback.

Children’s learning is assisted by adults, but they’re not fed massive datasets the way AI systems are. Shutterstock

Another difference is the sheer scale of data used to train AI. The GPT-3 model was trained on 400 billion words[11], mostly taken from the internet. At a rate of 150 words per minute, it would take a human more than 5,000 years of non-stop reading to get through this much text.

Such calculations show humans can’t possibly learn the same way AI does. We have to make more efficient use of smaller amounts of data.
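The back-of-the-envelope arithmetic behind that claim, assuming continuous reading at 150 words per minute:

```python
# How long would it take a human to read GPT-3's training data?
words = 400e9   # 400 billion words of training text
wpm = 150       # assumed human reading speed, words per minute

minutes = words / wpm
years = minutes / (60 * 24 * 365)  # reading non-stop, day and night
print(round(years))  # on the order of 5,000 years
```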

Neural nets can learn in ways we can’t

An even more fundamental difference concerns the way neural nets learn. In order to match up a stimulus with a desired response, neural nets use an algorithm called “backpropagation” to pass errors backward through the network, allowing the weights to be adjusted in just the right way.
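The idea can be sketched on a deliberately tiny "network" of just two weights, where the output is `w2 * (w1 * x)`. This is an illustrative toy, not any production algorithm: the error signal at the output is passed backward via the chain rule, telling each weight how to change.

```python
# A minimal sketch of backpropagation on a two-weight network:
# output = w2 * hidden, where hidden = w1 * x.

def backprop_step(x, target, w1, w2, lr=0.05):
    hidden = w1 * x
    output = w2 * hidden
    error = output - target
    # The error is propagated backward, from the output layer toward
    # the input, to compute each weight's adjustment.
    grad_w2 = error * hidden
    grad_w1 = error * w2 * x
    return w1 - lr * grad_w1, w2 - lr * grad_w2

w1, w2 = 0.5, 0.5
for _ in range(200):
    w1, w2 = backprop_step(x=1.0, target=3.0, w1=w1, w2=w2)
print(round(w1 * w2, 2))  # the network's output approaches the target 3.0
```

It is exactly this backward flow of error signals, efficient in software, that neuroscientists argue has no counterpart in biological neurons.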

However, it’s widely recognised by neuroscientists that backpropagation can’t be implemented[12] in the brain, as it would require external signals[13] that just don’t exist.

Some researchers have proposed variations[14] of backpropagation could be used by the brain, but so far there is no evidence human brains can use such learning methods.

Instead, humans learn by making structured mental concepts[15], in which many different properties and associations are linked together. For instance, our concept of “banana” includes its shape, the colour yellow, knowledge of it being a fruit, how to hold it, and so forth.

As far as we know, AI systems do not form conceptual knowledge like this. They rely entirely on extracting complex statistical associations from their training data, and then applying these to similar contexts.

Efforts are underway to build AI that combines different types of input[16] (such as images and text) – but it remains to be seen if this will be sufficient for these models to learn the same types of rich mental representations humans use to understand the world.

There’s still much we don’t know about how humans learn, understand and reason. However, what we do know indicates humans perform these tasks very differently to AI systems.

As such, many researchers believe[17] we’ll need new approaches, and more fundamental insight into how the human brain works, before we can build machines that truly think and learn like humans.

References

  1. ^ Robots are creating images and telling jokes. 5 things to know about foundation models and the next generation of AI (theconversation.com)
  2. ^ GPT-3 (www.twilio.com)
  3. ^ PaLM (ai.googleblog.com)
  4. ^ seen before (www.cnet.com)
  5. ^ perform hundreds of tasks (www.zdnet.com)
  6. ^ echoed (medium.com)
  7. ^ artificial neural networks (theconversation.com)
  8. ^ generalise (medium.com)
  9. ^ supervised learning (www.ibm.com)
  10. ^ complex process (www.abc.net.au)
  11. ^ 400 billion words (arxiv.org)
  12. ^ backpropagation can’t be implemented (www.nature.com)
  13. ^ external signals (journals.plos.org)
  14. ^ variations (www.sciencedirect.com)
  15. ^ structured mental concepts (link.springer.com)
  16. ^ combines different types of input (www.techrepublic.com)
  17. ^ many researchers believe (www.zdnet.com)

Read more https://theconversation.com/were-told-ai-neural-networks-learn-the-way-humans-do-a-neuroscientist-explains-why-thats-not-the-case-183993
