The Times Australia
We're told AI neural networks 'learn' the way humans do. A neuroscientist explains why that's not the case

  • Written by James Fodor, PhD Candidate in Cognitive Neuroscience, The University of Melbourne

Recently developed artificial intelligence (AI) models are capable of many impressive feats, including recognising images and producing human-like language. But just because AI can perform human-like behaviours doesn’t mean it can think or understand like humans.

As a researcher studying how humans understand and reason about the world, I think it’s important to emphasise the way AI systems “think” and learn is fundamentally different to how humans do – and we have a long way to go before AI can truly think like us.

Read more: Robots are creating images and telling jokes. 5 things to know about foundation models and the next generation of AI[1]

A widespread misconception

Developments in AI have produced systems that can perform very human-like behaviours. The language model GPT-3[2] can produce text that’s often indistinguishable from human speech. Another model, PaLM[3], can produce explanations for jokes it has never seen before[4].

Most recently, a general-purpose AI known as Gato has been developed which can perform hundreds of tasks[5], including captioning images, answering questions, playing Atari video games, and even controlling a robot arm to stack blocks. And DALL-E is a system which has been trained to produce modified images and artwork from a text description.

These breakthroughs have led to some bold claims about the capability of such AI, and what it can tell us about human intelligence.

For example, Nando de Freitas, a researcher at Google’s AI company DeepMind, argues that scaling up existing models will be enough to produce human-level artificial intelligence. Others have echoed[6] this view.

In all the excitement, it’s easy to assume human-like behaviour means human-like understanding. But there are several key differences between how AI and humans think and learn.

Neural nets vs the human brain

Most recent AI is built from artificial neural networks[7], or “neural nets” for short. The term “neural” is used because these networks are inspired by the human brain, in which billions of cells called neurons form complex webs of connections with one another, processing information as they fire signals back and forth.

Neural nets are a highly simplified version of the biology. A real neuron is replaced with a simple node, and the strength of the connection between nodes is represented by a single number called a “weight”.

With enough connected nodes stacked into enough layers, neural nets can be trained to recognise patterns and even “generalise[8]” to stimuli that are similar (but not identical) to what they’ve seen before. Put simply, generalisation refers to an AI system’s ability to take what it has learnt from certain data and apply it to new data.
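As a toy illustration of the idea described above (not any specific production system), a single artificial “neuron” can be sketched in a few lines of Python: the node multiplies its inputs by its connection weights, sums them with a bias, and squashes the result through a nonlinearity. The particular numbers below are arbitrary, chosen only to show the mechanism.

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial 'neuron': a weighted sum of inputs plus a bias,
    passed through a sigmoid nonlinearity."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # squashes the output into (0, 1)

# Two inputs, two connection weights, one bias term -- all arbitrary values.
output = neuron([0.5, -1.0], [0.8, 0.3], bias=0.1)
print(round(output, 3))  # prints 0.55
```

Training a neural net amounts to adjusting those weight numbers, across millions or billions of such nodes, until the network’s outputs match the desired ones.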

Being able to identify features, recognise patterns, and generalise from results lies at the heart of the success of neural nets – and mimics techniques humans use for such tasks. Yet there are important differences.

Neural nets are typically trained by “supervised learning[9]”: they’re presented with many examples of an input and the desired output, and the connection weights are gradually adjusted until the network “learns” to produce the desired output.

To learn a language task, a neural net may be presented with a sentence one word at a time, and slowly learns to predict the next word in the sequence.
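A heavily simplified sketch of that next-word objective, using a toy bigram counter rather than a real neural net: the “model” sees sentences one word at a time and records, for each word, which word most often follows it. Real language models learn weights rather than raw counts, but the supervised target — the next word — is the same.

```python
from collections import Counter, defaultdict

def train_bigrams(sentences):
    """Count, for each word, how often each next word follows it --
    a crude stand-in for the supervised next-word objective."""
    counts = defaultdict(Counter)
    for sentence in sentences:
        words = sentence.lower().split()
        for current, nxt in zip(words, words[1:]):
            counts[current][nxt] += 1
    return counts

def predict_next(counts, word):
    """Return the most frequently observed next word."""
    return counts[word].most_common(1)[0][0]

model = train_bigrams(["the cat sat on the mat", "the cat ran"])
print(predict_next(model, "the"))  # prints "cat" (follows "the" twice, vs "mat" once)
```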

This is very different from how humans typically learn. Most human learning is “unsupervised”, which means we’re not explicitly told what the “right” response is for a given stimulus. We have to work this out ourselves.

For instance, children aren’t given instructions on how to speak, but learn this through a complex process[10] of exposure to adult speech, imitation, and feedback.

A toddler tries to walk outdoors, with an adult guiding it by both hands
Children’s learning is assisted by adults, but they’re not fed massive datasets the way AI systems are. Shutterstock

Another difference is the sheer scale of data used to train AI. The GPT-3 model was trained on 400 billion words[11], mostly taken from the internet. At a rate of 150 words per minute, it would take a human nearly 4,000 years to read this much text.

Such calculations show humans can’t possibly learn the same way AI does. We have to make more efficient use of smaller amounts of data.

Neural nets can learn in ways we can’t

An even more fundamental difference concerns the way neural nets learn. In order to match up a stimulus with a desired response, neural nets use an algorithm called “backpropagation” to pass errors backward through the network, allowing the weights to be adjusted in just the right way.
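The error-correction idea behind backpropagation can be sketched in miniature. The following is an illustrative one-weight “network” with a squared-error loss, assumed purely for exposition: the error at the output is turned into a gradient, which is then used to adjust the weight in just the right direction. Real backpropagation applies this same step to every weight in every layer, passing the error signal backward through the network.

```python
def train_single_weight(x, target, weight, lr=0.1, steps=100):
    """Gradient descent on the single weight of a linear 'network' y = w * x.
    The loss is squared error; its gradient with respect to the weight is
    the error signal that backpropagation would pass backward."""
    for _ in range(steps):
        prediction = weight * x
        error = prediction - target   # forward pass, then compare to target
        gradient = 2 * error * x      # d(loss)/d(weight) for squared error
        weight -= lr * gradient       # nudge the weight against the gradient
    return weight

w = train_single_weight(x=2.0, target=6.0, weight=0.0)
print(round(w, 3))  # prints 3.0 -- the weight that makes 2.0 map to 6.0
```

The point of contention among neuroscientists is not this arithmetic, but whether biological neurons could ever receive such a precisely routed error signal.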

However, it’s widely recognised by neuroscientists that backpropagation can’t be implemented[12] in the brain, as it would require external signals[13] that just don’t exist.

Some researchers have proposed variations[14] of backpropagation could be used by the brain, but so far there is no evidence human brains can use such learning methods.

Instead, humans learn by making structured mental concepts[15], in which many different properties and associations are linked together. For instance, our concept of “banana” includes its shape, the colour yellow, knowledge of it being a fruit, how to hold it, and so forth.

As far as we know, AI systems do not form conceptual knowledge like this. They rely entirely on extracting complex statistical associations from their training data, and then applying these to similar contexts.

Efforts are underway to build AI that combines different types of input[16] (such as images and text) – but it remains to be seen if this will be sufficient for these models to learn the same types of rich mental representations humans use to understand the world.

There’s still much we don’t know about how humans learn, understand and reason. However, what we do know indicates humans perform these tasks very differently to AI systems.

As such, many researchers believe[17] we’ll need new approaches, and more fundamental insight into how the human brain works, before we can build machines that truly think and learn like humans.

References

  1. ^ Robots are creating images and telling jokes. 5 things to know about foundation models and the next generation of AI (theconversation.com)
  2. ^ GPT-3 (www.twilio.com)
  3. ^ PaLM (ai.googleblog.com)
  4. ^ seen before (www.cnet.com)
  5. ^ perform hundreds of tasks (www.zdnet.com)
  6. ^ echoed (medium.com)
  7. ^ artificial neural networks (theconversation.com)
  8. ^ generalise (medium.com)
  9. ^ supervised learning (www.ibm.com)
  10. ^ complex process (www.abc.net.au)
  11. ^ 400 billion words (arxiv.org)
  12. ^ backpropagation can’t be implemented (www.nature.com)
  13. ^ external signals (journals.plos.org)
  14. ^ variations (www.sciencedirect.com)
  15. ^ structured mental concepts (link.springer.com)
  16. ^ combines different types of input (www.techrepublic.com)
  17. ^ many researchers believe (www.zdnet.com)

Read more https://theconversation.com/were-told-ai-neural-networks-learn-the-way-humans-do-a-neuroscientist-explains-why-thats-not-the-case-183993
