The Times Australia
The Times World News


Physics Nobel awarded to neural network pioneers who laid foundations for AI

  • Written by Aaron J. Snoswell, Research Fellow in AI Accountability, Queensland University of Technology

The 2024 Nobel Prize in Physics[1] has been awarded to scientists John Hopfield and Geoffrey Hinton “for foundational discoveries and inventions that enable machine learning with artificial neural networks”.

Inspired by ideas from physics and biology, Hopfield and Hinton developed computer systems that can memorise and learn from patterns in data. Despite never directly collaborating, they built on each other’s work to develop the foundations of the current boom in machine learning and artificial intelligence (AI).

What are neural networks? (And what do they have to do with physics?)

Artificial neural networks are behind much of the AI technology we use today.

In the same way your brain has neuronal cells linked by synapses, artificial neural networks have digital neurons connected in various configurations. Each individual neuron doesn’t do much. Instead, the magic lies in the pattern and strength of the connections between them.

Neurons in an artificial neural network are “activated” by input signals. These activations cascade from one neuron to the next in ways that can transform and process the input information. As a result, the network can carry out computational tasks such as classification, prediction and making decisions.
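As a loose sketch, this cascade of activations can be written in a few lines of code. The weights below are random placeholders for illustration, not a trained network:

```python
import numpy as np

# Toy two-layer network: activations cascade from input to hidden to output.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))   # input (4 features) -> hidden (3 neurons)
W2 = rng.normal(size=(2, 3))   # hidden -> output (2 neurons)

x = np.array([0.5, -1.0, 0.25, 1.0])   # input signal
hidden = sigmoid(W1 @ x)               # first cascade of activations
output = sigmoid(W2 @ hidden)          # second cascade: the network's "answer"
print(output.argmax())                 # most strongly activated output neuron
```

In a real system the weights would be learned from data; here the point is only that each layer transforms the signal passed to it.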

Infographic comparing natural and artificial neurons.
Johan Jarnestad / The Royal Swedish Academy of Sciences[2]

Most of the history of machine learning has been about finding ever more sophisticated ways to form and update these connections between artificial neurons. While the foundational idea of linking together systems of nodes to store and process information came from biology, the mathematics used to form and update these links came from physics.

Networks that can remember

John Hopfield (born 1933) is a US theoretical physicist who made important contributions over his career to the field of biological physics. However, the Nobel Prize in Physics was awarded for his work developing Hopfield networks[3] in 1982.

Hopfield networks were among the earliest artificial neural networks. Inspired by principles from neurobiology and molecular physics, these systems demonstrated for the first time how a computer could use a “network” of nodes to remember and recall information.

The networks Hopfield developed could memorise data (such as a collection of black and white images). These images could be “recalled” by association when the network is prompted with a similar image.

Although of limited practical use, Hopfield networks demonstrated that this type of artificial neural network could store and retrieve data in new ways. They laid the foundation for later work by Hinton.

Infographic showing how a neural network can store information as a kind of “landscape”.
Johan Jarnestad / The Royal Swedish Academy of Sciences[4]

Machines that can learn

Geoffrey Hinton (born 1947), sometimes called one of the “godfathers of AI[5]”, is a British-Canadian computer scientist who has made a number of important contributions to the field. In 2018, along with Yoshua Bengio and Yann LeCun, he was awarded the Turing Award (the highest honour in computer science) for his efforts to advance machine learning generally, and specifically a branch of it called deep learning.
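The store-and-recall behaviour of the Hopfield network described above can be sketched in a few lines. This is an illustrative toy, not Hopfield’s original formulation: patterns are stored with a simple Hebbian rule, and a corrupted pattern is recovered by repeated sign updates:

```python
import numpy as np

# Toy Hopfield network: store bipolar (+1/-1) patterns, recall by association.
def train(patterns):
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)      # Hebbian rule: strengthen co-active links
    np.fill_diagonal(W, 0)       # no self-connections
    return W / len(patterns)

def recall(W, state, steps=10):
    state = state.copy()
    for _ in range(steps):
        new = np.sign(W @ state) # each neuron takes the sign of its input sum
        new[new == 0] = 1
        if np.array_equal(new, state):
            break                # converged to a stored "memory"
        state = new
    return state

pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train(pattern[None, :])
noisy = pattern.copy()
noisy[:2] *= -1                  # corrupt two entries
print(recall(W, noisy))          # recovers the stored pattern
```

Prompting the network with a similar (noisy) pattern pulls the state back to the memorised one, which is the associative recall the prose describes.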
The Nobel Prize in Physics, however, is specifically for his work with Terrence Sejnowski and other colleagues in 1984, developing Boltzmann machines[6]. These are an extension of the Hopfield network that demonstrated the idea of machine learning – a system that lets a computer learn not from a programmer, but from examples of data.

Drawing from ideas in the energy dynamics of statistical physics, Hinton showed how this early generative computer model could learn to store data over time by being shown examples of things to remember.

Infographic showing different types of neural network.
Johan Jarnestad / The Royal Swedish Academy of Sciences[7]

The Boltzmann machine, like the Hopfield network before it, did not have immediate practical applications. However, a modified form (called the restricted Boltzmann machine) was useful in some applied problems. More important was the conceptual breakthrough that an artificial neural network could learn from data.

Hinton continued to develop this idea. He later published influential papers on backpropagation[8] (the learning process used in modern machine learning systems) and convolutional neural networks[9] (the main type of neural network used today for AI systems that work with image and video data).

Why this prize, now?

Hopfield networks and Boltzmann machines seem whimsical compared to today’s feats of AI. Hopfield’s network contained only 30 neurons (he tried to make one with 100 nodes, but it was too much for the computing resources of the time), whereas modern systems such as ChatGPT can have millions. However, today’s Nobel Prize underscores just how important these early contributions were to the field.

While recent rapid progress in AI – familiar to most of us from generative AI systems such as ChatGPT – might seem like vindication for the early proponents of neural networks, Hinton at least has expressed concern.
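Returning briefly to the Boltzmann machine mentioned earlier: its link to the energy dynamics of statistical physics can be sketched with the stochastic neuron update at its core. Each unit switches on with a probability given by the logistic function of its input, so low-energy states are visited more often. The weights here are random placeholders, not learned:

```python
import numpy as np

# Toy Boltzmann-style stochastic update over binary units.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
n = 5
W = rng.normal(scale=0.5, size=(n, n))
W = (W + W.T) / 2                # connections are symmetric
np.fill_diagonal(W, 0)           # no self-connections
state = rng.integers(0, 2, size=n).astype(float)

def energy(state):
    return -0.5 * state @ W @ state   # the physics-inspired energy function

for _ in range(100):             # Gibbs-style sweeps
    i = rng.integers(n)
    p_on = sigmoid(W[i] @ state) # probability unit i switches on
    state[i] = 1.0 if rng.random() < p_on else 0.0

print(energy(state))
```

Learning in a real Boltzmann machine adjusts W so that the low-energy states correspond to the training examples; the sketch above shows only the sampling half of that picture.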
In 2023, after quitting a decade-long stint at Google’s AI branch, he said he was scared by the rate of development[10] and joined the growing throng of voices calling for more proactive AI regulation.

After receiving the Nobel Prize, Hinton said[11] AI will be “like the Industrial Revolution but instead of our physical capabilities, it’s going to exceed our intellectual capabilities”. He also said he still worries that the consequences of his work might be “systems that are more intelligent than us that might eventually take control”.

References

1. 2024 Nobel Prize in Physics (www.nobelprize.org)
2. Johan Jarnestad / The Royal Swedish Academy of Sciences (www.nobelprize.org)
3. Hopfield networks (www.pnas.org)
4. Johan Jarnestad / The Royal Swedish Academy of Sciences (www.nobelprize.org)
5. godfathers of AI (www.forbes.com)
6. Boltzmann machines (www.cs.toronto.edu)
7. Johan Jarnestad / The Royal Swedish Academy of Sciences (www.nobelprize.org)
8. backpropagation (www.nature.com)
9. convolutional neural networks (dl.acm.org)
10. scared by the rate of development (www.nytimes.com)
11. said (www.bbc.com)

Read more https://theconversation.com/physics-nobel-awarded-to-neural-network-pioneers-who-laid-foundations-for-ai-240833
