Hopfield and Hinton: The 2024 Nobel Prize in Physics

December 3, 2024
Science Magazine

Networks. Artificial intelligence. Machine learning. 

At first glance, these may seem like the keywords of a computer science research paper or an explanation of ChatGPT’s inner workings. But these words actually come from the press release for the 2024 Nobel Prize in Physics, which honors foundational discoveries and inventions that enable machine learning with artificial neural networks.


Above: John J. Hopfield (left) and Geoffrey Hinton (right), winners of the 2024 Nobel Prize in Physics. Image courtesy of the Nobel Prize Foundation.

Hopfield Networks

To understand what the Nobel Committee describes as a “completely new way for us to use computers,” we need to step back more than four decades, to years before the creation of current mainstream Large Language Models (LLMs) like ChatGPT and Claude.

In 1982, three years before the release of Windows 1.0, physicist Dr. John Hopfield was toying with the idea of associative memory: a way of storing information so that a complete memory, such as an image, can be retrieved from a partial or related cue. In his paper published later that year, Dr. Hopfield described a method of associative memory inspired by the interactions among simple elements of physical systems. Since natural phenomena like stable magnetic orientations in a magnetic system or vortex patterns in fluid flow emerge from such simple components, Hopfield wondered whether a system of simple interacting neurons could have useful “computational” correlates.

To answer this question, Hopfield developed what is now known as a Hopfield network. He took inspiration from a physical system called a spin glass, a disordered magnetic material whose behavior arises from its atomic spins, where each atom can be assigned one of two binary states. Like these spins, a Hopfield network contains a layer of interconnected neurons (or nodes) that are individually assigned binary states, with each neuron’s state influenced by the states of the neurons connected to it.
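To make the spin-glass analogy concrete, here is a minimal sketch in Python (our illustration, not code from Hopfield’s paper), assuming binary states of +1 and -1: the network has an energy that drops when connected neurons agree, and each neuron flips to match the “field” its neighbors exert on it.

```python
import numpy as np

def energy(weights, state):
    # Spin-glass-style energy: E = -1/2 * sum_ij w_ij * s_i * s_j.
    # Lower energy means the states agree better with the connections.
    return -0.5 * state @ weights @ state

def update_one(weights, state, rng):
    # Pick one neuron at random and set it to agree with the
    # weighted "field" exerted by the neurons connected to it.
    i = rng.integers(len(state))
    field = weights[i] @ state
    state[i] = 1 if field >= 0 else -1
    return state
```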

When an image is stored in a Hopfield network, the strengths of the connections between these nodes are adjusted so that the stored image corresponds to a minimal-energy state of the network. When a distorted or incomplete image is then fed in, the network repeatedly updates its nodes to lower this energy, settling into the saved image most like the input. Over the years, the Hopfield network was improved so that later networks could differentiate between multiple stored images and recognize slight subtleties between inputted images.
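Continuing the sketch above, storage and recall might look like the following; the Hebbian storage rule shown here is one standard choice, and the names store and recall are our own for illustration.

```python
def store(patterns):
    # Hebbian rule: strengthen connections between nodes that tend to
    # be active together, so each stored pattern becomes an energy minimum.
    n = patterns.shape[1]
    weights = (patterns.T @ patterns) / n
    np.fill_diagonal(weights, 0.0)  # no self-connections
    return weights

def recall(weights, probe, steps=1000, seed=0):
    # Repeated single-neuron updates only ever lower the energy,
    # so the state slides into the nearest stored pattern.
    rng = np.random.default_rng(seed)
    state = probe.copy()
    for _ in range(steps):
        state = update_one(weights, state, rng)
    return state

rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(2, 64))    # two stored "images"
weights = store(patterns)
probe = patterns[0].copy()
probe[:16] *= -1                                # corrupt a quarter of the pixels
print(np.array_equal(recall(weights, probe), patterns[0]))  # typically True
```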

Above: A simplified visualization of a Hopfield network as an energy landscape. Image courtesy of Nobel Prize Foundation.

The Boltzmann Machine

A few years after Hopfield’s invention, computer scientist Dr. Geoffrey Hinton began to build on simple Hopfield networks by making each node probabilistic. Borrowing ideas from the Boltzmann distribution of statistical mechanics, which assigns probabilities to a system’s states according to their energies, he created a tool that could classify images and generate new images based on the patterns on which it was trained. It was named the Boltzmann Machine, as a nod to the physics it was based on. Later developments led to the creation of restricted Boltzmann Machines, which improved performance by eliminating the connections between nodes within the same layer. Given the significance of his work in kick-starting the deep learning revolution, Hinton is now often referred to as the Godfather of Artificial Intelligence.
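As an illustration of this probabilistic twist, one sampling step in a restricted Boltzmann machine could look like the sketch below (our own simplified example, not Hinton’s code; the variable names and layer sizes are hypothetical).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(visible, weights, vis_bias, hid_bias, rng):
    # Unlike a Hopfield neuron's hard threshold, each node here switches
    # on with a probability that follows the Boltzmann distribution.
    p_hidden = sigmoid(visible @ weights + hid_bias)
    hidden = (rng.random(p_hidden.shape) < p_hidden).astype(float)
    # "Restricted" means no connections within a layer, so all visible
    # nodes can be resampled in one parallel step given the hidden ones.
    p_visible = sigmoid(hidden @ weights.T + vis_bias)
    visible = (rng.random(p_visible.shape) < p_visible).astype(float)
    return visible, hidden

# Hypothetical sizes: 6 visible nodes, 3 hidden nodes.
rng = np.random.default_rng(0)
weights = rng.normal(scale=0.1, size=(6, 3))
visible = rng.integers(0, 2, size=6).astype(float)
visible, hidden = gibbs_step(visible, weights, np.zeros(6), np.zeros(3), rng)
```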

Above: Different types of networks. Image courtesy of Nobel Prize Foundation.

An Enduring Legacy

The work of Hopfield and Hinton has paved the way for the exploration of neural networks in academia, a path scientists had previously considered a dead end. Researchers have since built new network architectures and training methods, including backpropagation, a technique now essential to training artificial neural networks. Many scientific fields have found incredible value in Hopfield networks. For example, biologists have implemented this type of network to investigate how brain neurons work together in memory and navigation. Even the LLMs that now serve as the basis for GPT technologies used worldwide today can trace their roots back to Hopfield networks and Boltzmann Machines.
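For readers curious about what backpropagation involves, here is a deliberately tiny sketch (our own example, not from the laureates’ work): a single weight is trained by pushing the error’s gradient back through the chain rule.

```python
import numpy as np

w, x, target, lr = 0.5, 2.0, 1.0, 0.1
for _ in range(100):
    y = np.tanh(w * x)                       # forward pass
    loss = 0.5 * (y - target) ** 2           # squared error
    grad = (y - target) * (1 - y ** 2) * x   # chain rule: dL/dw
    w -= lr * grad                           # gradient-descent update
print(f"final loss: {loss:.6f}")             # approaches zero
```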

Controversy

Unfortunately, the announcement of the 2024 Nobel Prize in Physics was met with mixed feelings. On online forums and news sites, many researchers and physics enthusiasts questioned these networks’ relevance to physics. “Even if there's inspiration from physics, they're not developing a new theory in physics or solving a longstanding problem in physics,” said Dr. Noah Giansiracusa, an associate professor of mathematics at Bentley University, in an interview with CBC News.

Nobel Prizes are awarded in only six fields: physiology or medicine, physics, chemistry, literature, peace, and economics. “[T]he Nobel Prize committee doesn't want to miss out on this AI stuff… both [works] are dubious, but nonetheless worthy of a Nobel Prize in terms of the science they've done. So how else are you going to reward them?” asked Professor Dame Wendy Hall, a computer scientist and artificial intelligence (AI) advisor to the United Nations, in an interview with Reuters.

All things considered, Hopfield and Hinton have left a significant mark on the field of physics. For example, the discovery of the Higgs boson would not have been possible without machine learning, which analyzed data from millions of collisions and found the patterns identifying this elementary particle. In its published scientific background to this year’s Nobel Prize, the Nobel Committee cited numerous other applications of artificial neural networks, including the IceCube neutrino detector project that produced the first neutrino image of the Milky Way.

Therefore, this year’s Nobel Prize in Physics recognizes not only the scientific achievements of neural networks but also the fundamental role that physics plays in shaping the world today.

Above: An artistic depiction of the Milky Way as seen through a neutrino “lens,” an image made possible by analyses built on the foundations Hopfield and Hinton laid. Image courtesy of IceCube Collaboration/U.S. National Science Foundation (Lily Le & Shawn Johnson)/ESO (S. Brunier).
