Physics and mathematics: The building blocks for Jamaica’s AI future
Dear Editor,
In October 2024, John Hopfield and Geoffrey Hinton joined the esteemed ranks of Nobel laureates, winning the Nobel Prize in Physics.
Their groundbreaking contributions to the development of artificial intelligence (AI) are not only a testament to their brilliance, but also a celebration of the deep connection between mathematics, physics, and AI.
This victory highlights the urgent need for countries like Jamaica to strengthen their foundations in mathematics and physics if they aim to contribute meaningfully to the ever-expanding world of AI.
Hopfield and Hinton are titans in the field of AI. Hopfield, an American physicist at Princeton University, laid the groundwork for the neural networks that underpin AI with his associative memory model, which could store and reconstruct patterns in data. His 1982 paper became the cornerstone upon which future AI developments would build.
Geoffrey Hinton, often referred to as the godfather of AI, advanced the field further by helping to develop and popularise backpropagation, a technique essential for training deep learning models to learn from their errors and improve over time. Together, they revolutionised the way machines learn, opening the door to applications such as facial recognition and language translation.
What is remarkable about these men is that their work, now celebrated globally, was not always held in such high regard. In fact, for decades, both Hopfield and Hinton were seen as eccentric visionaries, working in an obscure field that few believed had any practical value.
The early 1980s, when neural networks were still in their infancy, were rife with scepticism. Hinton, who went on to win the Turing Award, recounted how many in the scientific community thought their efforts were “nonsense” and a waste of time. Yet their perseverance paid off, and today AI is at the forefront of technological advancement.
But what lies beneath these AI breakthroughs is a sophisticated interplay between mathematics and physics. Neural networks, the backbone of AI, are modelled after the neurons in the human brain. The concept is rooted in physics, where the idea of interconnected nodes, analogous to neurons, stems from theories of dynamical systems and energy minimisation.
Hopfield’s early work on associative memory, for example, was based on physical principles of energy minimisation, in which patterns could be stored and recalled by moving the system to its lowest energy state. The mathematics behind these ideas is equally essential, as complex algorithms govern the behaviour of neural networks, helping them recognise patterns, learn from data, and make predictions.
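For readers who code, a few lines of Python make this concrete. The sketch below is purely illustrative, assuming an eight-unit network, a single stored pattern, and a simple update schedule rather than Hopfield's original experiment:

```python
import numpy as np

# A toy Hopfield-style associative memory. The pattern, network size, and
# number of sweeps are illustrative choices, not Hopfield's original setup.

def train(patterns):
    # Hebbian outer-product rule: units that fire together wire together.
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)           # no self-connections
    return W / len(patterns)

def energy(W, s):
    # Hopfield's energy function, E = -1/2 * s^T W s; recall moves it downhill.
    return -0.5 * s @ W @ s

def recall(W, s, sweeps=5):
    s = s.copy()
    for _ in range(sweeps):
        for i in range(len(s)):      # asynchronous updates, one unit at a time
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1]])    # one pattern of +1/-1 values
W = train(stored)

noisy = stored[0].copy()
noisy[:2] *= -1                      # corrupt two of the eight units
recovered = recall(W, noisy)

print(np.array_equal(recovered, stored[0]))          # True: the memory is restored
print(energy(W, noisy) > energy(W, recovered))       # True: recall lowered the energy
```

Each asynchronous update can only hold the energy steady or lower it, which is why the corrupted input settles back into the stored memory, exactly the physical picture of a system relaxing to its lowest energy state.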
At the core of Hinton’s backpropagation algorithm is calculus — a mathematical tool that allows the AI system to adjust its internal parameters by calculating gradients and minimising errors. This iterative process is akin to how students learn from their mistakes, constantly refining their answers until they arrive at the correct solution. Without this mathematical foundation, it would be impossible to teach machines how to “think” or “learn”.
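A short sketch can show the same calculus at work. Here a single artificial neuron with one weight and one bias is trained by gradient descent; the data, learning rate, and number of steps are arbitrary choices made for illustration, and backpropagation proper applies this same chain rule across many layers at once:

```python
# Gradient descent on a single artificial neuron with one input: the simplest
# instance of the calculus behind backpropagation.

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]            # targets drawn from y = 2x + 1

w, b, lr = 0.0, 0.0, 0.05            # weight, bias, learning rate
for step in range(500):
    dw = db = 0.0
    for x, y in zip(xs, ys):
        err = (w * x + b) - y        # prediction error on this example
        dw += 2 * err * x            # d(err^2)/dw, by the chain rule
        db += 2 * err                # d(err^2)/db
    w -= lr * dw / len(xs)           # step downhill along the gradient
    b -= lr * db / len(xs)

print(round(w, 2), round(b, 2))      # converges towards w = 2.0, b = 1.0
```

Each pass computes how much the error would change if the weight or bias were nudged, then nudges them the other way; hundreds of small corrections later, the neuron has learnt the rule behind the data.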
Thus, the development of AI is intrinsically connected to both mathematics and physics. The neural networks, algorithms, and systems that power AI applications such as ChatGPT from OpenAI or Claude from Anthropic are not built by mere trial and error. Instead, they are the products of years of research in mathematics, physics, and computer science. Tools like Python, TensorFlow, and PyTorch, while necessary for coding these applications, would be insufficient without a deep understanding of the underlying mathematics and physics.
Horatio Deer
horatiodeer2357@gmail.com