The dynamics of magnetic metamaterials offer a path to low-energy, next-gen computing
The public launch of OpenAI’s ChatGPT in November 2022 caused a media sensation and kicked off a rapid proliferation of similar Large Language Models (LLMs). However, the computing power needed to train and run these LLMs and other artificial intelligence (AI) systems is colossal, and the energy requirements are staggering. Training the GPT-3 model behind ChatGPT, for example, required an estimated 355 years of single-processor computing time and consumed around 284,000 kWh of energy [1]. This is one example of a task that the human brain handles far more efficiently than a conventional computer, and researchers are investigating whether more brain-like (neuromorphic) computing methods could prove more energy efficient. Physical reservoir computing is one such approach, using the natural, complex responses of materials to perform challenging computations.

Researchers from the University of Sheffield are investigating the use of magnetic metamaterials – materials structured at the nanoscale to exhibit complex, emergent properties – to perform such computations. In work recently published in Communications Physics, they have demonstrated that these systems can be tuned to achieve state-of-the-art performance on different types of computational task. Their results show that an array of interconnected magnetic nanorings is a promising architecture for neuromorphic computing systems.
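To give a flavour of what “reservoir computing” means in practice, the sketch below implements the generic scheme in software: an input signal drives a fixed, randomly connected non-linear system (the “reservoir”), its states are recorded, and only a simple linear readout is trained on them. This is a minimal, hypothetical illustration of the general idea, not the Sheffield group’s method; in physical reservoir computing the software reservoir is replaced by a real material, such as the nanoring arrays studied here, and none of the parameters below come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: reproduce a delayed, non-linear function of a random drive signal
# (a common style of reservoir-computing benchmark; chosen here for illustration).
T = 1000
u = rng.uniform(-1, 1, T)
target = np.roll(u, 2) ** 2          # delayed, non-linear target

# Fixed "reservoir": 100 randomly coupled non-linear nodes. In physical
# reservoir computing this role is played by the material itself; the tanh
# update below is just a software stand-in.
N = 100
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep the dynamics stable

states = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])   # reservoir state update
    states[t] = x

# Only the linear readout is trained (ridge regression); the reservoir itself
# is never adjusted, which is what keeps the training step cheap.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N),
                        states.T @ target)

prediction = states @ W_out
print("training NRMSE:",
      np.sqrt(np.mean((prediction - target) ** 2)) / np.std(target))
```

Because only the readout weights are trained, the heavy computational lifting is done by the reservoir’s own dynamics, which is why a physical material with rich, emergent responses can take on that role.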
Emergence Could Power More Brain-Like Computers
Anyone who has witnessed the majestic and mesmerising flight of a murmuration of starlings has no doubt wondered how a flock of birds can achieve such synchronised behaviour. This is an example of emergence, where the interactions of simple things lead to complex collective behaviours. But emergence doesn’t only occur in the natural world, and a group at the University of Sheffield is investigating how emergent behaviour can be engineered in magnetic materials when they are patterned to nanoscale dimensions.
Dr Tom Hayward, Senior Lecturer in Materials Physics at the University of Sheffield and an author of the paper, says:
Life is inherently emergent – with simple entities connecting together to give complex behaviours that a single element would not have. It’s exciting because we can take simple things – which hypothetically can be very energy efficient – and make them manifest the kind of complexity we see in the brain. Material computation relies on the fact that many materials that exhibit some form of memory can take an input and transform it into a different output – precisely the properties we need to perform computation. Our system connects a series of tiny magnetic rings into a big ensemble. One individual ring in isolation shows quite simple behaviours. But when we connect them, they interact with each other to give complex behaviours.
Magnets have a number of properties that make them interesting for these kinds of applications:
- Firstly, they are non-volatile, with inherent memory – if you stick a magnet to your fridge, it stays put.
- Brains (and brain-like computers) need to have non-linear responses, taking simple information and performing complicated transforms, and that’s something magnets are naturally good at.
- There are plenty of ways to make magnets change state and perform computations that use very little energy.
- And magnets are a well-established technology (used, for example, in hard drives and magnetoresistive random-access memory (MRAM)), so there are existing routes to technology integration.
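As a purely illustrative example of the first two properties listed above, the toy model below shows a single bistable element that only switches when its drive crosses a threshold: its output depends on the history of its inputs (non-volatile memory) and is strongly non-linear. This is a hypothetical sketch for intuition only, not a model of the nanoring arrays themselves.

```python
import numpy as np

def hysteretic_node(inputs, threshold=0.5, state=-1.0):
    """Toy bistable element: it switches only when the drive exceeds a
    threshold and otherwise remembers its last state (non-volatile),
    giving a strongly non-linear input-output relationship.
    Purely illustrative; not a model of the actual nanorings."""
    outputs = []
    for h in inputs:
        if h > threshold:
            state = 1.0
        elif h < -threshold:
            state = -1.0
        # |h| <= threshold: no change, the element keeps its memory
        outputs.append(state)
    return np.array(outputs)

# Sweep the drive up and then back down: the two branches differ, i.e. the
# output depends on history (hysteresis), not just on the current input.
drive = np.concatenate([np.linspace(-1, 1, 11), np.linspace(1, -1, 11)])
print(np.column_stack([drive.round(2), hysteretic_node(drive)]))
```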
XPEEM Highlights the Underlying Magnetic Dynamics
Key to this research is understanding what happens in these magnetic nanorings when they are connected together – how emergence alters the way they change magnetic states.
Read more on the Diamond website