First published 2024
The fascination with artificial intelligence (AI) stems from its ability to handle massive volumes of data with superhuman efficiency. Traditional AI systems depend on conventional computers running complex algorithms that simulate artificial neural networks in software, and these systems consume significant energy, particularly when processing real-time data. To address this, a novel approach to machine intelligence is being pursued: shifting from software-based artificial neural networks to more efficient physical neural networks realised in hardware, specifically using silver nanowires.
Silver nanowires, only a few nanometres in diameter, offer a more efficient alternative to conventional graphics processing units (GPUs) and neural chips. Their small size allows them to form densely packed, neuron-like networks, enhancing both the speed and the complexity of information processing, and makes them well suited to integration into compact devices such as smartphones and wearables. Their flexibility adds to their appeal, as they can be adapted to different configurations. Electrical signals also propagate very quickly along the highly conductive silver wires, and that same high conductivity allows the networks to operate at lower voltages, reducing power consumption. Moreover, their ability to process many signals in parallel means they can handle more information in less time, enhancing their suitability for a range of AI applications.
While the advancements in using silver nanowires for AI are promising, they are accompanied by several challenges. Their high cost limits accessibility, particularly for smaller firms and startups, and the nanowires’ limited availability complicates their integration into a wide range of products. Additionally, the fragility of silver nanowires may compromise their durability, requiring careful handling to prevent damage and making them potentially less robust than traditional GPUs and neural chips. Furthermore, despite their rapid data processing capabilities, silver nanowires may not yet rival the performance of GPUs in high-performance computing or be as efficient in handling large-scale data processing.
In contrast, the field of neuromorphic computing, which aims to replicate the complex neuronal topology of the brain using nanomaterials, is making significant strides. Networks composed of silver nanowires and nanoparticles are particularly noteworthy for their resistive switching properties, akin to memristors, which give the networks adaptability and plasticity. A prime example is atomic switch networks (ASNs) built from Ag2S junctions, in which dendritic Ag nanowires form interconnected atomic switches, effectively emulating the dense connectivity found in biological neurons. These networks have shown potential in various natural computing paradigms, including reservoir computing, highlighting the diverse applications of these innovative neural network architectures.
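The resistive switching behaviour described above can be pictured with a toy model: a junction whose conductance is driven upward while the applied voltage exceeds a threshold, and relaxes back towards a resting value otherwise. The following sketch is purely illustrative; the threshold, rates, and conductance bounds are invented for the example and are not taken from any of the studies discussed.

```python
# Illustrative memristive junction: conductance grows when the applied
# voltage exceeds a threshold (filament formation) and relaxes toward a
# resting value otherwise. All parameters are hypothetical.

def step(g, v, *, g_min=0.01, g_max=1.0, v_th=0.5, k_pot=0.2, k_dec=0.05):
    """Advance the junction conductance g by one time step under voltage v."""
    if abs(v) > v_th:
        g += k_pot * (g_max - g)   # potentiation toward g_max
    else:
        g -= k_dec * (g - g_min)   # decay toward resting conductance
    return g

g = 0.01
trace = []
for t in range(30):
    v = 1.0 if t < 10 else 0.0     # drive for 10 steps, then release
    g = step(g, v)
    trace.append(g)
```

Driving the junction potentiates it (the conductance trace rises), and removing the drive lets it decay again, which is the memristor-like plasticity these networks exploit.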
Further explorations in creating neuromorphic networks have involved self-assembled networks of nanowires or nanoparticles, such as those formed from metal or metal-oxide nanoparticles based on gold or tin. These networks display neuromorphic behaviour owing to their resistive switches and show the recurrent dynamics crucial for neuromorphic applications. Such advancements in the field of AI, particularly with the use of silver nanowires, point to a future where computing not only becomes more efficient but also more closely emulates the complex processes of the human brain. These developments indicate the potential for revolutionary changes in how data is processed and learned, paving the way for more advanced and energy-efficient AI systems.
A recent study in neuromorphic computing demonstrates that neural networks of silver nanowires can learn to recognise handwritten numbers and memorise strings of digits; the findings were published in Nature Communications (2023) by a collaboration between researchers at the University of Sydney and the University of California, Los Angeles. The team used nanotechnology to create networks of silver nanowires, each about one thousandth the width of a human hair. These networks form randomly, resembling the brain’s network of neurons. External electrical signals prompt changes at the intersections of nanowires, mimicking the function of biological synapses, and with tens of thousands of these synapse-like junctions the networks can efficiently process and transmit information.
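The random formation of such a network can be sketched with a simple geometric model: drop line segments ("wires") at random into a unit square and treat every crossing as a potential synapse-like junction. The wire count and wire length below are arbitrary choices for illustration, not parameters from the study.

```python
import math
import random

random.seed(0)

def random_wire(length=0.3):
    """A wire as a random line segment: random start point and angle."""
    x, y = random.random(), random.random()
    ang = random.uniform(0, 2 * math.pi)
    return (x, y, x + length * math.cos(ang), y + length * math.sin(ang))

def crossing(w1, w2):
    """Standard 2-D segment intersection test."""
    (x1, y1, x2, y2), (x3, y3, x4, y4) = w1, w2
    d = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if d == 0:                       # parallel wires never cross
        return False
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / d
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / d
    return 0 <= t <= 1 and 0 <= u <= 1

wires = [random_wire() for _ in range(200)]
junctions = sum(crossing(a, b) for i, a in enumerate(wires)
                for b in wires[i + 1:])
```

Even a few hundred short wires produce a dense web of crossings; in the physical networks it is these junctions, not the wires themselves, that provide the synapse-like switching behaviour.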
A significant aspect of this research is its demonstration of real-time, online machine learning in nanowire networks. Unlike conventional AI systems, which process data in batches, this approach handles a continuous data stream, allowing the system to learn and adapt as each sample arrives. This “on the fly” learning reduces the need for repetitive data processing and extensive memory requirements, resulting in substantial energy savings and increased efficiency.
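The difference from batch training can be illustrated with a minimal online learner: a one-weight classifier that updates immediately after every sample in a stream and never stores a batch. The data, learning rate, and perceptron-style update rule here are invented for the sketch; they are not how the nanowire hardware itself learns.

```python
# A stream of labelled samples: inputs below 0.5 are class 0, above are class 1.
stream = [(0.2, 0), (0.8, 1)] * 50

def predict(w, b, x):
    return 1 if w * x + b > 0 else 0

# Online learning: one update per arriving sample, no batch, no replay.
w, b, lr = 0.0, 0.0, 0.1
for x, y in stream:
    err = y - predict(w, b, x)   # 0 when correct, so correct samples cost nothing
    w += lr * err * x
    b += lr * err

accuracy = sum(predict(w, b, x) == y for x, y in stream) / len(stream)
```

After a handful of mistakes early in the stream the weights stabilise and every later sample is classified correctly, so no pass over a stored dataset is ever needed.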
The team tested the nanowire network’s learning and memory capabilities using the Modified National Institute of Standards and Technology (MNIST) database of handwritten digits. The network successfully learned and improved its pattern recognition with each new digit, showcasing real-time learning. Additionally, the network was tested on memory tasks involving digit patterns, demonstrating an aptitude for remembering sequences, akin to recalling a phone number.
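The digit-memory result can be given some intuition with a toy fading-memory model: a single leaky state to which each new digit is added while older ones decay, so the same digits presented in a different order leave a different trace. The leak rate and the three-digit sequences are arbitrary illustrations, not the mechanism or data used in the study.

```python
def trace(seq, leak=0.5):
    """Fading-memory trace of a digit sequence: recent digits weigh more."""
    h = 0.0
    for d in seq:
        h = leak * h + d   # old memory decays, new digit is superimposed
    return h

# Distinct orderings of the same digits leave distinct traces,
# so a stored trace identifies which sequence was presented.
stored = {trace(s): s for s in ([1, 2, 3], [3, 2, 1], [2, 1, 3])}
recalled = stored[trace([3, 2, 1])]
```

A real nanowire network holds its memory in the distributed conductance states of thousands of junctions rather than a single scalar, but the principle, that history is encoded in the present state, is the same.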
These experiments highlight the potential of neuromorphic nanowire networks to emulate brain-like learning and memory processes. This research represents just the beginning of unlocking the full capabilities of such networks, indicating a promising future for AI development. The implications of these findings are far-reaching. Nanowire network (NWN) devices could be used in areas such as natural language processing and image analysis, making the most of their ability to learn and remember dynamic sequences. The study points to the possibility of NWNs contributing to new types of computational applications, moving beyond the traditional limits of the Turing machine and grounded in real-world physical systems.
In conclusion, the exploration of silver nanowires in artificial intelligence marks a significant shift towards more efficient, brain-like computing. These nanowires, mere nanometres in diameter, present a highly efficient alternative to traditional GPUs and neural chips, forming densely packed neuron-like networks that excel in processing speed and complexity. Their adaptability, low operating voltages, and suitability for integration into compact devices highlight their potential across a wide range of AI applications.
However, challenges such as high cost, limited availability, and fragility temper the widespread adoption of silver nanowires, along with their current limitations in matching the performance of GPUs in certain high-demand computing tasks. Despite these hurdles, the advancements in neuromorphic computing using silver nanowires and other nanomaterials are promising. Networks like Atomic Switch Networks (ASNs) demonstrate the potential of these materials in replicating the complex connectivity and functionality of biological neurons, paving the way for breakthroughs in natural computing paradigms.
The 2023 study showcasing the online learning and memory capabilities of silver nanowire networks, especially in tasks like recognising handwritten numbers and memorising digit sequences, represents a leap forward in AI research. These networks, capable of processing data streams in real time, offer a more energy-efficient and dynamic approach to machine learning, differing fundamentally from traditional batch-based methods. This approach not only saves energy but also mimics the human brain’s ability to learn and recall quickly and efficiently.
As the field of AI continues to evolve, silver nanowires and neuromorphic networks stand at the forefront of research, potentially revolutionising how data is processed and learned. Their application in areas such as natural language processing and image analysis could harness their unique learning and memory abilities. This research, still in its early stages, opens the door to new computational applications that go beyond conventional paradigms, drawing inspiration from the physical world and the human brain. The future of AI development, influenced by these innovations, holds immense promise for more advanced, efficient, and brain-like artificial intelligence systems.
Links
https://nanografi.com/blog/silver-nanowires-applications-nanografi-blog/
https://www.techtarget.com/searchenterpriseai/definition/neuromorphic-computing
https://www.nature.com/articles/s41467-023-42470-5