If Artificial Neural Networks Sleep Better, They Train Better

Washington [US], November 19 (ANI): Researchers report that having artificial neural networks mimic the sleep-like neural patterns of the human brain can reduce the risk of catastrophic forgetting in those networks, increasing their value across a range of research areas.

“The brain is very busy when we sleep, repeating what we have learned during the day,” said Maxim Bazhenov, PhD, Professor of Medicine and a sleep researcher at the University of California San Diego School of Medicine. “Sleep helps reorganize memories and presents them in the most efficient way.”

In earlier published research, Bazhenov and colleagues showed how sleep helps build rational memory, the ability to recall arbitrary or indirect associations between people, objects, or events. Sleep also protects old memories from being forgotten.

Artificial neural networks, which are modeled on the architecture of the human brain, underpin many technologies and systems, from fundamental science and medicine to social media and finance. In some ways they have achieved superhuman performance, such as computational speed, but they fall short in one crucial aspect: when artificial neural networks learn sequentially, new information overwrites previously learned information, a phenomenon known as catastrophic forgetting.
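Catastrophic forgetting is easy to reproduce in miniature. The sketch below is a toy illustration, not the study's setup: the two regression tasks and their target weights are invented, and the "network" is just a linear model trained by gradient descent. After the model learns task B, its error on task A, which it had mastered, collapses back to near-random levels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two toy regression "tasks": the same inputs must map to different targets.
X = rng.normal(size=(50, 4))
w_task_a = np.array([1.0, -2.0, 0.5, 3.0])   # hypothetical task-A weights
w_task_b = np.array([-1.5, 0.5, 2.0, -1.0])  # hypothetical task-B weights
y_a, y_b = X @ w_task_a, X @ w_task_b

def train(w, X, y, lr=0.05, steps=500):
    """Gradient descent on mean-squared error for a linear model y = X @ w."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(X)
        w = w - lr * grad
    return w

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

w = np.zeros(4)
w = train(w, X, y_a)           # learn task A
err_a_before = mse(w, X, y_a)  # near zero: task A is learned
w = train(w, X, y_b)           # then learn task B, with no access to task-A data
err_a_after = mse(w, X, y_a)   # large: task A has been overwritten

print(err_a_before, err_a_after)
```

The weights that solved task A are simply dragged to the task-B solution, because nothing in plain sequential training protects them.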

“In contrast, the human brain learns continuously and incorporates new data into existing knowledge,” said Bazhenov, “and it typically learns best when new training is interleaved with periods of sleep for memory consolidation.”

In the November 18, 2022 issue of PLOS Computational Biology, senior author Bazhenov and co-authors discuss how biological models can help mitigate the threat of catastrophic forgetting in artificial neural networks, boosting their utility across a broad range of research interests.

The researchers used spiking neural networks that artificially mimic natural neural systems: instead of information being communicated continuously, it is transmitted as discrete events (spikes) at specific times.
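To illustrate what "discrete events at specific times" means, here is a minimal leaky integrate-and-fire neuron, the standard textbook spiking unit. This is a sketch with arbitrary parameters, not the model used in the paper: the membrane voltage leaks toward rest, integrates its input, and emits a spike only at the instants it crosses threshold.

```python
def lif_spikes(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: voltage leaks with time constant tau,
    integrates the input, and fires a discrete spike on crossing v_thresh."""
    v = 0.0
    spike_times = []
    for t, i_in in enumerate(input_current):
        v += dt * (-v / tau + i_in)  # leak plus input integration
        if v >= v_thresh:            # threshold crossing -> spike event
            spike_times.append(t)
            v = v_reset              # reset after the spike
    return spike_times

# A constant drive produces a regular spike train; the information is
# carried by the spike times, not by a continuous value.
spikes = lif_spikes([0.08] * 200)
print(spikes)
```

Everything between spikes is invisible downstream; only the timing of the events is communicated, which is what distinguishes spiking networks from conventional ones.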

They found that when the spiking networks were trained on a new task, but with periodic off-line intervals that mimicked sleep, catastrophic forgetting was mitigated. Like the human brain, the study's authors said, "sleep" for the networks allowed them to replay old memories without explicitly using the old training data.
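The paper's spiking-network mechanism is more involved, but the flavor of interleaving new training with replay can be sketched with a simple stand-in technique, pseudo-rehearsal: during "sleep" phases the model labels random inputs with a frozen copy of its pre-sleep weights, so old knowledge is rehearsed without reusing any stored task-A data. Everything below (the tasks, weights, and linear model) is invented for illustration and is not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two toy tasks for one linear model y = X @ w (hypothetical target weights).
X = rng.normal(size=(50, 4))
w_a = np.array([1.0, -2.0, 0.5, 3.0])
w_b = np.array([-1.5, 0.5, 2.0, -1.0])
y_a, y_b = X @ w_a, X @ w_b

def sgd_step(w, X, y, lr=0.05):
    return w - lr * 2 * X.T @ (X @ w - y) / len(X)

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

# Phase 1: learn task A.
w = np.zeros(4)
for _ in range(500):
    w = sgd_step(w, X, y_a)
w_frozen = w.copy()  # the network "as it went to sleep"

# Phase 2a: plain sequential training on task B -> task A is overwritten.
w_seq = w.copy()
for _ in range(500):
    w_seq = sgd_step(w_seq, X, y_b)

# Phase 2b: training on B interleaved with "sleep" phases that replay
# self-generated patterns: random inputs labelled by the frozen weights,
# so no stored task-A data is ever reused.
w_sleep = w.copy()
for _ in range(500):
    w_sleep = sgd_step(w_sleep, X, y_b)
    X_dream = rng.normal(size=(50, 4))  # spontaneous "dream" activity
    y_dream = X_dream @ w_frozen        # replayed by the old weights
    w_sleep = sgd_step(w_sleep, X_dream, y_dream)

forgetting_seq = mse(w_seq, X, y_a)
forgetting_sleep = mse(w_sleep, X, y_a)
print(forgetting_seq, forgetting_sleep)
```

The interleaved run retains substantially more of task A than the purely sequential run, because the replay steps keep pulling the weights back toward a region that serves both tasks.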

Artificial neural networks learn more when they “sleep”

Depending on age, humans need between 7 and 13 hours of sleep every 24 hours. Much happens during this time: breathing, heart rate, and metabolism ebb and flow; hormone levels shift; the body relaxes. The brain, however, stays busy, reliving the lessons of the day and reorganizing memories efficiently, building rational memory (the capacity to recall arbitrary or indirect connections between people, objects, and events) while also protecting old memories from being lost.

“When we learn new information, our neurons activate in a particular order, and this strengthens the synapses between them. During sleep, the spiking patterns we learned while awake are repeated spontaneously. This is known as reactivation or replay,” Bazhenov explained.

“Synaptic plasticity, the capacity to be altered or molded, is still in place during sleep and it can further enhance synaptic weight patterns that represent the memory, helping to prevent forgetting or to enable transfer of knowledge from old to new tasks.”

When the researchers applied this approach to artificial neural network simulations, they found that it played an important role in preventing catastrophic forgetting. “It means that the networks were able to learn continually, much like humans and animals. Understanding how the human brain processes information during sleep could help to augment memory in humans; augmenting sleep rhythms can lead to better memory.”
