Brain-like chips are boosting computing and fighting cybercrime

The human brain is more powerful and energy-efficient than any computer. Scientists are mimicking how it works to produce better computer chips and help process the ever-increasing amount of data generated every day.
By Tom Cassauwers
To prevent smart home devices from being hacked, researchers are developing ultra-fast, energy-efficient brain-like chips that can detect threats on our devices in real time.
From smart refrigerators and TVs to toothbrushes connected to the internet, more and more home gadgets are now part of the Internet of Things. This makes it easier to analyze usage data or install remote updates. But this is also a security risk.
These smart devices are often hacked to create so-called botnets – networks of hijacked devices that can be used to launch large-scale cyberattacks.
Edge computing
One way to tackle this problem is to collect all the data from these devices and send it to a data center, where AI algorithms scan millions of connected devices for suspicious activity. But this takes time and requires transmitting large amounts of data.
That’s why scientists want to be able to do these calculations locally – on the refrigerator or on the toothbrush itself.
However, edge computing – the concept of performing these calculations locally, at the edge of the network – faces its own challenges. Complex calculations must be done quickly on small chips that do not consume too much power.
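The cloud-versus-edge trade-off can be sketched with a toy latency model. All the numbers below are illustrative assumptions, not measurements from the Neuropuls project: on-device inference is slower per operation than a data center, but it skips the network round trip entirely.

```python
# Toy latency model contrasting cloud and edge processing for an
# IoT security check. All figures are illustrative assumptions.

CLOUD_ROUND_TRIP_MS = 80.0   # network transit to a data center and back
CLOUD_INFERENCE_MS = 5.0     # inference on powerful data-center hardware
EDGE_INFERENCE_MS = 12.0     # inference on a small on-device chip

def cloud_latency_ms(readings: int, ms_per_reading: float = 0.01) -> float:
    """Time to upload sensor readings to the cloud and get a verdict back."""
    upload = readings * ms_per_reading
    return upload + CLOUD_ROUND_TRIP_MS + CLOUD_INFERENCE_MS

def edge_latency_ms() -> float:
    """Time when the verdict is computed on the device itself."""
    return EDGE_INFERENCE_MS

if __name__ == "__main__":
    print(f"cloud: {cloud_latency_ms(10_000):.1f} ms")
    print(f"edge:  {edge_latency_ms():.1f} ms")
```

Even with these generous assumptions for the data center, the on-device path wins as soon as the volume of data to upload grows – which is exactly the situation the researchers describe.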
“If you are generating these amounts of data, it’s very demanding to process it on the fly,” said Dr. Matěj Hejda, a research scientist specializing in advanced computing and photonics. Hejda is part of an EU-funded program called Neuropuls, which is addressing the issue head-on.
Hejda and other researchers on the Neuropuls team are developing a small chip, or processor, that can perform very fast AI computations while consuming very little energy.
“If there is a cyberattack, you cannot afford the delay. We rely on AI to make quick decisions based on large amounts of data. That’s what our chip design is for.”
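A minimal sketch of the kind of quick, data-driven decision described above: flag traffic that deviates sharply from the recent baseline. This uses a simple z-score test purely for illustration – the Neuropuls chips are designed to run far more capable AI models.

```python
# Minimal on-device anomaly check for network traffic, assuming a
# simple z-score test (illustrative only; not the Neuropuls method).

from statistics import mean, stdev

def is_suspicious(history: list[float], current: float,
                  threshold: float = 3.0) -> bool:
    """Flag a traffic sample that deviates strongly from recent history."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > threshold

# Normal traffic hovers around 100 packets/s; a botnet-style flood
# stands out immediately.
baseline = [98.0, 101.0, 99.0, 100.0, 102.0, 97.0, 103.0]
print(is_suspicious(baseline, 100.0))   # typical sample
print(is_suspicious(baseline, 5000.0))  # flood-like burst
```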
Brain power
Their innovations are inspired by the human brain, which can perform complex tasks far more efficiently than today’s conventional computers. By basing key functions on neural-style processing, the team hopes to deliver intelligent low-power computing for a range of real-world applications.
“The circuits mimic the behavior of the brain,” said Dr. Fabio Pavanello, a researcher at France’s national scientific research center CNRS, based at the Center for Radio-Frequencies, Optics and Micro-nanoelectronics of the Alps (CROMA). Pavanello coordinates the Neuropuls research.
This new blend of neuroscience and high tech is called neuromorphic computing, and it is quickly gaining traction.
“There are many ways to do this,” Pavanello said. “We chose photonics, which means we use light beams instead of electrical signals to perform the calculations.”
Merge memory and processing
Part of the research takes place at the Hewlett Packard Enterprise labs in Belgium, where Hejda works. Researchers there are trying to solve one of the bottlenecks of modern AI computing: memory.
“We have a way to get around this barrier,” said Pavanello. In a conventional computer, memory is separate from the central processing unit where the computing takes place. The processor performs the calculations, while the data used in those calculations is stored in the memory unit.
This data must be continuously shuttled between memory and processor through dedicated circuits. That creates a bottleneck for AI, because the connection between processor and memory cannot handle such large data flows.
This bottleneck can lead to slower computing and higher energy use. But researchers may have found a solution.
“Our goal is to put memory and computation in the same place,” Hejda said. “That’s also how it’s done in our brain, by the way. Memory and thinking are essentially intertwined.”
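The bottleneck can be made concrete with a back-of-the-envelope count of bytes crossing the memory bus for a matrix-vector product, the core operation in neural networks. The accounting below is a simplified illustration, not a model of any specific chip: keeping the weights in place removes the dominant term entirely.

```python
# Toy accounting of data movement for an n-by-n matrix-vector product,
# comparing a conventional (von Neumann) design with a compute-in-memory
# design. Byte counts are simplified assumptions for illustration.

def von_neumann_traffic_bytes(n: int, word: int = 4) -> int:
    """Every weight and input crosses the memory bus once per use."""
    weights = n * n * word   # matrix fetched from memory
    inputs = n * word        # input vector fetched from memory
    outputs = n * word       # results written back
    return weights + inputs + outputs

def in_memory_traffic_bytes(n: int, word: int = 4) -> int:
    """Weights stay where the computing happens; only inputs and outputs move."""
    return n * word + n * word

n = 1024
print(von_neumann_traffic_bytes(n))  # dominated by the n*n weight matrix
print(in_memory_traffic_bytes(n))    # grows only linearly with n
```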
Light waves
The other innovation in the Neuropuls chips is ultra-low-power photonic computing. Instead of using electrical signals for calculations, the chips pass light through microscopic pathways called waveguides.
Using light offers several advantages: minimal signal loss, ultra-low latency – the delay between sending and receiving data – and high data rates.
“It is also easier to do many parallel calculations with different colors of light,” Pavanello said.
“With these systems, you can have more sensors and collect more data. This means we can make smarter, better-informed decisions at a lower energy cost.”
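The parallelism Pavanello describes – different colors of light carrying independent signals through the same waveguide – can be sketched in software. The wavelength labels below are illustrative, and the dictionary simply stands in for channels that a photonic chip would process simultaneously.

```python
# Sketch of wavelength-division parallelism: each "color" of light
# carries an independent signal through the same waveguide, so the same
# weighted-sum operation runs on all channels at once. Wavelength
# labels are illustrative.

def combine_channels(signals: dict[str, list[float]],
                     weights: list[float]) -> dict[str, float]:
    """Apply one weighted sum to every wavelength channel in parallel."""
    return {
        color: sum(w * x for w, x in zip(weights, samples))
        for color, samples in signals.items()
    }

# Three independent computations share one waveguide, one per wavelength.
signals = {
    "1550nm": [1.0, 2.0, 3.0],
    "1551nm": [0.5, 0.5, 0.5],
    "1552nm": [3.0, 0.0, 1.0],
}
result = combine_channels(signals, weights=[0.2, 0.3, 0.5])
print(result)
```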
Another advantage of photonic technology is that it allows a stronger security shield to be built around such chips, better protecting their operation and the data they process. “This is a critical requirement for their secure use in systems and networks,” Pavanello added.
Improve self-driving cars
The Neuropuls research team plans to test the new chips in practical applications, such as detecting intrusions into computer networks. But they also want to use them in other real-world situations.
For example, they could speed up the response times of self-driving cars. When a vehicle needs to brake or swerve suddenly, it cannot wait for a remote data center to process the information and respond – everything must happen on board, instantly.
The photonic architecture used in Neuropuls provides high bandwidth and low latency, allowing automotive software to make real-time decisions and improve road safety.
These chips can also be used in traffic cameras and sensors, helping to optimize urban mobility, or in wearable health devices that monitor vital signs and send real-time alerts when problems arise.
Rapid progress
Partners for the project include the French Alternative Energy and Atomic Energy Commission, the Barcelona Supercomputing Centre, and leading universities from Italy, Belgium, Portugal, Germany and Greece.
The researchers aim to finalize and test their new chip design by 2027. Even so, it may take some time before brain-like chips reach our devices, because they need to be made ready for large-scale production.
“Our approach is very scalable because it relies on standard microchip technology, but it will realistically take several years to achieve widespread use,” Pavanello said.
That said, neuromorphic and photonic chips are already the latest technology trend. Large AI chip companies such as NVIDIA are investing in integrated photonics. For Hejda, this shows the technology is on the cusp of wider acceptance.
“It’s obvious that the biggest players in the market think photonics is a technology they need to look at,” he said. “This bodes well for accelerating the path to real-life applications.”
The research in this article is funded by the EU’s Horizon Program. The views of respondents do not necessarily reflect the views of the European Commission.
This article was originally published in Horizon, the EU Research and Innovation Magazine.