Engineers at Northwestern University in the US have used a novel transistor design that not only helps miniaturise the devices but also makes artificial intelligence (AI) tasks 100 times more energy efficient, a press release has revealed.
The AI wave has swept the tech industry, and companies big and small are working to incorporate AI-powered features into their products. It has previously been reported how silicon-based chips have powered the rise of AI, with companies like Microsoft spending millions on such chips to build cloud-based infrastructure from scratch.
As more businesses jump on the AI bandwagon, the demand for such infrastructure will inevitably grow. However, Mark Hersam, a professor of Materials Science at Northwestern University, points out that this approach is energy-intensive since data is collected, sent to the cloud for analysis, and then the results are sent back to the user. Processing the data locally would be much more energy efficient.
Moving away from silicon
Before analysis, collected data needs to be sorted into various categories for the machine learning process to begin. Since each silicon transistor can perform only one step of the data processing task, the number of transistors needed to complete the job grows in proportion to the size of the data set.
Hersam's team decided to move away from silicon and used two-dimensional molybdenum disulfide and one-dimensional carbon nanotubes to make their mini transistors. The design of these new transistors was such that they could be reconfigured to work on different steps of the analysis.
"The integration of two disparate materials into one device allows us to strongly modulate the current flow with applied voltages, enabling dynamic reconfigurability," said Hersam in the press release.
Not only did this drastically reduce the number of transistors and the energy consumed, but it also helped miniaturise the analysis to such a degree that it could be integrated into a regular wearable device.
Advanced analysis but not in the cloud
The researchers used publicly available medical datasets to demonstrate the device's capability. They trained the AI to interpret electrocardiogram (ECG) data, a task that requires intensive training even for medical workers.
The device was then asked to classify 10,000 ECG samples into six commonly seen types of heartbeats: normal, atrial premature beat, premature ventricular contraction, paced beat, left bundle branch block beat, and right bundle branch block beat. It achieved this with 95 percent accuracy.
A task of such complexity would require at least 100 silicon transistors for computation, but the Northwestern researchers achieved the same result with just two transistors of their novel design.
Hersam also highlights how local processing of data protects patient privacy. "Every time data are passed around, it increases the likelihood of the data being stolen," he said. "If personal health data is processed locally – such as on your wrist in your watch – that presents a much lower security risk."
In the future, the team envisions that their devices will be incorporated into everyday wearables, powering real-time applications without sapping grid power.
The research findings were recently published in the journal Nature Electronics.