
AI Algorithms Are Slimming Down to Fit in Your Fridge


Artificial intelligence has made stunning strides, but it often needs absurd amounts of data and computing power to get there. Now some AI researchers are focused on making the technology as efficient as possible.

Last week researchers showed it is possible to squeeze a powerful AI vision algorithm onto a simple, low-power computer chip that can run for months on a battery. The trick could help bring more advanced AI capabilities, like image and voice recognition, to home appliances and wearable devices, along with medical gadgets and industrial sensors. It could also help keep data private and secure by reducing the need to send anything to the cloud.

“This result is quite exciting to us,” says Song Han, an assistant professor at MIT leading the project. Although the work is a lab experiment for now, it “can quickly move to real-world devices,” Han says.

Microcontrollers are relatively simple, low-cost, low-power computer chips found inside billions of products, including car engines, power tools, TV remotes, and medical implants.


The researchers essentially devised a way to pare down deep learning algorithms, large neural network programs that loosely mimic the way neurons connect and fire in the brain. Over the past decade, deep learning has propelled huge advances in AI, and it is the bedrock of the current AI boom.

Deep learning algorithms typically run on specialized computer chips built to handle the many parallel computations needed to train and run networks efficiently. Training the language model known as GPT-3, which is capable of generating cogent language when given a prompt, required the equivalent of cutting-edge AI chips running at full tilt for 355 years. Such demands have led to booming sales of GPUs, chips well suited to deep learning, as well as a growing number of AI-specific chips for smartphones and other gadgets.

There are two parts to the new research approach. First, the researchers use an algorithm to explore possible neural network architectures, looking for one that fits the computational constraints of the microcontroller. The other part is a compact, memory-efficient software library for running the network. The library is designed in concert with the network architecture, to eliminate redundancy and account for the lack of memory on a microcontroller. “What we do is like finding a needle in a haystack,” Han says.
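The published details of that search are sparse, so the Python sketch below shows only the general idea of constrained architecture search, not the team's actual system. It randomly samples tiny convolutional network configurations, estimates each one's flash and SRAM footprint with a crude proxy, and discards any candidate that exceeds the microcontroller's budget. The budget values and the footprint model are illustrative assumptions.

    # Hypothetical sketch: random search over tiny conv-net configurations,
    # keeping only candidates whose estimated footprint fits an MCU budget.
    import random

    FLASH_BUDGET = 1_000_000   # bytes of flash for weights (assumed 1 MB part)
    SRAM_BUDGET = 320_000      # bytes of SRAM for activations (assumed 320 kB)

    def estimate_footprint(widths, resolution):
        """Very rough proxy for weight and activation memory of a conv net."""
        kernel = 3
        weights = 0
        in_ch, res = 3, resolution
        peak_activation = res * res * in_ch  # input image, 1 byte per value
        for out_ch in widths:
            weights += kernel * kernel * in_ch * out_ch  # int8 weights, 1 byte each
            res = max(res // 2, 1)                       # stride-2 downsampling
            peak_activation = max(peak_activation, res * res * out_ch)
            in_ch = out_ch
        return weights, peak_activation

    def sample_candidate():
        """Randomly pick layer widths and an input resolution."""
        depth = random.randint(4, 8)
        widths = [random.choice([8, 16, 24, 32, 48, 64]) for _ in range(depth)]
        resolution = random.choice([96, 128, 160, 176])
        return widths, resolution

    feasible = []
    for _ in range(10_000):
        widths, resolution = sample_candidate()
        flash, sram = estimate_footprint(widths, resolution)
        if flash <= FLASH_BUDGET and sram <= SRAM_BUDGET:
            feasible.append((widths, resolution, flash, sram))

    # In a real system, each surviving candidate would be trained (or scored
    # with a cheap proxy) and the most accurate one kept.
    print(f"{len(feasible)} of 10000 sampled architectures fit the budget")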

The researchers created a computer vision algorithm capable of identifying 1,000 types of objects in images with 70 percent accuracy; the previous best low-power algorithms achieved only around 54 percent. The new approach also required just 21 percent of the memory and cut latency by 67 percent, compared with existing methods. The team showed similar performance for a deep learning algorithm that listens for a particular “wake word” in an audio feed. Han says further improvements should be possible by refining the methods used.
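The article does not spell out how those memory savings are achieved, but one standard ingredient in shrinking a vision model for a microcontroller is 8-bit quantization. The sketch below applies TensorFlow Lite's post-training integer quantization to a stand-in Keras model; the model and the calibration data are placeholders, and this is a generic technique rather than the researchers' actual pipeline.

    # Illustrative sketch (not the authors' pipeline): shrinking a small Keras
    # model with TensorFlow Lite's post-training int8 quantization.
    import numpy as np
    import tensorflow as tf

    # A deliberately tiny image classifier standing in for the real network.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(96, 96, 3)),
        tf.keras.layers.Conv2D(8, 3, strides=2, activation="relu"),
        tf.keras.layers.Conv2D(16, 3, strides=2, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    def representative_data():
        # Calibration samples; real code would yield preprocessed images.
        for _ in range(100):
            yield [np.random.rand(1, 96, 96, 3).astype(np.float32)]

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_data
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.int8   # MCU-friendly integer I/O
    converter.inference_output_type = tf.int8
    tflite_model = converter.convert()

    # Int8 weights take roughly a quarter of the float32 model's space.
    print(f"quantized model: {len(tflite_model)} bytes")

Quantization mainly shrinks the weights; fitting a network's intermediate activations into a microcontroller's tiny SRAM is the harder problem, which is what the memory-efficient library described above targets.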


“This is indeed quite impressive,” says Jae-sun Seo, an associate professor at Arizona State University who works on resource-constrained machine learning.

“Commercial applications could include smart glasses, augmented reality devices that continuously run object detection,” Seo says. “And edge devices with on-device speech recognition without connecting to the cloud.”

John Cohn, a researcher at the MIT-IBM Watson AI Lab and part of the team behind the work, says some IBM customers are interested in using the technology. He says one obvious use would be in sensors designed to predict problems with industrial machinery. Currently, these sensors need to be wirelessly networked so that computation can be done remotely, on a more powerful system.

Another important application could be in medical devices. Han says he has begun working with colleagues at MIT on devices that use machine learning to continuously monitor blood pressure.

