Adam Zewe

A Chip for Decoding Data Transmissions Breaks Energy-Efficiency Records

The chip could enable lower-cost devices that perform better and use less hardware

Published: Wednesday, March 8, 2023 - 13:01

Imagine using an online banking app to deposit money into your account. Like all information sent over the internet, those communications could be corrupted by noise that inserts errors into the data.

To overcome this problem, senders encode data before they are transmitted, and then a receiver uses a decoding algorithm to correct errors and recover the original message. In some instances, data are received with reliability information that helps the decoder figure out which parts of a transmission are likely errors.


This new decoder chip uses a universal decoding algorithm, previously developed by MIT researchers, that can unravel any error-correcting code. It has broken the record for energy-efficient decoding, performing between 10 and 100 times better than other hardware. Image credit: Christine Daniloff, MIT

Researchers at MIT and elsewhere have developed a decoder chip that employs a new statistical model to use this reliability information in a way that’s much simpler and faster than conventional techniques.

Their chip uses a universal decoding algorithm the team previously developed that can unravel any error-correcting code. Typically, decoding hardware can process only one particular type of code. This new, universal decoder chip has broken the record for energy-efficient decoding, performing between 10 and 100 times better than other hardware.

This advance could enable mobile devices with fewer chips, since they would no longer need separate hardware for multiple codes. This would reduce the amount of material needed for fabrication, cutting costs and improving sustainability. By making the decoding process less energy intensive, the chip could also improve device performance and lengthen battery life. It could be especially useful for demanding applications like augmented and virtual reality and 5G networks.

“This is the first time anyone has broken below the 1 picojoule-per-bit barrier for decoding,” says Muriel Médard, the School of Science NEC Professor of Software Science and Engineering, a professor in the Department of Electrical Engineering and Computer Science, and a co-author of a paper presenting the new chip. “That is roughly the same amount of energy you need to transmit a bit inside the system. It had been a big symbolic threshold, but it also changes the balance in the receiver of what might be the most pressing part from an energy perspective. We can move that away from the decoder to other elements.”

Médard’s co-authors include lead author Arslan Riaz, a graduate student at Boston University; Rabia Tugce Yazicigil, assistant professor of electrical and computer engineering at BU; and Ken R. Duffy, then director of the Hamilton Institute at Maynooth University and now a professor at Northeastern University; as well as others from MIT, BU, and Maynooth University. The work was presented at the International Solid-State Circuits Conference in February.

Smarter sorting

Digital data are transmitted over a network in the form of bits (0s and 1s). A sender encodes data by adding an error-correcting code, which is a redundant string of 0s and 1s that can be viewed as a hash. Information about this hash is held in a specific code book. A decoding algorithm at the receiver, designed for this particular code, uses its code book and the hash structure to retrieve the original information, which may have been jumbled by noise. Because each algorithm is code-specific and most require dedicated hardware, a device would need many chips to decode different codes.
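
To make the code book idea concrete, here is a minimal sketch in Python built around a toy single-parity-check code. It is purely illustrative (the block size and the code itself are hypothetical choices, not the codes discussed in the paper): the encoder appends one redundant bit, and the receiver accepts only words that appear in the resulting code book.

```python
# A minimal sketch of the encode/check idea, using a toy single-parity-check
# code (hypothetical; not one of the codes evaluated in the paper).

from itertools import product

K = 3  # message bits per block (toy size)

def encode(message):
    """Append one redundant parity bit so every codeword has even parity."""
    parity = sum(message) % 2
    return tuple(message) + (parity,)

# The "code book": the set of every valid codeword the receiver will accept.
CODEBOOK = {encode(msg) for msg in product((0, 1), repeat=K)}

received = (1, 0, 1, 1)      # a word arriving at the receiver
print(received in CODEBOOK)  # False -> noise flipped at least one bit
```

Real codes are far longer and more structured than this, but the receiver-side membership check against a code book is the same idea.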

The researchers previously demonstrated GRAND (guessing random additive noise decoding), a universal decoding algorithm that can crack any code. GRAND works by guessing the noise that affected the transmission, subtracting that noise pattern from the received data, and then checking what remains in a code book. It guesses a series of noise patterns in the order they are likely to occur.
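
That guess-and-check loop is easy to sketch in a few lines of Python, under the simplifying assumption that, with no reliability information, noise patterns that flip fewer bits are more likely. The names and the toy code book below are illustrative, not the researchers’ implementation.

```python
# Illustrative sketch of the GRAND guess-and-check loop (not the authors' code).
# Noise patterns are tried from most to least likely; without reliability
# information, that ordering is simply "fewest bit flips first".

from itertools import combinations, product

def grand_decode(received, codebook):
    n = len(received)
    # Guess noise patterns in decreasing order of likelihood:
    # no flips, then every 1-flip pattern, then every 2-flip pattern, ...
    for weight in range(n + 1):
        for positions in combinations(range(n), weight):
            candidate = list(received)
            for i in positions:
                candidate[i] ^= 1            # subtract (XOR) the guessed noise
            if tuple(candidate) in codebook: # code book membership check
                return tuple(candidate)      # first hit is the decoded word
    return None

# Toy even-parity code book (same construction as the earlier sketch).
codebook = {w for w in product((0, 1), repeat=4) if sum(w) % 2 == 0}
print(grand_decode((1, 0, 1, 1), codebook))  # -> (0, 0, 1, 1), one flip away
```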

Data are often received with reliability information, also called soft information, that helps a decoder figure out which pieces are errors. The new decoding chip, called ORBGRAND (ordered reliability bits GRAND), uses this reliability information to sort data based on how likely each bit is to be an error.

But it isn’t as simple as ordering single bits. While the most unreliable bit might be the likeliest error, perhaps the third and fourth most unreliable bits together are as likely to be in error as the seventh most unreliable bit alone. ORBGRAND uses a new statistical model that sorts candidate noise patterns in this fashion, recognizing that a combination of bits can be as likely to be in error as some single bits.
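
One simple way to picture that ordering, as a sketch rather than the chip’s actual implementation: rank the bits from least to most reliable, score each candidate noise pattern by the sum of the ranks of the bits it flips, and test patterns in increasing score order. Under that scoring, flipping the third- and fourth-least-reliable bits (ranks 3 + 4 = 7) is treated as just as plausible as flipping the seventh-least-reliable bit alone (rank 7), matching the comparison above.

```python
# Sketch of an ORBGRAND-style ordering of noise patterns by soft information.
# Simplified illustration only; a practical implementation would generate
# patterns in this order on the fly rather than enumerating and sorting them
# all, as this sketch does.

from itertools import combinations

def ordered_noise_patterns(reliabilities, max_patterns=10):
    """Return the most plausible sets of bit positions to flip, in order.

    reliabilities[i] is a confidence score for bit i; lower means bit i is
    more likely to be in error.
    """
    n = len(reliabilities)
    # Rank positions: rank 1 = least reliable bit, rank 2 = next, and so on.
    order = sorted(range(n), key=lambda i: reliabilities[i])
    rank = {pos: r + 1 for r, pos in enumerate(order)}

    scored = []
    for weight in range(1, n + 1):
        for positions in combinations(range(n), weight):
            score = sum(rank[p] for p in positions)  # combined rank of the pattern
            scored.append((score, positions))
    scored.sort(key=lambda item: item[0])
    return [positions for _, positions in scored[:max_patterns]]

# Hypothetical soft information: lower numbers mean less reliable bits.
soft_info = [0.9, 0.1, 0.8, 0.3, 0.2, 0.7, 0.95, 0.6]
for positions in ordered_noise_patterns(soft_info):
    print(positions)
```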

“If your car isn’t working, soft information might tell you that it’s probably the battery,” Médard says. “But if it isn’t the battery alone, maybe it’s the battery and the alternator together that are causing the problem. This is how a rational person would troubleshoot. You’d say that it could actually be these two things together before going down the list to something that is much less likely.”

This is a much more efficient approach than that of traditional decoders, which instead examine the code structure and operate in a way that is generally designed for the worst case.

“With a traditional decoder, you’d pull out the blueprint of the car and examine each and every piece,” Médard explains. “You’ll find the problem, but it will take you a long time, and you’ll get very frustrated.”

ORBGRAND stops sorting as soon as a code word is found, which is often very soon. The chip also employs parallelization, generating and testing multiple noise patterns simultaneously so it finds the code word faster. Because the decoder stops working once it finds the code word, its energy consumption stays low even though it runs multiple processes simultaneously.
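
A toy sketch of that combination of parallel guessing and early exit, with a small software batch standing in for the chip’s parallel hardware lanes (again illustrative, reusing the toy even-parity code book from the earlier sketches):

```python
# Sketch of "test several noise patterns at once, stop at the first hit."
# A batch of candidates stands in for one cycle of parallel hardware checks.

from itertools import combinations, islice, product

def candidates(received):
    """Yield candidate codewords in fewest-flips-first order."""
    n = len(received)
    for weight in range(n + 1):
        for positions in combinations(range(n), weight):
            word = list(received)
            for i in positions:
                word[i] ^= 1
            yield tuple(word)

def parallel_grand(received, codebook, lanes=4):
    gen = candidates(received)
    while True:
        batch = list(islice(gen, lanes))   # one "cycle" of simultaneous guesses
        if not batch:
            return None                    # ran out of patterns without a hit
        for word in batch:
            if word in codebook:
                return word                # early exit keeps energy use low

codebook = {w for w in product((0, 1), repeat=4) if sum(w) % 2 == 0}
print(parallel_grand((1, 0, 1, 1), codebook))  # -> (0, 0, 1, 1)
```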

Record-breaking efficiency

When the team compared this approach to other chips, ORBGRAND decoded with maximum accuracy while consuming only 0.76 picojoules of energy per bit, breaking the previous performance record. ORBGRAND consumes between 10 and 100 times less energy than other devices.

One of the biggest challenges of developing the new chip came from this reduced energy consumption, Médard says. With ORBGRAND, generating noise sequences is now so energy-efficient that other processes the researchers hadn’t focused on before, like checking the code word in a code book, consume most of the effort.

“Now, this checking process, which is like turning on the car to see if it works, is the hardest part,” she says. “So, we need to find more efficient ways to do that.”

The team is also exploring ways to change the modulation of transmissions so they can take advantage of the improved efficiency of the ORBGRAND chip. They also plan to see how their technique could be used to more efficiently manage multiple transmissions that overlap.

The research is funded, in part, by the U.S. Defense Advanced Research Projects Agency (DARPA) and Science Foundation Ireland.

First published Feb. 22, 2023, by MIT News.

About The Author

Adam Zewe

Adam Zewe is a writer for the Massachusetts Institute of Technology, covering the electrical engineering and computer science beat in the MIT News Office.