Question

How can new MIT chips help with neural networks?

Answer
By Justin Stoltzfus | Last updated: May 31, 2018

New scientific work on neural networks may reduce their power and resource requirements to the point where engineers could put their powerful capabilities into much more diverse sets of devices.

That could have a huge impact on nearly everything in our lives, from how we prepare food to how we visit the doctor to how we get around by car or public transportation.

Think about how smartphones changed our lives – then think about having machine learning and artificial intelligence technologies built into these small, portable devices.

Some of this groundbreaking work is on display at MIT, where electrical engineering and computer science graduate students are exploring how to improve the way AI/ML systems are designed and built.

Specifically, the efforts of Avishek Biswas, an MIT graduate student, and his colleagues have been getting a lot of attention in the technology press.

TechCrunch describes how this evolution in neural network science could promote “computing at the edge” and put more powerful technologies into portable, battery-driven devices.

Forbes says that Biswas’ breakthrough could “put artificial intelligence inside your blender.”

In general, the MIT scientists' advances are making waves partly because it's easy to see how they could affect our consumer technologies, as well as those used for government and business purposes.

Essentially, the kind of processor evolution Biswas describes has to do with co-locating functions on the chip. As a Science Daily article explains, most traditional processors store memory outside the processing area and shuttle data back and forth between the two, and all that data movement consumes a great deal of power.
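To see why that matters, here is a rough back-of-the-envelope sketch in Python. The per-operation energy figures are illustrative assumptions (ballpark values often cited for conventional chips), not measurements of the MIT design:

    # Back-of-the-envelope: why shuttling data off-chip dominates the energy bill.
    # The per-operation figures are illustrative assumptions, not MIT measurements.
    PJ_PER_DRAM_FETCH = 640.0  # rough energy to fetch a 32-bit word from off-chip DRAM
    PJ_PER_MAC = 3.0           # rough energy for one on-chip multiply-accumulate

    def dot_product_energy_pj(num_weights, weights_off_chip=True):
        # Estimate the energy (in picojoules) for one dot product of num_weights terms.
        compute = num_weights * PJ_PER_MAC
        movement = num_weights * PJ_PER_DRAM_FETCH if weights_off_chip else 0.0
        return compute + movement

    n = 1_000_000  # a million-weight network layer
    print(f"weights fetched from DRAM: {dot_product_energy_pj(n, True) / 1e6:.0f} microjoules")
    print(f"weights kept on-chip:      {dot_product_energy_pj(n, False) / 1e6:.0f} microjoules")

Under these assumed numbers, keeping the weights next to the compute cuts the energy per dot product by roughly two orders of magnitude, which is the intuition behind the in-memory approach.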

Biswas talks about the “dot product,” the core arithmetic operation that neural networks perform over and over: multiplying inputs by weights and summing the results. The team also considers binary weights, which restrict each weight to one of just two values; this sacrifices a little accuracy but greatly simplifies the hardware, leaning on the same two-state logic that has underpinned computing since before the first personal computers were invented.
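As a minimal sketch (plain Python standing in for the chip's analog circuitry), here is the dot product a neuron computes, and how binary +1/-1 weights turn every multiplication into a simple addition or subtraction:

    import random

    def dot(inputs, weights):
        # The "dot product" at the heart of a neuron: multiply pairwise, then sum.
        return sum(x * w for x, w in zip(inputs, weights))

    def binary_dot(inputs, signs):
        # With binary (+1/-1) weights, every multiply collapses into an add or
        # a subtract, which is far cheaper to realize in hardware.
        return sum(x if s > 0 else -x for x, s in zip(inputs, signs))

    inputs = [random.uniform(-1.0, 1.0) for _ in range(8)]
    signs = [random.choice([1, -1]) for _ in range(8)]
    assert dot(inputs, signs) == binary_dot(inputs, signs)
    print(binary_dot(inputs, signs))

The assertion holds because multiplying by +1 or -1 changes nothing but the sign, so the expensive multiplier circuit can be dropped entirely.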

By promoting these kinds of hardware changes, scientists are providing more versatility for the machine learning and artificial intelligence tools that are changing how we use technology. As computing moves from purely deterministic, explicitly programmed systems to ones that mimic human brain activity, we're about to embark on a new adventure, with much more powerful technologies at our fingertips.


Written by Justin Stoltzfus | Contributor, Reviewer


Justin Stoltzfus is a freelance writer for various Web and print publications. His work has appeared in online magazines including Preservation Online, a project of the National Trust for Historic Preservation, and many other venues.
