New advanced neuromorphic computing could reduce the energy of machine learning by 1000 times

An interview with Assistant Professor Yiyang Li, MSE's newest faculty member

Yiyang Li’s latest research seeks to reduce energy consumption in computers with a device that contains both memory and logic operations. By co-locating memory and processor, the energy of machine learning could be reduced by a factor of 100-1000. We asked Dr. Li, who most recently worked at Sandia National Laboratories, to talk more about his research, as well as his background in materials science and what attracted him to U-M.

This work, recently published in Advanced Materials, was done in collaboration with scientists at Sandia National Laboratories and with the group of Prof. Wei Lu in the EECS department. 

What is the current challenge with computer energy consumption?

Computers presently expend a tremendous amount of energy on data-intensive operations like machine learning and artificial intelligence. One recent paper showed that certain machine learning operations produce as much CO2 as the life-cycle emissions of several passenger cars. This energy demand arises because the silicon transistors in computer chips can be designed into circuits for either information processing (logic) or information storage (memory), but not both. As a result, most of the energy is expended moving information between logic and memory, which is why processing large data sets consumes so much energy.

A highly attractive alternative is to devise new materials and devices with both logic and memory functionality; in some ways this is inspired by human neurons and synapses, which also combine logic and memory. We call this field “neuromorphic computing.”

A class of electronic devices called “memristors” can be utilized for this purpose, but it is very difficult to control how they switch among analogue information states. The reason is that memristors store information in discrete numbers of atomic defects in a nanosized “filament.” Because atomic defects, like gas molecules, are in constant random motion (kinetic theory), it is very difficult to reliably control the switching behavior of a few atomic defects. Trying to change the memristor state is like flipping a coin or rolling a die—we don’t know if we will be successful in any one instance. While we have been able to switch memristors between a “1” and a “0,” reliably switching among more analogue states (e.g., 0, 1, 2, 3, … 100) is nearly impossible.

How does your research address this problem? 

We devised a memristor such that all atoms in the lattice are used to store information, as opposed to just a few atomic defects in the filament. By using the average behavior of millions (or more) of atoms, we solved the challenge of stochastic switching, because the average behavior of all the atoms is deterministic even if each individual atom is random. By analogy, if I roll a die enough times, then I can say with reasonable confidence that 1/6 of the rolls will result in a “6.” By utilizing the statistical behavior of large numbers of atoms, we were able to switch reliably and reproducibly among more than 100 analogue states.
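The law-of-large-numbers intuition here can be sketched in a few lines of Python. This is a toy model, not the device physics: `p_switch`, an assumed per-atom switching probability, stands in for the real kinetics. It contrasts a state read from a few random atoms with one read from a whole lattice.

```python
import random

def measured_state(n_atoms, p_switch=0.5, trials=5):
    """Return the fraction of atoms that switched, for several attempts.

    Each atom independently switches with probability p_switch (a toy
    stand-in for the random motion of atomic defects).
    """
    results = []
    for _ in range(trials):
        switched = sum(random.random() < p_switch for _ in range(n_atoms))
        results.append(switched / n_atoms)
    return results

random.seed(0)

# A filament storing information in just a few defects: each attempt
# can land on a very different fraction.
few = measured_state(n_atoms=3)

# Using the whole lattice: every attempt lands very close to p_switch,
# so the stored analogue state is effectively deterministic.
many = measured_state(n_atoms=1_000_000)

print("few atoms: ", [round(x, 2) for x in few])
print("many atoms:", [round(x, 4) for x in many])
```

With a million atoms the measured fraction barely varies between attempts, which is why averaging over the whole lattice makes many closely spaced analogue states distinguishable.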

What drew you to this particular line of research?

My PhD work at Stanford University was on batteries, which are electrochemical devices used to store energy. I wanted to see if the same principles could be used to store information. On a fundamental materials science level, these devices operate very similarly to batteries and fuel cells, but for a different application. Although this current work is applied toward computation and machine learning, the research itself is on materials and materials properties.

What real-world impact do you anticipate your research will have?

The ultimate impact is to reduce the energy consumption of computing, especially for machine learning and artificial intelligence. This is especially crucial for energy-limited applications like a smartphone or an autonomous vehicle. Presently, voice-recognition software like Siri often requires an Internet connection so the data can be processed centrally, because processing it locally would consume too much energy. However, if we can make this process more energy efficient, we can start doing these operations locally. Beyond portable applications, we envision this work will also reduce the aggregate energy consumption of computation and artificial intelligence, which often comes from fossil energy sources.

How did you become interested in materials science and engineering?

I became interested during my sophomore year of college because of the strong connection between materials science and renewable energy technologies. However, because I went to a very small engineering school, we did not have a formal materials science major, so I majored in electrical engineering and applied to graduate school in materials science. My research has always had some connection to energy: my graduate research focused on Li-ion batteries, and this work focuses on energy-efficient computing.

Why did you come to U-M?

I joined Sandia National Laboratories as a postdoc after my graduate work. During that time, I gave seminars at Wayne State and some other universities. On those visits, I again saw the vibrancy and excitement of a university, primarily a result of all the students. This helped convince me to apply for faculty positions. At the time, I didn’t think it would lead me back to the state of Michigan, but U-M offers a great balance of teaching and research that was just right for me. Moreover, I look forward to working with the other great scholars around campus. Even in this paper, I worked with Prof. Wei Lu and PhD student Sangmin Yoo on adding a modeling and simulation component to this research.

What classes are you teaching at U-M?

In the fall semester, I’m teaching MATSCIE 550, which is an introductory course for graduate students who, like me, had a different undergraduate major. It’s designed to quickly introduce the core concepts of an undergraduate MSE degree. I had some struggles at the beginning of graduate school, and I think a class like this would have benefited me, along with the other students who did not have an MSE undergraduate major. In the winter term, I will be teaching the undergraduate senior design class with Prof. Max Shtein. My undergraduate school was founded on the premise of design thinking and project-based learning, so I hope to contribute to that effort at U-M.