New materials are critical components of emerging technologies that promise to be major growth areas for the economy, such as less expensive solar power, electric-car batteries that can go longer between charges, lightweight portable electronic devices, and implantable medical devices for personalized medicine. But the journey from new material to product typically takes one to two decades. That’s in large part because new materials require advanced manufacturing technologies that can take many years to develop.
The White House hopes to cut that time in half by investing $100 million in a Materials Genome Initiative aimed at encouraging more efficient use of the computational modeling tools that researchers use to predict the properties of new materials. The initiative, which is part of the White House’s Advanced Manufacturing Partnership, will support open access to these models and databases across the materials science community in hopes of connecting academics with industry earlier in the development process.
As it stands now, scientists working with new materials don’t take manufacturing issues into account early enough, says Cyrus Wadia, assistant director for clean energy and materials R&D at the White House Office of Science and Technology Policy. As a result, their research can lead them into dead ends. The way to change that, he believes, is to encourage the whole materials science community, from academics to manufacturers, to share data and computational tools—the “materials genome.” Wadia says he wants researchers to ask themselves, “Who’s done it before, what did they learn, and what can the market bear?”
Materials scientists have been using predictive models with varying degrees of success over the past 20 years, manipulating data about properties such as melting point, conductivity, or the way a compound reacts with others to predict whether a material is suitable for a particular application such as a battery electrode. The computations involved are very complicated. But once the code to predict promising candidates for a particular application is written, it can be applied to test the potential of any material, says Gerbrand Ceder, a professor of materials science at MIT who specializes in computational modeling of new battery-electrode materials. Unfortunately, there’s been no infrastructure to help researchers share their data and the code used to crunch it, and few of the models have taken manufacturing issues into account.
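The kind of screening Ceder describes can be pictured as a filter over a database of candidate materials: encode the criteria once, then apply them to any material. Below is a minimal, purely illustrative sketch in Python; the materials, property names, and thresholds are invented for the example and are not from any real database.

```python
# Illustrative materials-screening sketch. All candidate names, property
# values, and cutoffs below are hypothetical, chosen only to show the idea
# that criteria written once can be applied to any material.

candidates = [
    {"name": "material_A", "voltage_V": 3.8, "capacity_mAh_g": 170, "stable": True},
    {"name": "material_B", "voltage_V": 2.1, "capacity_mAh_g": 300, "stable": True},
    {"name": "material_C", "voltage_V": 4.0, "capacity_mAh_g": 150, "stable": False},
]

def screen_electrodes(materials, min_voltage=3.0, min_capacity=140):
    """Return names of candidates meeting voltage, capacity, and stability cutoffs."""
    return [
        m["name"]
        for m in materials
        if m["voltage_V"] >= min_voltage
        and m["capacity_mAh_g"] >= min_capacity
        and m["stable"]
    ]

print(screen_electrodes(candidates))  # only material_A passes all three checks
```

In practice the property values themselves come from the complicated quantum-mechanical computations the article mentions; the point of a shared infrastructure is that the data and the screening code could be reused across groups instead of being rebuilt in each lab.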
“The problem with scaling and manufacturing is that you don’t understand everything,” says Ceder. “If we could make things exactly how we made them in the lab, there’d be no problem.” But it doesn’t work that way. Minor differences in manufacturing conditions are inevitable when scaling up from making grams of a material to making it by the ton. And the materials coming out of academic labs today are harder to make than the materials of the past. Many advanced materials gain their extraordinary properties through molecular or even atomic-scale structural precision, and making them is not like making, for example, steel. “You make steel by melting metals together in a huge vat,” says Alexander King, director of the Ames National Laboratory in Iowa. In manufacturing advanced materials, says King, “you have to use more controlled methods, or the atoms won’t do what you want.” Inconsistencies in temperature control, mixing, or other factors can lead to failure. And techniques used to achieve atomic-scale precision in the lab can be difficult to translate to large-scale manufacturing.
Making large batches of a complex material consistently in a factory almost always requires processes different from those used to make small batches in the lab. That means more money, time, and risk. For example, say a research lab has made one-inch-square working solar cells whose active layer is created by printing a nanoparticle ink. Commercializing such a technology requires a company to develop several manufacturing techniques. First it has to figure out how to make the nanoparticles in large batches; then it must find an equipment maker to provide a customized machine for printing those inks over square meters, or develop that equipment itself. But it may not even get to that stage. What if, when researchers try to make large numbers of these solar cells, they can’t get the nanoparticles arranged in a consistent way, and the cells don’t work? At any stage, a fatal flaw might be uncovered.
The Materials Genome Initiative aims to predict such manufacturing problems and steer scientists and engineers away from them earlier in the development phase. The problems related to scaling up from lab bench to factory aren’t anything special, says Ceder. The main challenge right now is that individual groups and companies have been developing snippets of code and amassing data on new and existing materials, but they have no way to share this information. They file a patent, get a paper published, and it stops there. The Materials Genome Initiative will gather all such data into a central database.
Academic culture is more amenable to sharing data than corporate culture, but Wadia, who has been talking with representatives of the major materials companies about this initiative over the last few years, believes corporate labs will also contribute. Indeed, it would be difficult for such a project to succeed without them. “It will start in pockets of communities, but we have to get a critical mass for this to work,” he says. Companies that make advanced materials are already generating a large amount of data through daily monitoring of manufacturing operations, and he hopes they will share this kind of information with the Materials Genome Initiative.
“We think a key role industry can play is providing our perspective on how materials are used, designed, and evaluated for industrial product applications,” says Christine Furstoss, technical director of manufacturing and materials technologies at GE Global Research. “We use a large number of materials that are applied across multiple industries and have a keen interest in helping to advance the performance and manufacturability of such materials.”
The initial $100 million will be distributed among four government agencies: the National Institute of Standards and Technology, the Department of Energy, the National Science Foundation, and the Department of Defense. White House representatives would not comment on how much money would go to each agency and for what specific projects, but the emphasis, says Wadia, is on building computational infrastructure. Just what that infrastructure should look like will be hashed out over the next year. Funding will also go to educational initiatives.
“Novel materials are key enablers for manufacturing,” says Ceder. “If you’re going to increase manufacturing in the U.S., you’re not going to do that on old technologies.”