Every so often, the search for functional materials--materials engineered with properties and characteristics that allow them to do something spectacular--delivers a fundamental change to the way we live. Five thousand years ago, the discovery and use of bronze enabled no less a task than the ushering in of modern civilization. More recently, silicon's properties gave rise to the Information Age, and we've seen the advent of products we now take for granted: plastic, stainless steel, composites, titanium, Gorilla Glass, even clothing that will not stain.
Yet discovering new materials is a lengthy process, and Joshua Agar, an assistant professor of materials science and engineering at Lehigh University, says it comes about in one of three basic ways.
"Throughout human history," he says, "we've seen materials development occur over thousands of years of trial and error that advances across generations and entire civilizations. Efficiencies began to improve dramatically when society began to conduct physical experiments in increasingly specialized laboratories. This is certainly an improvement, but it comes at significant cost, because the experiments involved are time-consuming and expensive. Even more recently, though, scientists have begun to use computational simulation to drive research in a wide array of fields, including materials discovery. This use of digital technologies is far more efficient than either of the previous methods--yet even here, we can do better."
Increasing the speed of materials discovery is but one aim of a sprawling new project being conducted by a research team from Lehigh and the University of California, Berkeley, assembled with some $600,000 in support from the National Science Foundation (NSF) through its recently announced TRIPODS+X program.
"Recent advances in what are called embarrassingly parallel computational methods have enabled machine learning that can draw conclusions from raw data by considering an interlocking multitude of co-dependencies that would otherwise be beyond human comprehension," Agar continues. "Currently, in order to develop an understanding of these deep, nuanced relationships in the data, we utilize computational techniques that amount to brute-force computation to perform a logical chain of highly efficient functions. It's like using a flamethrower to crisp up a crème brûlée."
Through the project, Agar and his team will create what they term "an efficient Bayesian-guided computational framework" that will guide the development of a neural network--a computer system modeled on the human brain and nervous system--that will serve to turbo-charge the search for new and advanced materials with enhanced electrical, thermal, mechanical, and magnetic characteristics.
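The article does not publish the team's code, but the core idea of a Bayesian-guided search can be sketched in a few lines. The example below is purely illustrative: a Gaussian-process surrogate with an upper-confidence-bound rule picks which "experiment" to run next on a toy one-dimensional property landscape. The objective function, kernel length scale, and function names are all assumptions, not the project's actual framework.

```python
# Illustrative sketch of Bayesian-guided search (not the team's actual code).
# A Gaussian-process surrogate proposes the next "experiment" to run.
import numpy as np

def objective(x):
    # Toy stand-in for an expensive simulation or experiment; peak near x = 0.6.
    return np.exp(-40.0 * (x - 0.6) ** 2)

def rbf_kernel(a, b, length=0.1):
    # Squared-exponential kernel between two sets of 1-D points.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    # Gaussian-process posterior mean and variance at the query points.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_query, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    v = np.linalg.solve(K, Ks.T)
    var = 1.0 - np.sum(Ks * v.T, axis=1)
    return mean, np.maximum(var, 0.0)

def bayesian_search(n_iters=15, beta=2.0, seed=0):
    rng = np.random.default_rng(seed)
    x_train = rng.uniform(0.0, 1.0, size=3)      # a few initial "experiments"
    y_train = objective(x_train)
    grid = np.linspace(0.0, 1.0, 201)            # candidate compositions
    for _ in range(n_iters):
        mean, var = gp_posterior(x_train, y_train, grid)
        ucb = mean + beta * np.sqrt(var)         # upper-confidence-bound rule
        x_next = grid[np.argmax(ucb)]            # most promising candidate
        x_train = np.append(x_train, x_next)
        y_train = np.append(y_train, objective(x_next))
    return x_train[np.argmax(y_train)]
```

The appeal is sample efficiency: instead of evaluating the whole grid, the surrogate's uncertainty steers each costly evaluation toward where it will be most informative.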
According to Agar, the project will apply "deep learning neural networks" to more efficiently perform functions associated with traditional physics-based computational simulation. The team intends its work eventually to accelerate other researchers' discovery and synthesis of novel materials; if successful, they believe the technologies they are developing could also have an impact far beyond materials development.
"We believe that the increased efficiency of our neural network model may facilitate the asking of scientific questions that are currently computationally intractable," he says. "The proposed work is specifically focused on enabling a neural network to discover what we call strain-induced polar phases and phase competition in materials. But working with our partners at Berkeley, we believe that some of the foundational data science methods developed through this effort may also help to support astronomy's understanding of the large-scale structure of the universe--cosmology in its grandest sense. It is even possible that these concepts will prove to be applicable to computational simulations across a wide range of scientific disciplines and adjacent fields."
"These application areas have little to do with the work conducted in my lab, arranging atoms into crystals and pulling on them to perform various tasks," he continues. "But the underlying data science is, in fact, directly applicable. It certainly demonstrates how fundamentally valuable research in data and computational methods can prove to be across a host of domains, and thus supports the notion of clustering researchers with different perspectives around similar problems."
In this way, Agar believes the TRIPODS+X program is a perfect fit for his research focus and Lehigh's strategic directions more broadly.
"I'm proud our lab serves in some ways as an intersection point for Lehigh's new Institute for Functional Materials and Devices, the Institute for Data, Intelligent Systems and Computation, and the Nano Human Interfaces Initiative," he says. "Across our campus, Lehigh is showing commitment to efforts in team research at the intersection of disciplines. The NSF's TRIPODS+X initiative, in supporting projects like these, demonstrates broader belief in this approach as well."
Earlier in September, the NSF announced the awarding of $8.5 million in TRIPODS+X grants to expand the scope of the cross-disciplinary TRIPODS institutes into broader areas of science, engineering, and mathematics. In total, NSF will support 19 collaborative projects at 23 universities. The supported teams will bring new perspectives to bear on complex and entrenched data science problems.
"The multidisciplinary approach for addressing the increasing volume and complexity of data enabled through the TRIPODS+X projects will have a profound impact on the field of data science and its use," said Jim Kurose, assistant director for Computer and Information Science and Engineering at NSF, in the organization's September 11, 2018, announcement. "This impact will be sure to grow as data continues to drive scientific discovery and innovation."
Source: EurekAlert!, the online, global news service operated by AAAS, the science society: https://www.eurekalert.org/pub_releases/2018-09/lu-atb092418.php