Artificial Intelligence for the Development of New Materials, Processes and Properties
The design of new materials, microstructures and manufacturing processes has for millennia been based on trial and error.
The knowledge acquired through chance discoveries, systematic experiments, and transfer from neighbouring scientific disciplines produced many empirical rules and, later, predictive theories. Combined with computer simulations, these have today matured into a backbone of materials science, enabling the community to discover and improve advanced materials and processes on the basis of detailed understanding. This classical ‘intelligent design’ approach is currently being challenged, and partly displaced, by the increasing use of advanced statistical analysis and artificial intelligence applied to large data sets.
Indeed, not only the automation of material production and testing, but also the rapid evolution of advanced characterization techniques and materials simulations, dramatically increases the volume of relevant information collected by our researchers. But unlike the dragons of ancient tales, who were content to guard the treasures they had amassed, we aim to turn this treasure into scientific progress by combining advanced data analysis and artificial intelligence with established materials science research.
MPIE research addresses in particular the following challenges and opportunities:
1. Experiments such as Atom Probe Tomography or Scanning Transmission Electron Microscopy produce very large data sets (several GB to TB per experiment) that encode the inherent microstructural patterns of the investigated sample. However, such data scatter due to the omnipresent statistical fluctuations in real materials and the unavoidable noise in any experiment. Similar challenges arise in computer simulations that employ stochastic sampling to cover high-dimensional parameter spaces. Big-data analysis helps to reveal hidden patterns that emerge only when sufficient data is accumulated.
2. Our experimentalists know how to identify and interpret microstructural and other features, but typically spend far more time (100–1000×) evaluating the data than producing them. We are developing tools for semi-automatic analysis to let scientists focus on the exceptional aspects and to reduce human error in routine tasks.
3. Data should be kept only if it is relevant and accessible. We therefore work on algorithms that filter for important data and reduce raw-data storage through preprocessing. Providing proper metadata that tracks the history of the sample, the measurement details, and the data-processing steps is crucial to ensure data quality and to enable the reuse of available data in new contexts. Our goal is to attach and augment such metadata automatically with the tools we develop, and to make it accessible to other automated (meta)tools by linking it to materials science ontologies. To provide an infrastructure for creating, managing, and analysing data in a highly automated fashion from day 1 (i.e. for scientists without prior data-science experience), we developed the pyiron framework.
4. Artificial intelligence methods can be trained to make predictions from data without building conventional scientific models. We see great opportunities in this, but not for replacing traditional science: rather, we explore how artificial intelligence may guide theoretical and experimental setups towards the most interesting conditions for exciting discoveries – after all, 90–99% of all research efforts turn out to be unsuccessful, scientific dead ends, or simply boring. Reducing that fraction by even a small amount will boost progress!
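The point made in (1) – that patterns hidden by statistical fluctuations emerge only once enough data is accumulated – can be illustrated with a minimal, purely hypothetical sketch: a weak periodic "microstructural" signal buried in strong noise becomes visible after averaging many repeated measurements. All functions and numbers here are toy assumptions, not any actual MPIE analysis pipeline.

```python
import math
import random

random.seed(0)

def noisy_measurement(n_points=100, noise=3.0):
    """One simulated experiment: a weak periodic pattern plus strong noise."""
    return [math.sin(2 * math.pi * i / 20) + random.gauss(0.0, noise)
            for i in range(n_points)]

def averaged_signal(n_repeats):
    """Accumulate many measurements; the noise averages out, the pattern stays."""
    total = [0.0] * 100
    for _ in range(n_repeats):
        for i, v in enumerate(noisy_measurement()):
            total[i] += v
    return [t / n_repeats for t in total]

def correlation_with_pattern(signal):
    """Pearson correlation between a signal and the true underlying pattern."""
    pattern = [math.sin(2 * math.pi * i / 20) for i in range(len(signal))]
    ms = sum(signal) / len(signal)
    mp = sum(pattern) / len(pattern)
    cov = sum((s - ms) * (p - mp) for s, p in zip(signal, pattern))
    norm_s = math.sqrt(sum((s - ms) ** 2 for s in signal))
    norm_p = math.sqrt(sum((p - mp) ** 2 for p in pattern))
    return cov / (norm_s * norm_p)

# With few repeats the pattern is drowned in noise; with many it emerges.
few = correlation_with_pattern(averaged_signal(5))
many = correlation_with_pattern(averaged_signal(2000))
```

After 2000 repeats the averaged signal correlates almost perfectly with the true pattern, whereas with only 5 repeats the correlation is weak and unreliable – the same reason large accumulated data sets are needed before subtle microstructural signatures become statistically detectable.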
To prepare raw data from scanning transmission electron microscopy for analysis, pattern-detection algorithms are being developed that automatically identify higher-order features such as crystalline grains, lattice defects, etc. from atomically resolved measurements.
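One simple building block behind such grain identification can be sketched as follows. This is a toy illustration, not the actual MPIE algorithm: atom positions are grouped into "grains" by flood-filling connected components of atoms that lie within a neighbour-distance cutoff, so that two lattices separated by a wide boundary come out as two grains.

```python
from collections import deque

def find_grains(atoms, cutoff=1.5):
    """Group 2D atom positions into 'grains': connected components in which
    atoms closer than `cutoff` belong to the same grain.
    O(N^2) neighbour search -- fine for a toy example."""
    unassigned = set(range(len(atoms)))
    grains = []
    while unassigned:
        seed = unassigned.pop()
        grain, queue = [seed], deque([seed])
        while queue:
            i = queue.popleft()
            xi, yi = atoms[i]
            close = [j for j in unassigned
                     if (atoms[j][0] - xi) ** 2 + (atoms[j][1] - yi) ** 2
                     < cutoff ** 2]
            for j in close:
                unassigned.discard(j)
                grain.append(j)
                queue.append(j)
        grains.append(grain)
    return grains

# Two 4x4 square lattices (spacing 1.0) separated by a gap wider than the cutoff.
grain_a = [(float(x), float(y)) for x in range(4) for y in range(4)]
grain_b = [(x + 10.0, float(y)) for x in range(4) for y in range(4)]
grains = find_grains(grain_a + grain_b)
# -> two grains of 16 atoms each
```

Real STEM data additionally requires locating the atoms in noisy images first and distinguishing grains by lattice orientation rather than by gaps, but the clustering step follows the same connected-component logic.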
New product development in the steel industry nowadays demands faster development of new alloys with increased complexity. Moreover, for these complex new steel grades it is more challenging to control the properties along the process chain. This leads to more experimental testing, more plant trials, and more rejections due to unmet requirements. Steel companies therefore wish to have a sophisticated offline through-process model that captures the evolution of microstructure and engineering properties during manufacturing.
Crystal Plasticity (CP) modeling is a powerful and well-established computational materials science tool for investigating mechanical structure–property relations in crystalline materials. It has been successfully applied to study diverse micromechanical phenomena, ranging from strain hardening in single crystals to texture evolution in polycrystalline aggregates.
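A core ingredient of any CP model is Schmid's law, which resolves an applied uniaxial stress onto an individual slip system: the resolved shear stress is the applied stress times the Schmid factor m = cos φ · cos λ. The sketch below computes this generic textbook quantity; it is an illustration of the underlying physics, not code from any particular CP package.

```python
import math

def schmid_factor(load_dir, slip_normal, slip_dir):
    """Schmid factor m = cos(phi) * cos(lambda): the fraction of an applied
    uniaxial stress resolved as shear stress on a given slip system.
    phi: angle between load axis and slip-plane normal;
    lambda: angle between load axis and slip direction."""
    def unit(v):
        n = math.sqrt(sum(c * c for c in v))
        return [c / n for c in v]
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    l, n, d = unit(load_dir), unit(slip_normal), unit(slip_dir)
    return dot(l, n) * dot(l, d)

# FCC example: tensile axis [001], slip plane (111), slip direction [-101]
# (the slip direction lies in the slip plane: [-1,0,1].[1,1,1] = 0).
m = schmid_factor([0, 0, 1], [1, 1, 1], [-1, 0, 1])
# cos(phi) = 1/sqrt(3), cos(lambda) = 1/sqrt(2)  ->  m = 1/sqrt(6) ~ 0.408
```

In a full CP simulation this resolution is carried out for all slip systems of every grain at every integration point, and slip activity then feeds back into hardening and texture evolution.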
Advanced microscopy and spectroscopy offer unique opportunities to study the structure, composition, and bonding state of individual atoms within complex engineering materials. With the help of aberration correction, such information can be collected at a spatial resolution as fine as 0.1 nm.
Complex simulation protocols combine distinctly different computer codes and have to run on heterogeneous computer architectures. To enable such protocols, the CM department has developed pyiron.
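The idea of such a protocol can be sketched schematically in plain Python – this is a stand-in illustration of chaining heterogeneous simulation steps with a provenance log, not the actual pyiron API; the step names and the energy numbers are invented for the example.

```python
def run_protocol(steps, data):
    """Chain heterogeneous simulation steps; keep a provenance log so every
    result can be traced back through the codes that produced it.
    (Schematic only -- a stand-in for a real workflow framework.)"""
    log = []
    for name, step in steps:
        data = step(data)            # each step wraps a different 'code'
        log.append((name, data))     # record what each step produced
    return data, log

# Two hypothetical 'codes': a structure relaxation and a property evaluation.
relax = lambda s: {**s, "energy": -4.5 * s["n_atoms"]}
evaluate = lambda s: {**s, "energy_per_atom": s["energy"] / s["n_atoms"]}

result, provenance = run_protocol(
    [("relaxation", relax), ("evaluation", evaluate)],
    {"n_atoms": 32},
)
# result["energy_per_atom"] == -4.5; provenance lists both steps in order
```

A real framework additionally handles job submission to different computer architectures, file management, and storage of the provenance as metadata, but the chain-and-record pattern is the same.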
Within the EU project "ADVANCE – Sophisticated experiments and optimisation to advance an existing CALPHAD database for next generation TiAl alloys", MPIE is collaborating with Thermocalc-Software AB (Stockholm), Montanuniversität Leoben, and Helmholtz-Zentrum Geesthacht. At MPIE the focus lies on the production and heat treatment of model alloys; analysing them by metallography, X-ray diffraction, electron probe microanalysis, and differential thermal analysis yields the necessary data. Colleagues in Leoben perform atom probe tomography and transmission electron microscopy, while in situ synchrotron X-ray diffraction is carried out in Geesthacht. All obtained data are assessed and optimised at Thermocalc and checked for consistency before being implemented into the database.