Artificial Intelligence and Digitalization

The design of new materials, microstructures and manufacturing processes has over millennia been based on trial and error. The knowledge acquired through chance discoveries, systematic experiments, and transfer from neighbouring scientific disciplines produced many empirical rules and, later, predictive theories. Combined with computer simulations, these have matured into a backbone of materials science, enabling the community to discover and improve advanced materials and processes on the basis of detailed understanding. This classical ‘intelligent design’ approach is currently being challenged, and partly displaced, by the increasing use of advanced statistical analysis and artificial intelligence applied to large data sets. Indeed, not only the automation of material production and testing, but also the rapid evolution of advanced characterization and materials simulation, dramatically increases the volume of relevant information collected by our researchers. But unlike the dragons of ancient tales, who were content to guard the treasures they had amassed, we aim to turn this treasure into scientific progress by combining advanced data analysis and artificial intelligence with existing materials science research.

MPIE research addresses, in particular, the following challenges and opportunities:

1. Experiments such as atom probe tomography or scanning transmission electron microscopy produce very large data sets (several GB to TB per experiment) that encode the inherent microstructural patterns of the investigated sample. However, such data scatter due to the omnipresent statistical fluctuations in real materials and the unavoidable noise of the experiment. Similar challenges arise in computer simulations that employ stochastic sampling to cover high-dimensional parameter spaces. Big-data analysis helps to reveal hidden patterns that emerge only once sufficient data have been accumulated.

2. Our experimentalists know how to identify and interpret microstructural and other features, but usually spend far more time (100–1000×) evaluating the data than producing them. We are developing tools for semi-automatic analysis that let scientists focus on the exceptional aspects and reduce human error in routine tasks.

3. Data should be kept only if they are relevant and accessible. We therefore work on algorithms that filter for important data and reduce raw-data storage by preprocessing. Providing proper metadata to track the history of the sample, the measurement details, and the data-processing steps is crucial to ensure data quality and to enable the reuse of available data in new contexts. Our goal is to attach and augment metadata automatically with the tools we develop, and to make this metadata accessible to other automatic (meta)tools by linking to materials science ontologies. To provide an infrastructure for creating, managing, and analysing data in a highly automated fashion from day one (i.e. for scientists without prior data-science experience), we developed the pyiron framework.

4. Artificial-intelligence methods can be trained to make predictions from data without building conventional scientific models. We see great promise in this, but not for replacing traditional science: rather, we explore how artificial intelligence may guide theoretical and experimental setups to the most interesting conditions for exciting discoveries – after all, 90–99% of all research efforts turn out to be unsuccessful, scientific dead ends, or simply boring. Reducing that fraction by even a small amount will boost progress!
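The pattern-amid-noise situation in point 1 can be sketched in a few lines: a weak periodic signal that is invisible in any single noisy acquisition emerges once enough repeated measurements are averaged. All numbers below are illustrative stand-ins, not data from an actual experiment.

```python
import numpy as np

rng = np.random.default_rng(0)

# A weak "microstructural" pattern buried in measurement noise.
x = np.linspace(0, 2 * np.pi, 200)
signal = 0.1 * np.sin(3 * x)          # weak underlying pattern

def measure(n_repeats):
    """Average n_repeats noisy acquisitions of the same signal."""
    noise = rng.normal(0.0, 1.0, size=(n_repeats, x.size))
    return (signal + noise).mean(axis=0)

# Correlation with the true pattern improves as data accumulate.
for n in (1, 100, 10000):
    r = np.corrcoef(measure(n), signal)[0, 1]
    print(f"{n:6d} acquisitions: correlation with true pattern = {r:.2f}")
```

A single acquisition is dominated by noise; after ten thousand, the averaged trace tracks the hidden pattern almost perfectly.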
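The provenance idea in point 3 can be illustrated with a minimal metadata record that travels with a measurement and logs every processing step. The class and field names below are hypothetical illustrations, not the actual MPIE or pyiron schema.

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class MeasurementRecord:
    """Hypothetical metadata record attached to one measurement."""
    sample_id: str
    instrument: str
    settings: dict
    processing_steps: list = field(default_factory=list)

    def apply_step(self, name: str, **params):
        """Log a processing step so the data's provenance stays complete."""
        self.processing_steps.append({"step": name, "params": params})

record = MeasurementRecord(
    sample_id="FeCr-042",                     # illustrative sample name
    instrument="APT",
    settings={"pulse_fraction": 0.2, "temperature_K": 60},
)
record.apply_step("noise_filter", method="median", window=5)
record.apply_step("reconstruction", protocol="standard")

# Serialising to JSON keeps the metadata machine-readable for other tools.
print(json.dumps(asdict(record), indent=2))
```

Linking such records to a shared ontology is what turns isolated files into reusable data.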
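The guidance idea in point 4 can be sketched as a toy active-learning loop: instead of sampling a parameter space on a blind grid, fit simple surrogate models to the data gathered so far and run the next "experiment" where an ensemble of fits disagrees most. The target function and all settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_experiment(x):
    """Stand-in for a costly measurement (unknown to the algorithm)."""
    return np.sin(5 * x) * np.exp(-x)

candidates = np.linspace(0, 2, 200)
measured_x = list(rng.uniform(0, 2, 4))       # a few initial experiments
measured_y = [expensive_experiment(xi) for xi in measured_x]

for _ in range(10):
    # Ensemble of cubic fits on jittered copies of the data so far.
    preds = []
    for _ in range(20):
        y_jitter = np.array(measured_y) + rng.normal(0, 0.02, len(measured_y))
        coeffs = np.polyfit(measured_x, y_jitter, deg=3)
        preds.append(np.polyval(coeffs, candidates))
    uncertainty = np.std(preds, axis=0)
    # Run the next experiment where the ensemble is least certain.
    x_next = candidates[np.argmax(uncertainty)]
    measured_x.append(x_next)
    measured_y.append(expensive_experiment(x_next))

print(f"experiments run: {len(measured_x)}")
```

Real campaigns would use proper surrogates (e.g. Gaussian processes), but the loop structure — model, quantify uncertainty, measure where uncertainty is largest — is the essence of steering experiments away from the boring 90–99%.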

Computational materials science

Video explaining digitalization in materials science more

Interdependence of strain rate, size and defect density in the deformation of 3D-printed microparticles

Achieving statistical significance in materials testing is a challenge that researchers have been trying to overcome through miniaturization. However, this approach is still limited to 4–5 tests per parameter variation (e.g. size, orientation, grain size, composition), as pillars have to be fabricated and tested one by one. In this project, we aim to fabricate arrays of well-defined, precisely located particles that can be tested in an automated manner. With a statistically significant number of samples tested per parameter variation, we expect to apply more complex statistical models and machine-learning techniques to analyse this complex problem. more
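Why the jump from 4–5 to hundreds of automated tests matters can be seen from the confidence interval of a mean strength. The strength distribution below is simulated with illustrative numbers, not measured data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated micropillar strengths (MPa) drawn from a scattered distribution.
true_mean, true_scatter = 850.0, 120.0        # illustrative values

def ci_halfwidth(n_tests):
    """Approximate 95% confidence half-width of the mean from n tests."""
    strengths = rng.normal(true_mean, true_scatter, n_tests)
    return 1.96 * strengths.std(ddof=1) / np.sqrt(n_tests)

for n in (5, 100):
    print(f"{n:4d} tests: mean strength known to within "
          f"±{ci_halfwidth(n):.0f} MPa")
```

The half-width shrinks with the square root of the sample count: five tests pin the mean to roughly ±100 MPa, a hundred automated tests to roughly ±25 MPa, which is the regime where richer statistical models become meaningful.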

Microstructure Data Mining in Atom Probe Tomography via Machine Learning

Atom probe tomography (APT) provides three-dimensional (3D) chemical mapping of materials at sub-nanometre spatial resolution. In this project, we develop machine-learning tools to facilitate the microstructure analysis of APT data sets in a well-controlled way. more
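A typical machine-learning task on APT point clouds is to separate solute atoms clustered into precipitates from the dilute random background. A simplified, DBSCAN-like density grouping is sketched below on synthetic positions; all sizes and densities are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic reconstruction: dilute background plus two tight precipitates.
matrix = rng.uniform(0, 50, size=(300, 3))            # background atoms (nm)
centres = np.array([[10.0, 10.0, 10.0], [40.0, 35.0, 20.0]])
precipitates = np.vstack(
    [c + rng.normal(0, 0.8, size=(40, 3)) for c in centres])
points = np.vstack([matrix, precipitates])

def cluster(points, eps=2.0, min_neighbors=5):
    """Label each point: -1 = background, otherwise a cluster id."""
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    dense = (d < eps).sum(axis=1) > min_neighbors     # density criterion
    labels = np.full(len(points), -1)
    current = 0
    for i in np.where(dense)[0]:
        if labels[i] != -1:
            continue
        # Flood-fill over dense points within eps of each other.
        stack = [i]
        labels[i] = current
        while stack:
            j = stack.pop()
            for k in np.where((d[j] < eps) & dense)[0]:
                if labels[k] == -1:
                    labels[k] = current
                    stack.append(k)
        current += 1
    return labels

labels = cluster(points)
n_clusters = labels.max() + 1
print(f"found {n_clusters} precipitate-like clusters")
```

Production APT analysis uses optimized neighbour searches and careful parameter selection, but the principle — local atomic density distinguishes precipitate from matrix — is the same.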

Advancing atom probe towards true atomic-scale analytical tomography

Atom probe tomography (APT) is one of the MPIE’s key experimental techniques for understanding the role of chemical composition in very complex microstructures down to the level of individual atoms. In APT, a needle-shaped specimen (tip diameter ≈ 100 nm) is prepared from the material of interest and subjected to a high voltage. Additional voltage or laser pulses trigger the evaporation of single ions from the tip. more

A microscopic view of electrochemical interfaces: ab initio molecular dynamics at controlled electrode potential

Ever since the discovery of electricity, chemical reactions occurring at the interface between a solid electrode and an aqueous solution have aroused great scientific interest, not least because of the opportunity to influence and control the reactions by applying a voltage across the interface. Our current textbook knowledge is mostly based on mesoscopic concepts, i.e. effective models with empirical parameters, or focuses on individual reactions decoupled from their environment, therefore presenting a serious obstacle to predicting what happens at a particular interface under particular conditions. more

Software development

Recent developments in experimental techniques and computer simulations have provided the basis for many breakthroughs in understanding materials down to the atomic scale. While extremely powerful, these techniques produce ever larger and more complex data, forcing all departments to develop advanced data-management and analysis tools as well as to invest in software-engineering expertise. more

Integrated workflows for materials and data science

Integrated Computational Materials Engineering (ICME) has emerged as one of the hot topics in computational materials simulation in recent years. It aims at integrating simulation tools at different length scales and along the processing chain to predict and optimize final component properties. more
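The ICME idea of chaining scale-bridging tools can be sketched as a pipeline in which each stage consumes the previous stage's output. Every stage name, formula, and number below is an illustrative stand-in for a real simulation code, not an actual ICME model.

```python
def atomistic_stage(composition):
    """Stand-in for an atomistic simulation yielding a phase fraction."""
    return {"phase_fraction": 0.1 + 0.5 * composition["carbon_wt_pct"]}

def microstructure_stage(atomistic_out):
    """Stand-in for a microstructure model yielding a grain size."""
    return {"grain_size_um": 20.0 * (1.0 - atomistic_out["phase_fraction"])}

def property_stage(micro_out):
    """Stand-in for a Hall-Petch-type estimate of yield strength."""
    return {"yield_mpa": 200.0 + 500.0 / micro_out["grain_size_um"] ** 0.5}

# The processing chain: composition -> phases -> microstructure -> property.
pipeline = [atomistic_stage, microstructure_stage, property_stage]

state = {"carbon_wt_pct": 0.4}
for stage in pipeline:
    state = stage(state)
print(state)
```

The engineering value of ICME lies in exactly this composability: once the stages share well-defined inputs and outputs, the whole chain can be optimized end to end for the final component property.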

Big data and machine learning in electron microscopy

Data-rich experiments such as scanning transmission electron microscopy (STEM) provide large amounts of multi-dimensional raw data that encode, via correlations or hierarchical patterns, much of the underlying materials physics. With modern instrumentation, data generation tends to be faster than human analysis, and the full information content is rarely extracted. We therefore work on automating these processes as well as on applying data-centric methods to unravel hidden patterns. At the same time, we aim to exploit the insights from information extraction to direct the data acquisition to the most relevant aspects, and thereby avoid collecting huge amounts of redundant data. more
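How much of such multi-dimensional raw data is redundant can be seen with a principal-component analysis: a synthetic spectrum image built from only two component spectra plus noise, decomposed via SVD. The spectra and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic spectrum image: 32x32 probe positions, 100 energy channels,
# each pixel a mixture of two Gaussian component spectra plus noise.
channels = np.arange(100)
spec_a = np.exp(-0.5 * ((channels - 30) / 5.0) ** 2)
spec_b = np.exp(-0.5 * ((channels - 70) / 8.0) ** 2)

weights = rng.uniform(0, 1, size=(32 * 32, 1))        # mixing fractions
data = weights * spec_a + (1 - weights) * spec_b
data += rng.normal(0, 0.01, data.shape)               # detector noise

# Centre and decompose: singular values measure variance per component.
centred = data - data.mean(axis=0)
s = np.linalg.svd(centred, compute_uv=False)
explained = s ** 2 / (s ** 2).sum()
print(f"variance in first 2 components: {explained[:2].sum():.1%}")
```

Nearly all the variance lives in a handful of components — the mathematical statement that the gigabytes of raw data carry only a few physical degrees of freedom, which is what makes both compression and pattern discovery feasible.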

Learning Dynamics of STEM Data

The project’s goal is to synergize experimental phase-transformation dynamics, observed via scanning transmission electron microscopy, with phase-field models, enabling us to learn the continuum description of complex material systems directly from experiment. more
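The core idea — recovering a continuum model from observed dynamics — can be sketched in one dimension: generate frames of a diffusion process, then regress the time derivative against the discrete Laplacian to recover the diffusion coefficient. This toy problem stands in for the far richer phase-field learning from STEM movies; all parameters are illustrative.

```python
import numpy as np

# Forward simulation: explicit Euler for du/dt = D * d2u/dx2.
dx, dt, d_true = 1.0, 0.1, 0.8
x = np.arange(100)
u = np.exp(-0.5 * ((x - 50) / 4.0) ** 2)              # initial profile

frames = [u.copy()]
for _ in range(200):
    lap = np.roll(u, 1) - 2 * u + np.roll(u, -1)
    u = u + dt * d_true * lap / dx ** 2
    frames.append(u.copy())
frames = np.array(frames)

# "Learning": regress the observed time derivative on the curvature
# over all frames and positions (least squares in one unknown, D).
dudt = (frames[1:] - frames[:-1]) / dt
lap = (np.roll(frames[:-1], 1, axis=1) - 2 * frames[:-1]
       + np.roll(frames[:-1], -1, axis=1)) / dx ** 2
d_learned = (dudt * lap).sum() / (lap * lap).sum()
print(f"true D = {d_true}, learned D = {d_learned:.3f}")
```

With real microscopy data the candidate terms multiply (gradient energies, reaction terms, anisotropies) and noise dominates, but the principle of fitting governing equations to observed fields is the same.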

Artificial intelligence for complex materials

Max Planck researchers present a new deep neural network for predicting materials’ mechanical behaviour more

StahlDigital: Max-Planck-Institut für Eisenforschung coordinates project on digital strategies for steel materials

German Federal Ministry of Education and Research supports digitization of materials research with 26 million euros more

Automatic Feature Extraction from STEM

To prepare raw data from scanning transmission electron microscopy for analysis, pattern-detection algorithms are being developed that automatically identify higher-order features such as crystalline grains, lattice defects, etc. from atomically resolved measurements. more
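The first step of such feature extraction — locating the atomic columns themselves — can be sketched as a local-maximum search on a synthetic atomically resolved image. The image parameters below are illustrative, and real pipelines add sub-pixel refinement and lattice fitting on top.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic image: Gaussian "atomic columns" on a square lattice + noise.
size, spacing, sigma = 64, 8, 1.5
yy, xx = np.mgrid[0:size, 0:size]
image = np.zeros((size, size))
true_peaks = [(y, x) for y in range(4, size, spacing)
              for x in range(4, size, spacing)]
for (py, px) in true_peaks:
    image += np.exp(-((yy - py) ** 2 + (xx - px) ** 2) / (2 * sigma ** 2))
image += rng.normal(0, 0.02, image.shape)             # detector noise

def find_peaks(img, half=3, threshold=0.5):
    """Return (y, x) of pixels that are the maximum of their neighbourhood."""
    peaks = []
    for y in range(half, img.shape[0] - half):
        for x in range(half, img.shape[1] - half):
            patch = img[y - half:y + half + 1, x - half:x + half + 1]
            if img[y, x] >= threshold and img[y, x] == patch.max():
                peaks.append((y, x))
    return peaks

found = find_peaks(image)
print(f"placed {len(true_peaks)} atomic columns, detected {len(found)}")
```

Once column positions are known, grains and defects appear as deviations of the detected lattice from an ideal one — which is exactly the "higher-order feature" layer the project targets.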

Digitally Enhanced New Steel Product Development (DENS)

New product development in the steel industry nowadays requires faster development of new alloys of increasing complexity. Moreover, for these complex new steel grades it is more challenging to control properties along the process chain. This leads to more experimental testing, more plant trials, and higher rejection rates due to unmet requirements. Steel companies therefore wish to have a sophisticated offline through-process model that captures the microstructure and engineering-property evolution during manufacturing. more

DAMASK - the Düsseldorf Advanced Material Simulation Kit

Crystal plasticity (CP) modeling [1] is a powerful and well-established computational materials science tool for investigating mechanical structure–property relations in crystalline materials. It has been successfully applied to study diverse micromechanical phenomena ranging from strain hardening in single crystals to texture evolution in polycrystalline aggregates. more
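The entry point of any crystal-plasticity model is the resolved shear stress on each slip system, governed by the Schmid factor m = cos φ · cos λ. A minimal sketch for one fcc slip system under uniaxial loading (not DAMASK's actual implementation, which handles full stress tensors and many systems):

```python
import numpy as np

loading = np.array([0.0, 0.0, 1.0])                   # tensile axis

def schmid_factor(normal, direction, axis=loading):
    """|cos(phi) * cos(lambda)| for a slip plane normal and slip direction."""
    n = normal / np.linalg.norm(normal)
    d = direction / np.linalg.norm(direction)
    a = axis / np.linalg.norm(axis)
    return abs(np.dot(a, n) * np.dot(a, d))

# One representative fcc slip system: {111} plane, <110> direction in it.
m = schmid_factor(np.array([1.0, 1.0, 1.0]), np.array([1.0, 0.0, -1.0]))
print(f"Schmid factor = {m:.3f}")                     # = 1/sqrt(6), ~0.408
```

Full CP codes evaluate this projection for all 12 fcc slip systems under an arbitrary stress state and couple the resulting slip rates to hardening laws — the machinery behind the strain-hardening and texture-evolution studies mentioned above.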

Machine-learning based data extraction from APT

Advanced microscopy and spectroscopy offer unique opportunities to study the structure, composition, and bonding state of individual atoms within complex engineering materials. With the help of aberration correction, such information can be collected at a spatial resolution as fine as 0.1 nm. more

pyiron - an Integrated Development Environment (IDE) for Computational Materials Science

Complex simulation protocols combine distinctly different computer codes and have to run on heterogeneous computer architectures. To enable these complex simulation protocols, the CM department has developed pyiron. more
