Many materials science studies use scanning transmission electron microscopy (STEM) to characterize atomic-scale structure. Conventional STEM imaging experiments produce only a few intensity values at each probe position. However, modern high-speed detectors allow us to measure a full 2D diffraction pattern over a grid of 2D probe positions, forming a four-dimensional (4D) STEM dataset. These datasets, consisting of millions of images, can record information about the local phase, orientation, deformation, and other parameters, for both crystalline and amorphous materials. Due to the large data sizes, we require highly automated and robust software codes to extract the target properties. In this talk, I will show examples of how we use conventional and deep learning analysis codes to perform data-intensive studies of materials over functional length scales. I will also show examples of how we modify the input wavefunction of the STEM probe in order to improve signal-to-noise ratio and measurement robustness.
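To make the dataset structure concrete, the sketch below shows (in plain numpy, not the speaker's py4DSTEM code) how a 4D-STEM datacube of shape (scan x, scan y, detector x, detector y) can be reduced to a single virtual bright-field image. All array shapes, the synthetic data, and the detector radius are illustrative assumptions.

    import numpy as np

    # Synthetic 4D-STEM datacube: (R_x, R_y) probe positions,
    # each holding a (Q_x, Q_y) diffraction pattern.
    rx, ry, qx, qy = 64, 64, 128, 128
    datacube = np.random.poisson(lam=1.0, size=(rx, ry, qx, qy)).astype(np.float32)

    # Virtual bright-field image: integrate each diffraction pattern
    # over a circular mask centered on the unscattered beam.
    center = (qx // 2, qy // 2)
    radius = 10  # assumed virtual detector radius in pixels
    qx_coords, qy_coords = np.meshgrid(np.arange(qx), np.arange(qy), indexing="ij")
    mask = (qx_coords - center[0]) ** 2 + (qy_coords - center[1]) ** 2 <= radius ** 2

    # One intensity value per probe position: the "conventional" STEM image
    # recovered from the full 4D dataset.
    virtual_bf = (datacube * mask).sum(axis=(2, 3))
    print(virtual_bf.shape)  # (64, 64)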
BIO:
Colin Ophus is a staff scientist at NCEM at Lawrence Berkeley Lab, running both a user program and a research group focused on methods, algorithms, and codes for simulation, analysis, and instrument design for TEM. He has published over 150 peer-reviewed articles, given over 50 invited talks, received a DOE Early Career award (2018), and was awarded the Burton medal from the Microscopy Society of America (2022). He is the project leader for the py4DSTEM analysis and Prismatic STEM simulation open-source codes.