How to Get a Good Answer in a Timely Manner

Shawn Wallace, Applications Engineer, EDAX

One of the joys of my job is troubleshooting issues and ensuring you acquire the best results to advance your research. Sometimes, it requires additional education to help users understand a concept. Other times, it requires an exchange of numerous emails. At the end of the day, our goal is not just to help you, but to ensure you get the right information in a timely manner.

For any sort of EDS-related question, we almost always want to look at a spectrum file. Why? There is so much information hidden in the spectrum that we can quickly point out possible issues. With a single spectrum, we can quickly see if something was charging, tilted, or shadowed (Figure 1). We can even see unusual effects like beam deceleration caused by a certain imaging mode (Figure 2). Most of these issues lead to major quant-related problems, so troubleshooting any quant problem should always start with the spectrum.

Figure 1. The teal spectrum shows a strange background versus what a normal spectrum (red) should look like for a material.

This background information tells us that the sample was most likely shadowed and that rotating the sample to face towards the detector may give better results.

Figure 2. Many microscopes can decelerate the beam to help with imaging. This deceleration is great for imaging but can cause EDS quant issues. Therefore, we recommend reviewing the spectrum up front to reduce the number of emails to troubleshoot this issue.

To save the spectrum, right-click in the spectrum window, then click Save (Figure 3). From there, save the file with a descriptive name and send it off to the applications group. These spectrum files also include metadata, such as amp time, working distance, and other acquisition parameters, that give us many clues to get to the bottom of possible issues.
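
If you would like to sanity-check that metadata yourself before emailing, one option is a vendor-neutral EMSA/MSA text export (an assumption here; the file name and keywords below are illustrative, not a specific APEX™ format). The header is easy to inspect with a few lines of Python:

def read_msa_header(path):
    """Collect the metadata keywords from an EMSA/MSA spectrum export.

    Keywords such as #LIVETIME, #XPERCHAN, and #BEAMKV carry the same
    clues (live time, energy calibration, kV) we look for first when
    troubleshooting a spectrum.
    """
    meta = {}
    with open(path) as f:
        for line in f:
            if not line.startswith("#"):
                break  # header keywords end where the counts data begin
            key, _, value = line[1:].partition(":")
            meta[key.strip().upper()] = value.strip()
    return meta

# e.g. read_msa_header("shadowed_spectrum.msa").get("LIVETIME")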

Figure 3. Saving a spectrum in APEX™ is intuitive. Right-click in the area and a pop-up menu will allow you to save the spectrum wherever you want quickly.

For information on EDS backgrounds and the information they hold, I suggest watching Dr. Jens Rafaelsen’s Background Modeling and Non-Ideal Sample Analysis webinar.

The actual image file can also help us confirm most of the above.

Troubleshooting EBSD can be tricky, since the issue could stem from sample preparation, indexing, or something else entirely. To begin, it is important to rule out any variances associated with sample preparation. Useful information to share includes a description of the sample, as well as the step-by-step instructions used to prepare it. This includes things like the length of time, pressure, cloth material, polishing compound, and even the direction of travel. The more details, the better!

Now, how do I know it is a sample prep problem? If the pattern quality is low at long exposure times or the sample surface looks very rough (Figure 4), the problem is probably related to sample preparation. That said, there can be non-prep related issues too.

Figure 4. This pattern is probably not indexable on its own. Better preparation of the sample surface is necessary to index and map this sample correctly.

For general sample prep guidelines, I would highly suggest Matt Nowell’s Learn How I Prepare Samples for EBSD Analysis webinar.

Indexing problems can be challenging to troubleshoot without a full data set. How do I know my main issues could be related to indexing? If indexing is the source, a map often appears very speckled, or just black due to no indexing results. For this kind of issue, full data sets are the way to go. By full, I mean patterns and OSC files. These files can be exported out of TEAM™/APEX™. They are often quite large, but there are ways to transfer the data quickly.

For the basics of indexing knowledge, I suggest checking out my latest webinar, Understanding and Troubleshooting the EDAX Indexing Routine and the Hough Parameters. During this webinar, we highlight attributes that indicate there is an issue with the data set, then dive into the best practices for troubleshooting them.

As for camera setup, this is a dance between the microscope settings, the operator’s requirements, and the camera settings. In general, more electrons (higher beam current) allow the experiment to go faster and cover more area. With older CCD-based cameras, understanding this interaction was key to good results. With the newer Velocity™ cameras based on CMOS technology, the dance is much simpler. If you are having difficulty optimizing an older camera, the Understanding and Optimizing EBSD Camera Settings webinar can help.

So how do you get your questions answered fast? Bury us with information. More information lets us dig into the data and find the root cause from the first email, avoiding a lengthy back-and-forth exchange. If possible, educate yourself using the resources we have made available, be it webinars or training courses. And as always, feel free to reach out to my colleagues and me at edax.applications@ametek.com!

What a Difference a Year Makes

Jonathan McMenamin, Marketing Communications Coordinator, EDAX

EDAX is considered one of the leaders in the world of microscopy and microanalysis. After concentrating on advancements to our Energy Dispersive Spectroscopy (EDS) systems for the Scanning Electron Microscope (SEM) over the past few years, EDAX turned its attention to advances in Electron Backscatter Diffraction (EBSD) and EDS for the Transmission Electron Microscope (TEM) in 2019.

After the introduction of the Velocity™ Plus EBSD camera in June 2018, which produces indexing speeds greater than 3,000 indexed points per second, EDAX raised the bar further in 2019. In March, the company announced the arrival of the fastest EBSD camera in the world, the Velocity™ Super, which runs 50% faster at 4,500 indexed points per second. This was truly a great accomplishment!

EBSD orientation map from additively manufactured Inconel 718 collected at 4,500 indexed points per second at 25 nA beam current.

Less than three months later, EDAX added a new detector to its TEM product portfolio. The Elite T Ultra is a 160 mm² detector that offers a unique geometry and powerful quantification routines for comprehensive analysis solutions for all TEM applications. The windowless detector’s geometric design gives it the best possible solid angle to increase the X-ray count rates for optimal results.

EDAX Elite T Ultra EDS System for the TEM.

Just before the annual Microscopy & Microanalysis conference, EDAX launched the OIM Matrix™ software module for OIM Analysis™. This new tool gives users the ability to perform dynamic diffraction-based EBSD pattern simulations and dictionary indexing. Users can now simulate EBSD patterns based on the physics of dynamical diffraction of electrons. These simulated patterns can then be compared to experimentally collected EBSD patterns. Dictionary indexing helps improve indexing success rates over standard Hough-based indexing approaches. You can watch Dr. Stuart Wright’s presentation from M&M (https://youtu.be/Jri181evpiA) for more information.

Dictionary indexing flow chart and conventional indexing results compared with dictionary indexing results for a nickel sample with patterns collected in a high-gain/noisy condition.

EDAX has several exciting product announcements on the way in early 2020. We have teased two of these releases: APEX™ Software for EBSD and the Clarity™ Direct Electron Detector. APEX™ EBSD will give users the ability to characterize both the compositional and structural characteristics of their samples on the APEX™ platform. It allows them to collect and index EBSD patterns and maps, as well as perform simultaneous EDS-EBSD collection. You can learn more about APEX™ EBSD in the September issue of the Insight newsletter and in our “APEX™ EBSD – Making EBSD Data Collection How You Want It” webinar.

EBSD of a Gibeon Meteorite sample covering a 7.5 mm x 6.5 mm area using ComboScan for large area analysis.

The Clarity™ is the world’s first commercial direct electron detector (DED) for EBSD. It provides patterns of the highest quality and sensitivity, with no detector read noise and no distortion, for optimal performance. The Clarity™ does not require a phosphor screen or light transfer system. The DED camera is so sensitive that individual electrons can be detected, giving users unprecedented performance for EBSD pattern collection. It is ideal for the analysis of beam sensitive samples and potential strain applications. We recently presented a webinar, “Direct Electron Detection with Clarity™ – Viewing EBSD Patterns in a New Light,” previewing the Clarity™. You can also get a better understanding of the system in the December issue of the Insight newsletter.

EBSD pattern from Silicon using the Clarity™ detector.

All this happened in one year! 2020 looks to be another great year for EDAX with further improvements and product releases to offer the best possible tools for you to solve your materials characterization problems.

Hats Off/On to Dictionary Indexing

Dr. Stuart Wright, Senior Scientist EBSD, EDAX

Recently I gave a webinar on dynamic pattern simulation. The use of a dynamic diffraction model [1, 2] allows EBSD patterns to be simulated quite well. One topic I introduced in that presentation was dictionary indexing [3]. You may have seen presentations on this indexing approach at some of the microscopy and/or materials science conferences. In this approach, patterns are simulated for a set of orientations covering all of orientation space. An experimental pattern is then tested against every simulated pattern to find the one that provides the best match. This approach does particularly well for noisy patterns.
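
In code form, the core of the matching step is small. Here is a minimal numpy sketch (my own illustration, not the OIM Matrix™ implementation), assuming you already have a stack of simulated patterns and their orientations:

import numpy as np

def dictionary_index(experimental, dictionary, orientations):
    """Find the dictionary pattern that best matches an experimental one.

    experimental : (h, w) array, one EBSD pattern
    dictionary   : (n, h, w) array of simulated patterns
    orientations : (n, 3) array of Euler angles, one per simulated pattern
    """
    exp = experimental.ravel().astype(float)
    exp /= np.linalg.norm(exp)
    sims = dictionary.reshape(len(dictionary), -1).astype(float)
    sims /= np.linalg.norm(sims, axis=1, keepdims=True)
    scores = sims @ exp            # normalized dot product against every entry
    best = int(np.argmax(scores))  # brute force: test against all of them
    return orientations[best], scores[best]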

I’ve been working on implementing some of these ideas into OIM Analysis™ to make dictionary indexing more streamlined for datasets collected using EDAX data collection software – i.e. OIM DC or TEAM™. It has been a learning experience and there is still more to learn.

As I dug into dictionary indexing, I recalled our first efforts to automate EBSD indexing. Our first attempt was a template matching approach [4]. The first step in this approach was to use a “Mexican Hat” filter. This was done to emphasize the zone axes in the patterns. This processed pattern was then compared against a dictionary of “simulated” patterns. The simulated patterns were simple – a white pixel (or set of pixels) for the major zone axes in the pattern and everything else was colored black. In this procedure the orientation sampling for the dictionary was done in Euler space.
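
For the curious, the “Mexican Hat” (Ricker) kernel is just the negated Laplacian of a Gaussian, so a short scipy sketch reproduces the flavor of that old zone-axis-emphasizing step (an illustration under that assumption, not the original code):

import numpy as np
from scipy import ndimage

def mexican_hat(pattern, sigma=3.0):
    """Emphasize compact bright features, such as zone axes, in a pattern."""
    # The Ricker/"Mexican hat" kernel equals the negated Laplacian of a
    # Gaussian, which scipy provides directly.
    return -ndimage.gaussian_laplace(pattern.astype(float), sigma=sigma)
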
It seemed natural to go this route at the time, because we were using David Dingley’s manual on-line indexing software which focused on the zone axes. In David’s software, an operator clicked on a zone axis and identified the <uvw> associated with the zone axis. Two zone axes needed to be identified and then the user had to choose between a set of possible solutions. (Note – it was a long time ago and I think I remember the process correctly. The EBSD system was installed on an SEM located in the botany department at BYU. Our time slot for using the instrument was between 2:00-4:00am so my memory is understandably fuzzy!)

One interesting thing of note in those early dictionary indexing experiments was that the maximum step size in the sampling grid of Euler space that would result in successful indexing was found to be 2.5°, quite similar to the maximum target misorientation for modern dictionary indexing. Of course, this crude sampling approach may have led to the lack of robustness in this early attempt at dictionary indexing. The paper proposed that the technique could be improved by weighting the zone axes by the sum of the structure factors of the bands intersecting at the zone axes.
However, we never followed up on this idea, as we abandoned the template matching approach and moved to the Burns algorithm coupled with the triplet voting scheme [5], which produced more reliable results. Using this approach, we were able to get our first set of fully automated scans. We presented the results at an MS&T symposium (Microscale Texture of Materials Symposium, Cincinnati, Ohio, October 1991), where Niels Krieger Lassen also presented his work on band detection using the Hough transform [6]. After the conference, we hurried back to the lab to try out Niels’ approach for the band detection part of the indexing process [7].
Modern dictionary indexing applies an adaptive histogram filter to the experimental patterns (at left in the figure below) and the dictionary patterns (at right) prior to performing the normalized inner dot-product used to compare patterns. The filtered patterns are nearly binary, and seeing these triggered my memory of our early dictionary work, as they reminded me of the nearly binary “Sombrero” filtered patterns. Olé!
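
As a rough illustration of that preprocessing (using scikit-image’s general-purpose adaptive histogram equalization as a stand-in, not EDAX’s exact filter), each pattern can be flattened to a zero-mean unit vector so the comparison reduces to a dot product:

import numpy as np
from skimage import exposure

def preprocess(pattern):
    """Adaptive histogram equalization, then a zero-mean, unit-norm vector."""
    img = exposure.rescale_intensity(pattern.astype(float), out_range=(0.0, 1.0))
    eq = exposure.equalize_adapthist(img)   # nearly binary result
    v = eq.ravel() - eq.mean()
    return v / np.linalg.norm(v)

# similarity = preprocess(experimental) @ preprocess(simulated)
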
We may not have come back full circle but progress clearly goes in steps and some bear an uncanny resemblance to previous ones. I doff my hat to the great work that has gone into the development of dynamic pattern simulation and its applications.

[1] A. Winkelmann, C. Trager-Cowan, F. Sweeney, A. P. Day, P. Parbrook (2007) “Many-Beam Dynamical Simulation of Electron Backscatter Diffraction Patterns” Ultramicroscopy 107: 414-421.
[2] P. G. Callahan, M. De Graef (2013) “Dynamical Electron Backscatter Diffraction Patterns. Part I: Pattern Simulations” Microscopy and Microanalysis 19: 1255-1265.
[3] Y.H. Chen, S.U. Park, D. Wei, G. Newstadt, M.A. Jackson, J.P. Simmons, M. De Graef, A.O. Hero (2015) "A dictionary approach to electron backscatter diffraction indexing" Microscopy and Microanalysis 21: 739-752.
[4] S.I. Wright, B.L. Adams, J.-Z. Zhao (1991) "Automated determination of lattice orientation from electron backscattered Kikuchi diffraction patterns" Textures and Microstructures 13: 2-3.
[5] S.I. Wright, B. L. Adams (1992) “Automatic-analysis of electron backscatter diffraction patterns” Metallurgical Transactions A 23: 759-767.
[6] N.C. Krieger Lassen, D. Juul Jensen, K. Conradsen (1992) “Image processing procedures for analysis of electron back scattering patterns” Scanning Microscopy 6: 115-121.
[7] K. Kunze, S. I. Wright, B. L. Adams, D. J. Dingley (1993) “Advances in Automatic EBSP Single Orientation Measurements.” Textures and Microstructures 20: 41-54.

From Collecting EBSD at 20 Patterns per second (pps) to Collecting at 4,500 pps

John Haritos, Regional Sales Manager Southwest USA, EDAX

I recently had the opportunity to host a demo for one of my customers at our Draper, Utah office. This was a long-time EDAX and EBSD user who was interested in seeing our new Velocity™ CMOS camera and trying it on some of their samples.

When I started in this industry back in the late 90s, the cameras were running at a “blazing” 20 points per second and we all thought that this was fast. At that time, collection speed wasn’t the primary issue. What EBSD brought to the table was automated orientation analysis of diffraction patterns. Now users could measure orientations and create beautiful orientation maps with the push of a button, which was a lot easier than manually interpreting these patterns.

Fast forward to 2019 and with the CMOS technology being adapted from other industries to EBSD we are now collecting at 4,500 pps. What took hours and even days to collect at 20 pps now takes a matter of minutes or seconds. Below is a Nickel Superalloy sample collected at 4,500 pps on our Velocity™ Super EBSD camera. This scan shows the grain and twinning structure and was collected in just a few minutes.

Figure 1: Nickel Superalloy

Of course, now that we have improved from 20 pps to 4,500 pps, it’s significantly easier to get a lot more data. So the question becomes: how do we analyze all this data? This is where OIM Analysis v8™ comes to the rescue for the analysis and post-processing of these large data sets. OIM Analysis v8™ was designed to take advantage of 64-bit computing and multi-threading, so the software can handle large datasets. Below is a grain size map and a grain size distribution chart from an aluminum friction stir weld sample, with over 7 million points, collected with the Velocity™ and processed using OIM Analysis v8™. This example is interesting because the grains on the left side of the image are much larger than the grains on the right side. With the fast collection speeds, a small (250 nm) step size could still be used over this larger collection area. This allows for accurate characterization of grain size across the weld interface, and the bimodal grain size distribution is clearly resolved. With a slower camera, it may be impractical to analyze this area in a single scan.
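
As an aside, once per-grain areas have been exported as text (the file name and units below are hypothetical, not a specific EDAX format), resolving a bimodal distribution like this one takes only a few lines of Python:

import numpy as np
import matplotlib.pyplot as plt

# One area per grain, assumed exported from a grain size analysis.
areas_um2 = np.loadtxt("grain_areas_um2.txt")
diameters = 2.0 * np.sqrt(areas_um2 / np.pi)  # equivalent circle diameter

plt.hist(diameters, bins=np.logspace(-1, 2, 60))
plt.xscale("log")
plt.xlabel("Equivalent grain diameter (µm)")
plt.ylabel("Number of grains")  # two peaks reveal a bimodal distribution
plt.show()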

Figure 2: Aluminum Friction Stir Weld

In the past, most customers would set up an overnight EBSD run. You could see the thoughts running through their minds: will my sample drift, will my filament pop, what will the data look like when I come back to work in the morning? Inevitably, the sample would drift or the filament would pop, and this would mean the dreaded “ugh” in the morning. With the Velocity™ and its fast collection speeds, you no longer need to worry about this. You can collect maps in a few minutes and avoid this issue entirely. It’s a hard thing to convey in a brochure, but it’s easy to appreciate when seeing it firsthand.

For me, watching my customer see the analysis of many samples in a single day was impressive. These were not particularly easy samples. They were solar cell and battery materials, with a variety of phases and crystal structures. But under conditions similar to their traditional EBSD work, we could collect better quality data much faster. The future is now. Everyone is excited about what CMOS technology can offer in the way of productivity and throughput for their EBSD work.

A Lot of Excitement in the Air!

Sia Afshari, Global Marketing Manager, EDAX

After all these years I still get excited about new technologies and their resulting products, especially when I have had the good fortune to play a part in their development. As I look forward to 2019, there are new and exciting products on the horizon from EDAX, where the engineering teams have been hard at work innovating and enhancing capabilities across all product lines. We are on the verge of having one of our most productive years for product introduction with new technologies expanding our portfolio in electron microscopy and micro-XRF applications.

Our APEX™ software platform will have a new release early this year with substantial feature enhancements for EDS, to be followed by EBSD capabilities later in 2019. APEX™ will also expand its wings to µXRF, providing a new GUI and advanced quant functions for bulk and multi-layer analysis.

Our OIM Analysis™ EBSD software will also see a major update with the addition of a new Dictionary Indexing option.

A new addition to our TEM line will be a 160 mm² detector in a 17.5 mm diameter module that provides an exceptional solid angle for the most demanding applications in this field.

Elite T EDS System

Velocity™, EDAX’s low noise CMOS EBSD camera, provides astonishing EBSD performance at greater than 3000 fps with high indexing on a range of materials including deformed samples.

Velocity™ EBSD Camera

Last but not least, being an old X-ray guy, I can’t help but be impressed by the amazing EBSD patterns we are collecting from a ground-breaking direct electron detection (DED) camera with such “Clarity™” and detail, promising a new frontier for EBSD applications!
It will be an exciting year at EDAX and with that, I would like to wish you all a great, prosperous year!

Common Mistakes when Presenting EBSD Data

Shawn Wallace, Applications Engineer, EDAX

We all give presentations. We write and review papers. Either way, we have to be critical of our data and how it is presented to others, both numerically and graphically.

With that said, I thought it would be nice to start this year with a couple of quick tips or notes that can help with mistakes I see frequently.

The most common problem I see is poorly documented cleanup routines and partitioning. Between the initial collection and the final presentation, a lot of things are done to the data. It needs to be clear what was done so that one can interpret the data correctly (and other people can reproduce it). Cleanup routines can change the data in ways that can be subtle or not so subtle, but more importantly, they could wrongly change your conclusions. The easiest routine to see this on is the grain dilation routine, which can turn noisy data into a textured dataset pretty fast (fig. 1).

Figure 1. The initial data was just pure noise. By running it iteratively through the grain dilation routine, you can make both grains and textures.
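
To see the danger for yourself, here is a toy analogue (my own sketch, not EDAX’s actual dilation routine): start from pure noise and repeatedly assign each pixel the most common label in its neighborhood, and grain-like regions appear out of nothing:

import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
labels = rng.integers(0, 50, size=(200, 200)).astype(float)  # pure noise

def majority(window):
    # Most common "orientation" label among a pixel's 3x3 neighbors.
    vals, counts = np.unique(window.astype(int), return_counts=True)
    return vals[np.argmax(counts)]

for _ in range(5):  # each iterative pass grows ever-larger fake "grains"
    labels = ndimage.generic_filter(labels, majority, size=3)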

Luckily for us, OIM Analysis™ keeps track of most of what is done via the cleanup routines and partitioning in the summary window on either the dataset level or the partition level (fig. 2).

Figure 2. A partial screenshot of the dataset level summary window shows cleanup routines completed on the dataset, as well as the parameters used. This makes your processing easily repeatable.

The other common issue is not including all of the information needed to interpret a map. I really need to look at three things to get the full picture for an EBSD dataset: the IPF map (fig. 3), the phase map (fig. 4), and the IPF legend (fig. 5) for those phases. This is very important because, while the colors used are the same, the orientations they represent differ between the different crystal symmetries.

Figure 3. General IPF Map of a geological sample. Many phases are present, but the dataset is not complete without a legend and phase map. The colors mean nothing without knowing both the phase and the IPF legend to use for that phase.

Figure 3 shows a multiphase sample with many crystal symmetries. All use red-green-blue as the general color scheme. By just looking at the general IPF map (fig. 3), I can easily get the wrong impression. Without the phase map, I do not know which legend I should be using to understand the orientation of each phase. Without the crystal symmetry specific legend, I do not know how the colors change over the orientation space. I really need all these legends/maps to truly understand what I am looking at. One missing brick and the tower crumbles.

Figure 4. In this multiphase sample, multiple symmetries are present. I need to know which phase a pixel is to know which legend to use.

Figure 5. With all the information now presented, I can go back and interpret Figure 3, using Figures 4 and 5 to guide me.

Being aware of these two simple ideas alone can help you to better present your data to any audience. The fewer the questions about how you got the data, the more time you will have to answer more meaningful questions about what the data actually means!

Old Eyes?

Dr. Stuart Wright, Senior Scientist EBSD, EDAX

I was recently asked to write a “Tips & Tricks” article for the EDAX Insight newsletter, as I had recently given an EDAX webinar (www.edax.com/news-events/webinars) on texture analysis. I decided to follow up on one item I had emphasized in the webinar: the need to sample enough orientations for statistical reliability when characterizing a texture. The important thing to remember is that it is the number of grain orientations, as opposed to the number of orientations measured, that matters. That led to the idea of sub-sampling a dataset to calculate textures when the datasets are very large. Unfortunately, there was not enough room in the article to go into the kind of detail I would have liked, so I’ve decided to use our blog forum to cover some details about sub-sampling that I found interesting.

Consider the case where you want to characterize not only the texture of a material but also the grain size or some other microstructural characteristic that requires a measurement grid that is fine relative to the grain size. According to some previous work, to accurately capture the texture you will want to measure approximately 10,000 grains [1], and about 500 pixels per average grain are needed to capture the grain size well [2]. This would result in a scan with approximately 5 million datapoints. Instead of calculating the texture using all 5 million data points, you can use a sub-set of the points to speed up the calculation. In our latest release of OIM Analysis™, this is not as big a concern as it once was, as the texture calculations have been multithreaded, so they are fast even for very large datasets. Nonetheless, since it is very likely that you will want to calculate the grain size, you can use the area-weighted average grain orientation for each grain, as opposed to using all 5 million individual orientation measurements, for a quick texture calculation. Alternatively, a sub-set of the points obtained through random or uniform sampling of the scan area could be used.
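
A quick sketch of both sub-sampling strategies (my own illustration; OIM Analysis™ handles this internally) shows how little is involved:

import numpy as np

def subsample(points, fraction, mode="uniform", seed=0):
    """Take a subset of scan points for a quick texture calculation.

    points : (n, ...) array of orientation measurements in scan order
    mode   : "uniform" takes evenly spaced points; "random" draws
             without replacement.
    """
    n = len(points)
    k = max(1, round(n * fraction))
    if mode == "uniform":
        idx = np.linspace(0, n - 1, k).astype(int)
    else:
        idx = np.random.default_rng(seed).choice(n, size=k, replace=False)
    return points[idx]

# e.g. texture from 10% of a 5-million-point scan:
# sub = subsample(euler_angles, 0.10, mode="random")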

Of course, you may wonder how well the sub-sampling works. I have done a little study on a threaded rod from a local hardware store to test these ideas. The material exhibits a (110) fiber texture, as can be seen in the normal direction IPF map and the accompanying (110) pole figure. For these measurements, I have simply done a normalized squared difference point-by-point through the Orientation Distribution Function (ODF), which we call the Texture Difference Index (TDI) in the software.


This is a good method because it allows us to compare textures calculated using different methods (e.g. series expansion vs. binning). In this study, I have used the general spherical harmonics series expansion with a rank of L = 22 and a Gaussian half-width of 0.1°. The dataset has 105,287 points, with 92.5% of those having a CI > 0.2 after CI standardization. I have elected to use only points with CI > 0.2. The results are shown in the following figure.
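
For readers who want to experiment, a minimal sketch of such a point-by-point comparison might look like the following; the normalization is one plausible choice for illustration, not necessarily the exact form used in OIM Analysis™:

import numpy as np

def texture_difference_index(odf_a, odf_b):
    """Normalized squared difference between two ODFs on the same grid."""
    diff = np.sum((odf_a - odf_b) ** 2)
    norm = np.sqrt(np.sum(odf_a ** 2) * np.sum(odf_b ** 2))  # assumed form
    return diff / norm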

As the step size is relatively coarse with respect to the grain size, I have experimented with requiring at least two pixels before considering a set of similarly oriented points to be a grain, versus allowing a single pixel to be a grain. This resulted in 9,981 and 25,437 grains, respectively. In both cases, the differences in the textures between these two grain-based sub-sampling approaches and the full dataset are small, with the one-pixel grain-based sub-sampling being slightly closer, as would be expected. However, the figure above raised two questions for me: (1) what do the TDI numbers mean, and (2) why do the random and the uniform sampling grids differ so much, particularly as the number of points in the sub-sampling gets large (i.e., at 25% of the dataset)?

TDI
The pole figure for the 1,000 random points in the previous figure certainly captures some of the characteristics of the pole figure for the full dataset. Is this reflected in the TDI measurements? My guess is that if I were to calculate the textures at a lesser rank, something like L = 8, then the TDIs would go down. This is already part of the TDI calculation, so it is an easy thing to examine. For comparison, I have chosen to look at four different datasets: (a) all of the data in the dataset above (named “fine”), (b) a dataset from the same material with a coarser step size (“coarse”) containing approximately 150,000 data points, (c) a sub-sampling of the original dataset using 1,000 randomly sampled datapoints (“fine-1000”), and (d) the “coarse” dataset rotated 90 degrees about the vertical axis in the pole figures (“coarse-rotated”). It is interesting to note that the textures that are similar “by eye” show a general increase in the TDI as the series expansion rank increases. However, for very dissimilar textures (i.e., “coarse” vs. “coarse-rotated”), the jump to a large TDI is immediate.

Random vs Uniform Sampling
The differences between the random and uniform sampling were a bit curious, so I decided to check the random points to see how they were positioned in the x-y space of the scan. The figure below compares the uniform and random sampling for 4,000 datapoints – any more than this is hard to show. Clearly, the random sampling is reasonable but does show a bit of clustering and some gaps within the scan area. These small differences show up as larger TDI differences than I would expect. Clearly, at L = 22 we are picking up quite subtle differences – at least subtle with respect to my personal “by-eye” judgement. It seems to me that my “by-eye” judgement is biased toward lower rank series expansions.


Of course, another conclusion would be that my eyesight is getting rank with age ☹ I guess that explains my increasingly frequent need to reach for my reading glasses.

References
[1] S.I. Wright, M.M. Nowell, J.F. Bingert (2007) "A comparison of textures measured using X-ray and electron backscatter diffraction" Metallurgical and Materials Transactions A 38: 1845-1855.
[2] S.I. Wright (2010) "A Parametric Study of Electron Backscatter Diffraction based Grain Size Measurements" Practical Metallography 47: 16-33.