
Looking at the World of Microanalysis in Color

Tara Nylese, Global Applications Manager, EDAX

Several years ago, I was talking to a customer who asked whether we could change the color scheme of the EDAX TEAM™ software. He said that it was hard for him to tell the difference between the spectrum background and the cursor. I replied, “Well, the cursor is a lime green and the background is more like a gray-gre…..Oh, wait, you’re colorblind, aren’t you?” Sure enough, he was, and while I can’t “see” his perspective, I can listen to and respect it. Thus, the motivation for this blog is to let our customers know that we in Applications listen to them and take their needs seriously.

In this specific case, I am happy to report that we just recently received feedback on the new EDAX APEX™ software, and one comment was that the user really liked the “contrast” of the red spectrum on the white background – see the image below.

More generally, it is one main goal of the EDAX Applications team to make sure that we capture “real world” customer feedback and incorporate it as much as possible into future product enhancements, bug fixes, and new generations of products. Each of our Worldwide Apps team members can talk to upwards of ten customers a week. These conversations usually take place during support calls, training sessions, and demos. At each opportunity, we hear tremendously valuable real-world customer perspectives, and very often we learn what we can’t “see” ourselves. Often, if I’m asked to share my thoughts, my words are just a colorful patchwork of years of customer ideas all melded into a microscopy amalgam.

Customer perspective is so important, in fact, that it is a cornerstone of the EDAX App Lab Mission Statement. A few years ago, I compiled about three pages of descriptions of what people thought of when they thought of Apps, and then condensed them down into the following statement that hangs on our HQ App Lab walls.

The EDAX US App Lab uses technical expertise and creativity plus a strong focus on understanding the needs of our internal and external customers to drive excellence in innovative analytical solutions. The applications group supports company-wide efforts to provide real-life value and benefits to our customers which differentiate our products in materials analysis.

Now, to get back to the colors that are available for maps in our software: one of the lesser-known functions is the ability to select and edit your color palette:

Using this option, you can choose from a 40-color palette, seen here. Remember to click on the element in the periodic chart first, then select your color.

Since I brought up the topic of colorblindness, I’ll also use a colorblind app that simulates how a Red/Green colorblind person sees the world (or our color palette).

Note the green color of O and P, and see how closely it compares to the yellow color of the lanthanide/actinide series!
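If you want to check your own element-color choices for red/green distinguishability, the idea behind such simulators can be sketched in a few lines of Python. This is only a rough illustration, not the app mentioned above: the 3×3 matrix below is an approximate deuteranopia transform (treat the exact numbers as placeholders and take values from the vision-science literature, e.g. Machado et al., 2009, if you need fidelity), and the two colors are stand-ins for whatever palette entries you want to test.

```python
import numpy as np

# Approximate deuteranopia (red/green deficiency) simulation matrix.
# Illustrative values only - substitute a matrix from a vision-science
# reference for serious work.
DEUTERANOPIA = np.array([
    [0.625, 0.375, 0.000],
    [0.700, 0.300, 0.000],
    [0.000, 0.300, 0.700],
])

def simulate_deuteranopia(rgb):
    """Map an (R, G, B) color in 0-255 to its simulated appearance."""
    return tuple(np.clip(DEUTERANOPIA @ np.asarray(rgb, float), 0, 255).round().astype(int))

def distance(c1, c2):
    """Simple Euclidean distance in RGB space between two colors."""
    return float(np.linalg.norm(np.asarray(c1, float) - np.asarray(c2, float)))

# Placeholder palette entries: a green (like O/P) and a yellow (like the lanthanides).
green, yellow = (0, 170, 0), (230, 220, 40)
print("normal vision distance:", round(distance(green, yellow)))
print("simulated deuteranopia:", round(distance(simulate_deuteranopia(green),
                                                simulate_deuteranopia(yellow))))
```

If the second number is much smaller than the first, those two palette entries will be hard for a red/green colorblind colleague to tell apart.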

Finally, to summarize the Applications message: to our current customers – thank you for sharing your thoughts; to all our applications team colleagues – thank you for gathering so much wide-ranging information and promoting its importance internally; and to all our future customers – when you choose EDAX, you’re choosing to join a dynamic microanalysis company that strives to develop the most meaningful features and functions to meet your microanalysis needs.

A Bit of Background Information

Dr. Jens Rafaelsen, Applications Engineer, EDAX

Any EDS spectrum has two distinct components: the characteristic peaks, which originate from transitions between the electronic states of the atoms in the sample, and the background (Bremsstrahlung), which comes from the continuum radiation emitted by electrons as they are slowed down moving through the sample. The figure below shows a carbon-coated galena sample (PbS), where the background is below the dark blue line while the characteristic peaks are above it.

Carbon-coated galena sample (PbS); the background is below the dark blue line, while the characteristic peaks are above it.

Some people consider the background an artefact and something to be removed from the spectrum (either through electronic filtering or by subtracting it), but in the TEAM™ software we apply a model based on Kramers’ law, in which the number of continuum photons N(E) at photon energy E falls off roughly as (E0 − E)/E and is modified by the detector efficiency ε(E) and the sample self-absorption A(E), with a, b, and c as fit parameters¹. Here E0 is the incident beam energy.

This means that the background is tied to the sample composition and the detector characteristics, and that you can actually use the background shape and its fit/misfit as a troubleshooting tool. Often, if you have a bad background, it’s because the sample doesn’t meet the model requirements or the data fed to the model is incorrect. The example below shows the galena spectrum where the model has been fed two different tilt conditions; an overshoot of the background can easily be seen with the incorrect 45 degree tilt. So, if the background is off in the low energy range, it could be an indication that the surface the spectrum came from was tilted, in which case the quant model will lose accuracy (unless it’s fed the correct tilt value).
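To get a feel for how this kind of continuum model responds to its inputs, here is a small, purely illustrative Python sketch. It is not the TEAM™/Eggert model – it just evaluates the basic Kramers shape (E0 − E)/E and multiplies it by a crude low-energy absorption factor – but it shows how an absorption path the model does not know about (an unreported tilt, for example) changes the low-energy end of the background.

```python
import numpy as np

def toy_background(E_keV, E0_keV, absorption_strength=0.5):
    """Illustrative continuum: Kramers shape (E0 - E)/E times a crude
    low-energy absorption term. Not the full model - no detector
    efficiency, no fitted parameters."""
    E = np.asarray(E_keV, dtype=float)
    kramers = np.clip(E0_keV - E, 0.0, None) / E             # Kramers' law shape
    absorption = np.exp(-absorption_strength / E**3)          # absorption hits low E hardest
    return kramers * absorption

energies = np.linspace(0.2, 20.0, 200)                        # keV
bg_nominal = toy_background(energies, E0_keV=20.0, absorption_strength=0.5)
bg_tilted  = toy_background(energies, E0_keV=20.0, absorption_strength=1.5)  # longer exit path

# A longer absorption path (e.g. a tilted surface) suppresses the measured
# low-energy continuum, so a model fed the wrong geometry overshoots there.
i = np.argmin(np.abs(energies - 1.0))
print(f"continuum ratio at ~1 keV: {bg_tilted[i] / bg_nominal[i]:.2f}")
```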


This of course means that if your background is off, you can easily spend a long time figuring out what went wrong and why, although it often doesn’t matter too much. To get rid of this complexity, we have included a different approach in our APEX™ software that is meant for the entry-level user. Instead of doing a full model calculation, we apply a Statistics-sensitive Non-linear Iterative Peak-clipping (SNIP) routine². This means that you will always get a good background fit, though you lose some of the additional information you get from the Bremsstrahlung model. The images below show part of the difference: the full model includes the steps in the background caused by sample self-absorption, while the SNIP filter returns a flat background.
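For readers who like to see what a peak-clipping routine actually does, here is a minimal Python sketch of a SNIP-style background estimate. It follows the basic iterative clipping idea from Ryan et al.², but it is a bare-bones illustration rather than the APEX™ implementation, which handles detector-specific details beyond this simple loop.

```python
import numpy as np

def snip_background(counts, iterations=24):
    """SNIP-style background estimate (simplified).
    counts: 1-D array of spectrum channel counts.
    iterations: maximum clipping half-window, in channels."""
    # Log-log-sqrt (LLS) transform compresses the dynamic range so that
    # large peaks and low background are treated on a similar footing.
    v = np.log(np.log(np.sqrt(np.asarray(counts, float) + 1.0) + 1.0) + 1.0)
    n = v.size
    for w in range(1, iterations + 1):
        # Clip each channel to the average of its neighbours w channels away.
        left = v[:n - 2 * w]
        right = v[2 * w:]
        center = v[w:n - w]
        v[w:n - w] = np.minimum(center, 0.5 * (left + right))
    # Undo the LLS transform to get the background back in counts.
    return (np.exp(np.exp(v) - 1.0) - 1.0) ** 2 - 1.0

# Usage sketch: subtract the estimate to leave (approximately) just the peaks.
spectrum = np.random.poisson(50, 2048).astype(float)   # stand-in for real channel data
background = snip_background(spectrum)
net_peaks = spectrum - background
```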

So, which one is better? Well, it depends on where the question is coming from. As a scientist, I would always choose a model where the individual components can be addressed separately, and where, if something looks strange, there will be a physical reason for it. But I also understand that a lot of people are not interested in the details and “just want something that works”. Both the Bremsstrahlung model and the SNIP filter produce good results, as shown in the table below, which compares the quantification numbers from the galena sample.

Table: Quantification results for the galena sample, comparing the Bremsstrahlung model and the SNIP background.

While there’s a slight difference between the two models, the variation is well within what is expected based on statistics, especially considering that the sample is a bit oxidized (as can be seen from the oxygen peak in the spectrum). But the complexity of the SNIP background is significantly reduced relative to the full model and there’s no user input, making it the better choice for the novice analyst or infrequent user.

¹ F. Eggert, Microchim Acta 155, 129–136 (2006), DOI 10.1007/s00604-006-0530-0
² C.G. Ryan et al., Nuclear Instruments and Methods in Physics Research B 34 (1988) 396–402

What an Eclipse can teach us about our EDS Detectors

Shawn Wallace, Applications Engineer, EDAX

A large portion of the US today saw a real-world teaching moment about something microanalysts think about every day.


Figure 1. Total solar eclipse. Image credit: nasa.gov

With today’s solar eclipse, you could see two objects that subtend the same solid angle in the sky, assuming you were in the path of totality. Which is bigger, the Sun or the Moon? We all know that the Sun is bigger; its radius is nearly 400x that of the Moon.

Figure 2. How it works. Image credit: nasa.gov

Luckily for us nerds, it is also nearly 400x further away from the Earth than the Moon is. This is what makes the solid angle of both objects the same, so that from the perspective of viewers on Earth, they take up the same area in the sphere of the sky.

The EDAX team observes the solar eclipse in NJ, without looking at the sun!

Why does all this matter for a microanalyst? We always want to get the most out of our detectors, and that means maximizing the solid angle. To maximize it, you really have two parameters to play with: how big the detector is and how close the detector is to the sample. ‘How big is the detector’ is easy to play with. Bigger is better, right? Not always, as the bigger it gets, the more you start running into challenges with pushing charge around, which can lead to issues like incomplete charge collection, ballistic deficits, and other problems that many people never think about.

All these factors tend to lead to lower resolution spectra and worse performance at fast pulse processing times.

What about getting closer? Often, we aim for a take-off angle of 35° and want to ensure that the detector does not protrude below the pole piece, to avoid hitting the sample. On different microscopes, this can put severe restrictions on how and where the detector can be mounted, and we can end up in the situation where we need to move a large detector further back to make it fit within the constraining parameters. So, getting closer isn’t always an option, and sometimes going bigger means moving further back.

Figure 3. Schematic showing different detector sizes with the same solid angle. The detector size can govern the distance from the sample.

In the end, bigger is not always better. When looking at EDS systems, you have to compare the geometry just as much as anything else. The events happening today remind us of that. Sure, the Sun is bigger than the Moon, but the latter does just as good a job of making a part of the sky dark as the Sun does of making it bright.

For more information on optimizing your analysis with EDS and EBSD, see our webinar, ‘Why Microanalysis Performance Matters’.

My New Lab Partner

Matt Nowell, EBSD Product Manager, EDAX

It has been an exciting month here in our Draper, Utah lab, as we have received and installed our new FEI Teneo FEG SEM. We are a small lab focusing on EBSD development and applications, and we do not have a loading dock, so timing is critical when scheduling a delivery. So, three months ago, we looked at the calendar to pick a day with sunshine and without snow. Luckily, we picked well.

Figure 1: Our new SEM coming off the truck.


Once we got the new instrument up and running, of course the next step was to start playing with it. This new SEM has a lot more imaging detectors than our older SEM, so I wanted to see what I could see with it. I chose a nickel superalloy turbine blade with a thermal barrier coating, as it had many phases for imaging and microanalysis. The first image I collected was with the Everhart-Thornley Detector (ETD). For each image shown, I relied on the auto contrast and brightness adjustment to optimize the image.

Figure 2: ETD image


With imaging, contrast is information. The contrast in this image is phase contrast. On the left, gamma/gamma prime contrast is visible in the nickel superalloy, while distinct regions of the thermal barrier coating are seen towards the right. The next image I collected was with the Area Backscatter Detector (ABS), a detector that is positioned under the pole piece for imaging. With this detector, I can use the entire detector, the inner annular portion, or any of three regions towards the outer perimeter.

Figure 3: ABS Detector image.


I tried each of the different options and selected the inner annular ring portion of the detector. Each option provided similar contrast, as seen in Figure 3, but I went with this one based on personal preference. The contrast is similar to the ETD contrast in Figure 2. I also compared the imaging options using the detector in Concentric Backscatter (CBS) mode, where 4 different concentric annular detectors are available.

Figure 4: T1 Detector (a-b mode).


My next image used the T1 detector, which to my understanding is an in-lens detector. In this mode, I selected the a – b mode, so the final image is obtained by subtracting the image from the b portion of the detector from the image from the a portion. I selected this image because the resultant contrast is reversed from the first couple of images. Here phases that were bright are now dark, and detail within the phases is suppressed.
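As an aside, this kind of detector-segment arithmetic is easy to reproduce offline if you have the individual segment images saved. The sketch below assumes two same-sized grayscale images loaded as NumPy arrays (the file names are placeholders); it forms the a − b difference and applies a simple min/max contrast stretch, similar in spirit to letting the microscope’s auto contrast/brightness rescale the result.

```python
import numpy as np
from PIL import Image

def stretch(img):
    """Linear min/max contrast stretch to the full 8-bit range."""
    img = img.astype(float)
    lo, hi = img.min(), img.max()
    return ((img - lo) / (hi - lo) * 255).astype(np.uint8) if hi > lo else img.astype(np.uint8)

# Placeholder file names for the two detector-segment images.
a = np.asarray(Image.open("T1_segment_a.tif").convert("L"), dtype=float)
b = np.asarray(Image.open("T1_segment_b.tif").convert("L"), dtype=float)

difference = stretch(a - b)          # a - b reverses contrast relative to a + b
Image.fromarray(difference).save("T1_a_minus_b.png")
```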

Figure 5: T2 Detector.


My final SEM image was collected with the T2 detector, another in-lens detector option. Here we see the same general phase contrast, but the contrast range is more limited and the detail within regions is again suppressed.

I have chosen to show this set of images to illustrate how different detectors, and their positioning, can generate different images from the same area, and how the contrast/information obtained with each image can change. Now, I have given only a cursory interpretation of the image contrast; a better understanding may come from reading the manual and knowing the effects of the imaging parameters used.

Figure 6: Always Read the Manual!


Of course, I’m an EBSD guy, so I also want to compare this to what I can get using our TEAM™ software with Hikari EBSD detectors. One unique feature we have in our software is PRIAS™, which uses the EBSD detector as an imaging system. With the default imaging mode, it subsets the phosphor screen image into 25 different ROI imaging detectors, and generates an image from each when the beam is scanned across the area of interest. Once these images are collected, they can be reviewed, mixed, added, subtracted, and colored to show the contrast of interest, similar to the SEM imaging approach described above.
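Conceptually, turning the EBSD camera into a set of imaging detectors is straightforward, and the idea can be sketched in a few lines. The code below is a toy illustration of the principle rather than the PRIAS™ implementation: it assumes you have the raw patterns from a scan available as a NumPy array of shape (rows, cols, pattern_height, pattern_width), divides each pattern into a 5 × 5 grid of ROIs, and builds one virtual image per ROI from the mean intensity inside it.

```python
import numpy as np

def roi_images(patterns, grid=5):
    """Build grid*grid virtual detector images from a 4-D pattern array.

    patterns: array of shape (rows, cols, ph, pw) - one detector pattern
              per beam position.
    Returns an array of shape (grid, grid, rows, cols)."""
    rows, cols, ph, pw = patterns.shape
    ry, rx = ph // grid, pw // grid
    images = np.empty((grid, grid, rows, cols), dtype=float)
    for i in range(grid):
        for j in range(grid):
            roi = patterns[:, :, i * ry:(i + 1) * ry, j * rx:(j + 1) * rx]
            images[i, j] = roi.mean(axis=(2, 3))   # mean ROI intensity per beam position
    return images

# Usage sketch with fake data standing in for a real pattern stack:
patterns = np.random.rand(60, 80, 60, 60)          # 60 x 80 scan, 60 x 60 pixel patterns
imgs = roi_images(patterns)
top, center, bottom = imgs[0, 2], imgs[2, 2], imgs[4, 2]   # rough top/center/bottom detectors
```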

The 3 most common contrasts we see with PRIAS™ are phase, orientation, and topographic. To capture these, we also have a mode where 3 pre-defined regional detectors are collected during EBSD mapping, and the resulting images available with the EBSD (and simultaneous EDS) data.

Figure 7: PRIAS™ Top Detector Image.


The first ROI is positioned at the top of the phosphor screen, and the resulting phase contrast is very similar to the contrast obtained with the ETD and ABS imaging modes on the SEM.

Figure 8: PRIAS™ Center Detector Image.


The second ROI is positioned at the center of the phosphor screen. This image shows more orientation contrast.

Figure 9: PRIAS™ Bottom Detector Image.


The third ROI is positioned at the bottom of the phosphor screen. This image shows more topographical contrast. All three of these images are complementary, both to each other and to the different SEM images. They all give part of the total picture of the sample.

Figure 10: Defining Custom ROIs in PRIAS™.


With PRIAS™ it is also possible to define custom ROIs. In Figure 10, 3 different ROIs have been drawn within the phosphor screen area. The 3 corresponding images are then generated, and these can be reviewed, mixed, and then selected. In this case, I selected an ROI that reversed the phase contrast, like the contrast seen with the T1 detector in Figure 4.

Figure 11: PRIAS™ Center Image with EDS Blend Map (Red – Ni, Blue – Al, Green – Zr).

Figure 12: PRIAS™ Center Image with Orientation Map (IPF Map Surface Normal Direction).


Of course, the PRIAS™ information can also be directly correlated with the EDS and EBSD information collected during the mapping. Figure 11 shows an RGB EDS map while Figure 12 shows an IPF orientation map (surface normal direction with the corresponding orientation key) blended with the PRIAS™ center image. Having this available adds more information (via contrast) to the total microstructural characterization package.

I look forward to using our new SEM to develop new ideas into tools and features for our users. I imagine a few new blog posts will come from it as well!

Considerations for your New Year’s Resolutions from Dr. Pat

Dr. Patrick Camus, Director of Research and Innovation, EDAX

The beginning of the new calendar year is a time to reflect on and evaluate the important items in your life. At work, it might also be a time to evaluate the age and capabilities of the technical equipment in your lab. If you are a lucky employee, you may work in a newly refurbished lab where most of your equipment is less than 3 years old. If you are such a fortunate worker, your colleagues in the field will be envious. They usually have equipment that is much more than 5 years old, some of it possibly dating from the last century!

Old jalopy, circa 1970. EDAX windowless Si(Li) detector, circa early 1970s.

In my case, at home my phone is 3 years old and my 3 vehicles are 18, 16, and 3 years old. We are definitely evaluating the household budget this year to upgrade the oldest automobile. We need to decide which items are the highest priority and which are not so important for our usage. It’s often important to sort through the different features offered and decide what’s most relevant … whether that’s at home or in the lab.

Octane Elite Silicon Drift Detector, 2017. Dr. Pat’s possible new vehicle, 2017.

If your lab equipment is older than your vehicles, you need to determine whether the latest generation of equipment will improve either your throughput or the quality of your work. The latest generations of EDAX equipment can enormously speed up throughput and improve the quality of your analysis relative to previous generations – it’s just a matter of convincing your boss that this has value for the company. There is no time like the present to gather your arguments into a proposal to get the budget for the new generation of equipment that will benefit both you and the company.
Best of luck in the new year!

Adding a New Dimension to Analysis

Dr. Oleg Lourie, Regional Manager A/P, EDAX

With every dimension we add to the volume of data, we believe that we add a new perspective to our understanding and interpretation of the data. In microanalysis, adding a space or time dimension has led to the development of 3D compositional tomography and dynamic or in situ compositional experiments. 3D compositional tomography, or 3D EDS, is developing rapidly and gaining wider acceptance, although it still presents challenges, such as photon absorption associated with sample thickness and a time-consuming acquisition process that requires a high level of stability, especially in the TEM. After setting up a multi-hour experiment in a TEM to gain a 3D compositional EDS map, one may wonder: is there any shortcut to getting a ‘quick’ glimpse into the 3-dimensional elemental distribution? The good news is that there is one, and compared to tilt-series tomography, it can be a ‘snapshot’ type of 3D EDS map.

3D distribution of Nd in steel.


To enable such 3D EDS mapping at the conceptual level, we would need at least two identical 2D TEM EDS maps acquired with photons of different energy – so you can slide along the energy axis (adding a new dimension?) and use photon absorption as a natural yardstick to probe the element distribution along the X-ray path. Since the characteristic X-rays have discrete energies (K, L, M lines), it might work if you subtract the K line map from the L line or M line map to see an element distribution based on the different absorption between the K and L or M line maps. Ideally, one of the EDS maps should be acquired with high energy X-rays, such as the K lines of high atomic number elements, and another with low energy X-rays where absorption has a significant effect, such as the M lines. Indeed, for elements with a high atomic number, the K line energies are in the range of tens of keV and suffer virtually zero absorption, even in a thick TEM sample.
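As a back-of-the-envelope illustration of why comparing a weakly absorbed line with a strongly absorbed one carries depth information, consider simple Beer–Lambert attenuation. The Python sketch below is purely conceptual and is not a published algorithm: it assumes the two maps have been normalized so that their unabsorbed ratio would be one, assumes a single hypothetical attenuation coefficient for the low-energy line, and then converts the per-pixel ratio into an apparent absorption path length.

```python
import numpy as np

def apparent_path_length(map_high, map_low, mu_low_per_um):
    """Per-pixel absorption path length (micrometers) from two maps of the
    same element, assuming the high-energy line is effectively unabsorbed,
    the maps are normalized to a common unabsorbed intensity, and the
    low-energy line follows Beer-Lambert: I_low = I_high * exp(-mu * t)."""
    high = np.clip(np.asarray(map_high, float), 1e-9, None)
    low = np.asarray(map_low, float)
    ratio = np.clip(low / high, 1e-9, 1.0)
    return -np.log(ratio) / mu_low_per_um

# Hypothetical 2 x 2 maps and attenuation coefficient, just to show the call:
k_line_map = np.array([[100.0, 100.0], [100.0, 100.0]])   # high-energy (K) map
m_line_map = np.array([[ 90.0,  60.0], [ 35.0,  20.0]])   # low-energy (M) map
print(apparent_path_length(k_line_map, m_line_map, mu_low_per_um=0.5))
```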

So, it all looks quite promising except for one important detail – current SDDs have an absorption efficiency for high-energy photons that is close to zero. Even if you made your SDD sensor as large as 150 mm², it would still be close to zero. Increasing it to 200 mm² would keep it steadily close to zero. So, having a large silicon sensor for EDS does not seem to matter; what matters is the absorption properties of the sensor material. Here we add a material-selection dimension to generate a new perspective for 3D EDS. And indeed, by selecting a CdTe EDS sensor we would be able to acquire X-rays with energies up to 100 keV or more.
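To see why sensor material rather than sensor area is what counts here, a one-line Beer–Lambert estimate of detection efficiency, 1 − exp(−μ(E)·t), is enough. The sketch below only shows the calculation; the linear attenuation coefficients are deliberately left as inputs to be filled in from tabulated data (for example the NIST XCOM database), and the thicknesses mentioned in the comments are typical assumed values, not the specifications of any particular detector.

```python
import math

def detection_efficiency(mu_per_mm, thickness_mm):
    """Fraction of photons stopped in a sensor of the given thickness,
    assuming simple Beer-Lambert attenuation with linear attenuation
    coefficient mu (in 1/mm)."""
    return 1.0 - math.exp(-mu_per_mm * thickness_mm)

# Efficiency depends only on the product mu * t: a thin, weakly absorbing
# sensor stops almost nothing at high photon energies no matter how large
# its area is, while a strongly absorbing sensor stops nearly everything.
for mu_t in (0.01, 0.1, 1.0, 3.0):
    print(f"mu*t = {mu_t:>4}:  efficiency = {1.0 - math.exp(-mu_t):6.1%}")

# To compare a Si SDD with a CdTe sensor at 60-100 keV, plug in assumed
# thicknesses (on the order of 0.5 mm of Si vs ~1 mm of CdTe) together with
# attenuation coefficients looked up in NIST XCOM:
# detection_efficiency(mu_si_at_E, 0.5) vs detection_efficiency(mu_cdte_at_E, 1.0)
```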

To summarize, using a CdTe sensor would open an opportunity for a ‘snapshot’ 3D EDS technique, which can add more insight into the elemental volume distribution and sample topography, and will not be limited by sample thickness. It would clearly be more practical for elements with high atomic numbers. Although it might be utilized for a wide yet select range of samples, this concept could be a complementary and fast (!) alternative to 3D EDS tomography.

With Great Data Comes Great Responsibility

Matt Nowell, EBSD Product Manager, EDAX

First, I have to acknowledge that I stole the title above from a tweet by Dr. Ben Britton (@BMatB), but I think it applies perfectly to the topic at hand. This blog post has been inspired by a few recent events around the lab. First, our data server drives suffered multiple simultaneous hard drive failures. Nothing makes you appreciate your data more than no longer having access to it. Second, my colleague and friend Rene de Kloe wrote the preceding article in this blog, and if you haven’t had the opportunity to read it, I highly recommend it. Having been involved with EBSD sample analysis for over 20 years, I have drawers and drawers full of samples. Some of these are very clearly labeled. Some of these are not labeled, or the label has worn off, or the label has fallen off. One of these we believe is one of Rene’s missing samples, although both of us have spent time trying to find it. Some I can recognize just by looking; others need a sheet of paper with descriptions and details. Some are just sitting on my desk, either waiting for analysis or kept around as visual props during a talk. Here is a picture of some of these desk samples, including a golf club with a sample extracted from the face, a piece of a Gibeon meteorite that has been shaped into a guitar pick, a wafer I fabricated myself in school, a rod of tin I can bend and work harden and then hand to someone else to try, and a sample of a friction stir weld that I’ve used as a fine-grained aluminum standard.

Figure 1: Some of the samples that live on my desk.
Each sample leads to data. With high speed cameras, it’s easier to collect more data in a shorter period of time. With simultaneous EDS collection, it’s more data still. With things like NPAR™, PRIAS™, HR-EBSD, and with OIM Analysis™ v8 reindexing functionality, there is also a driving force to save EBSD patterns for each scan. With 3D EBSD and in-situ heating and deformation experiments, there are multiple scans per sample. Over the years, we have archived data with Zip drives, CDs, DVDs, and portable hard drives. Fortunately, the cost for storage has dramatically decreased in the last 20+ years. I remember buying my first USB storage stick in 2003, with 256 MB of storage. Now I routinely carry around multiple TBs of data full of different examples for whatever questions might pop up.

Figure 2: Cost per gigabyte of data storage over time.
How do we organize this plethora of data?
Personally, I sometimes struggle with this problem. My desk and office are often a messy conglomerate of different samples, golf training aids (they help me think), papers to read, brochures to edit, and other work to do. I’m often asked if I have an example of one material or another, so there is a strong driving force to be able to find things quickly. Previously, I used a database we wrote internally, which was nice but required all of us to enter accurate data into it. I also used photo management software and the batch processor in OIM Analysis™ to create a visual database of microstructures, which I could quickly review to recognize examples. Often, however, I ended up needing multiple pictures to express all the information I wanted in order to use this collection.

Figure 3: A screenshot of the OIM Data Miner running on one of my drives.

To help with this problem, the OIM Data Miner function was implemented into OIM Analysis™. This tool will index the data on any given hard drive, and provide a list of all the OIM scan files present. A screenshot using the Data Miner on one of my drives is shown above. The Data Miner is accessed through this icon on the OIM Analysis™ toolbar. I can see the scan name, where it is located, the date associated with the file, what phases were used, the number of points, the step size, the average confidence index, and the elements associated with any simultaneous EDS collection. From this tool, I can open a file of interest, or I can delete a file I no longer need. I can search by name, by phase, or by element, and I can display duplicate files. I have found this to be extremely useful in finding datasets, and wanted to write a little bit about it in case you may also have some use for this functionality.
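If you do not have OIM Analysis™ at hand, even a tiny script can take the edge off the “where did that dataset go?” problem. The sketch below is emphatically not the Data Miner: it simply walks a directory tree and lists candidate scan files by date, size, and path. The file extensions and the drive path are assumptions for illustration (.ang and .osc files under D:/EBSD_data); adjust them to match your own archive.

```python
import os
import time

SCAN_EXTENSIONS = {".ang", ".osc"}   # assumed EBSD scan file extensions; edit to taste

def index_scans(root):
    """Walk 'root' and yield (path, size in MB, modification date) for scan files."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in SCAN_EXTENSIONS:
                path = os.path.join(dirpath, name)
                stat = os.stat(path)
                yield (path,
                       stat.st_size / 1e6,
                       time.strftime("%Y-%m-%d", time.localtime(stat.st_mtime)))

# Example: list everything under a data drive, newest first.
if __name__ == "__main__":
    records = sorted(index_scans("D:/EBSD_data"), key=lambda r: r[2], reverse=True)
    for path, size_mb, date in records:
        print(f"{date}  {size_mb:8.1f} MB  {path}")
```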