Hough Transform

How to Get a Good Answer in a Timely Manner

Shawn Wallace, Applications Engineer, EDAX

One of the joys of my job is troubleshooting issues and ensuring you acquire the best results to advance your research. Sometimes, it requires additional education to help users understand a concept. Other times, it requires an exchange of numerous emails. At the end of the day, our goal is not just to help you, but to ensure you get the right information in a timely manner.

For any sort of EDS-related question, we almost always want to look at a spectrum file. Why? There is so much information hidden in the spectrum that we can quickly point out any possible issues. With a single spectrum, we can quickly see if something was charging, tilted, or shadowed (Figure 1). We can even see weird things like beam deceleration caused by a certain imaging mode (Figure 2). Most of these issues lead to major quant-related problems, so troubleshooting any quant problem should always start with a spectrum.

Figure 1. The teal spectrum shows a strange background versus what a normal spectrum (red) should look like for a material.

This background information tells us that the sample was most likely shadowed and that rotating the sample to face towards the detector may give better results.

Figure 2. Many microscopes can decelerate the beam to help with imaging. This deceleration is great for imaging but can cause EDS quant issues. Therefore, we recommend reviewing the spectrum up front to reduce the number of emails needed to troubleshoot this issue.

To save the spectrum, right-click in the spectrum window, then click on Save (Figure 3). From there, save the file with a descriptive name, and send it off to the applications group. These spectrum files also include other metadata, such as amp time, working distance, and parameters that give us so many clues to get to the bottom of possible issues.

Figure 3. Saving a spectrum in APEX is intuitive. Right-click in the area and a pop-up menu will allow you to save the spectrum wherever you want quickly.

For information on EDS backgrounds and the information they hold, I suggest watching Dr. Jens Rafaelsen’s Background Modeling and Non-Ideal Sample Analysis webinar.

The actual image file can also help us confirm most of the above.

Troubleshooting EBSD can be tricky since the issue could be from sample prep, indexing, or other issues. To begin, it’s important to rule out any variances associated with sample preparation. Useful information to share includes a description of the sample, as well as the step-by-step instructions used to prepare the sample. This includes things like the length of time, pressure, cloth material, polishing compound material, and even the direction of travel. The more details, the better!

Now, how do I know it is a sample prep problem? If the pattern quality is low at long exposure times or the sample looks very rough, it is probably related to sample preparation (Figure 4). That being said, there could be non-sample-prep-related issues too.

Figure 4. This pattern is probably not indexable on its own. Better preparation of the sample surface is necessary to index and map this sample correctly.

For general sample prep guidelines, I would highly suggest Matt Nowell’s Learn How I Prepare Samples for EBSD Analysis webinar.

Indexing problems can be challenging to troubleshoot without a full data set. How do I know my main issues could be related to indexing? If indexing is the source, a map often appears to be very speckled or just black due to no indexing results. For this kind of issue, full data sets are the way to go. By full, I mean patterns and OSC files. These files can be exported out of TEAM/APEX. They are often quite large, but there are ways available to move the data quickly.

For the basics of indexing knowledge, I suggest checking out my latest webinar, Understanding and Troubleshooting the EDAX Indexing Routine and the Hough Parameters. During this webinar, we highlight attributes that indicate there is an issue with the data set, then dive into the best practices for troubleshooting them.

As for camera setup, this is a dance between the microscope settings, the operator's requirements, and the camera settings. In general, more electrons (higher current) allow the experiment to go faster and cover more area. With older CCD-based cameras, understanding this interaction was key to good results. With the newer Velocity cameras based on CMOS technology, the dance is much simpler. If you are having difficulty while trying to optimize an older camera, the Understanding and Optimizing EBSD Camera Settings webinar can help.

So how do you get your questions answered fast? Bury us with information. More information lets us dive deeper into the data to find the root cause in the first email, and avoids a lengthy back and forth exchange of emails. If possible, educate yourself using the resources we have made available, be it webinars or training courses. And always, feel free to reach out to my colleagues and me at edax.applications@ametek.com!


Dr. Stuart Wright, Senior Scientist EBSD, EDAX

The city has recently started burying a pipe down the middle of one of the roads into my neighborhood. There were already a couple of troublesome intersections on this road. The construction has led to several accidents in the past couple of weeks at these intersections and I am sure there are more to come.

A question from a reviewer on a paper I am co-authoring got me thinking about the impact of intersections of bands in EBSD patterns on the Hough transform. The intersections are termed ‘zone axes’ or ‘poles’ and a pattern is typically composed of some strong ones where several high intensity bands intersect as well as weak ones where perhaps only two bands intersect.

To get an idea of the impact of the intersections on the Hough transform, I have created an idealized pattern. The intensity of the bands in the idealized pattern is derived from the peak heights in the Hough transform applied to an experimental pattern. For a little fun, I have created a second pattern by blacking out the bands in the original idealized pattern, leaving behind only the intersections. I created a third pattern by blacking out the intersections and leaving behind only the bands. I have input these three patterns into the Hough transform. As I expected, you can see the strong sinusoidal curves from the pattern with only the intersections. However, you can also see peaks where these sinusoidal curves intersect, and these correspond (for the most part) to the bands in the pattern.

In the figure, the middle row of images shows the raw Hough transforms, and the bottom row shows the Hough transforms after applying the butterfly mask. It is interesting to note how much the Hough peaks differ between the three patterns. It is clear that the intersections contribute positively to finding some of the weaker bands. This is a function not only of the band intensity but also of the number of zone axes along the length of the band in the pattern.
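
For readers curious what a butterfly mask does, here is a small illustrative sketch. The kernel coefficients below are toy values of my own, not the mask EDAX actually uses (production masks are larger and tuned to the Hough resolution); the idea is simply a convolution whose positive core and negative lobes reinforce butterfly-shaped peaks while suppressing smooth background.

```python
import numpy as np

# Toy 3x3 "butterfly"-style kernel: positive along the middle row, negative
# above and below. These coefficients are illustrative only; they sum to zero
# so a flat background produces no response.
butterfly = np.array([[-1.0, -2.0, -1.0],
                      [ 2.0,  4.0,  2.0],
                      [-1.0, -2.0, -1.0]])

def apply_mask(hough_acc, kernel):
    """'Valid' 2-D convolution of a Hough accumulator with a peak mask."""
    kh, kw = kernel.shape
    h, w = hough_acc.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(h - kh + 1):
        for j in range(w - kw + 1):
            out[i, j] = np.sum(hough_acc[i:i + kh, j:j + kw] * kernel)
    return out

# An isolated peak is amplified by the positive core of the mask...
acc = np.zeros((7, 7))
acc[3, 3] = 1.0
print(apply_mask(acc, butterfly).max())              # → 4.0
# ...while a smooth, constant background cancels to exactly zero.
print(apply_mask(np.ones((7, 7)), butterfly).max())  # → 0.0
```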

Eventually the construction on my local road will be done and hopefully we will have fewer accidents. But clearly, intersections are more than just a necessary evil.

Hats Off/On to Dictionary Indexing

Dr. Stuart Wright, Senior Scientist EBSD, EDAX

Recently I gave a webinar on dynamic pattern simulation. The use of a dynamic diffraction model [1, 2] allows EBSD patterns to be simulated quite well. One topic I introduced in that presentation was that of dictionary indexing [3]. You may have seen presentations on this indexing approach at some of the microscopy and/or materials science conferences. In this approach, patterns are simulated for a set of orientations covering all of orientation space. Then, an experimental pattern is tested against all of the simulated patterns to find the one that provides the best match with the experimental pattern. This approach does particularly well for noisy patterns.
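
The core of the dictionary approach is easy to sketch. Below is a minimal, hypothetical Python version of the matching step (the function name and array layout are my own, not from any particular package): each pre-simulated pattern is compared to the experimental one with a normalized dot product, and the best-scoring orientation wins.

```python
import numpy as np

def best_match(experimental, dictionary):
    """Index of the dictionary pattern most similar to the experimental one.
    `dictionary` is (n_orientations, n_pixels): one flattened, pre-simulated
    pattern per sampled orientation. Similarity is a normalized dot product."""
    e = experimental - experimental.mean()
    e = e / np.linalg.norm(e)
    d = dictionary - dictionary.mean(axis=1, keepdims=True)
    d = d / np.linalg.norm(d, axis=1, keepdims=True)
    return int((d @ e).argmax())

# Toy check: a noisy copy of dictionary entry 2 should still match entry 2.
# Because the score sums over every pixel, heavy noise averages out, which
# is why this approach does so well on noisy patterns.
rng = np.random.default_rng(0)
dictionary = rng.random((500, 400))            # 500 fake "orientations"
noisy = dictionary[2] + 0.5 * rng.random(400)  # entry 2 plus noise
print(best_match(noisy, dictionary))  # → 2
```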

I’ve been working on implementing some of these ideas into OIM Analysis to make dictionary indexing more streamlined for datasets collected using EDAX data collection software – i.e. OIM DC or TEAM. It has been a learning experience and there is still more to learn.

As I dug into dictionary indexing, I recalled our first efforts to automate EBSD indexing. Our first attempt was a template matching approach [4]. The first step in this approach was to use a “Mexican Hat” filter. This was done to emphasize the zone axes in the patterns. This processed pattern was then compared against a dictionary of “simulated” patterns. The simulated patterns were simple – a white pixel (or set of pixels) for the major zone axes in the pattern and everything else was colored black. In this procedure the orientation sampling for the dictionary was done in Euler space.
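
A "Mexican hat" filter is straightforward to sketch. The version below is a generic difference-of-Gaussians illustration of my own, not the original code: a compact bright spot such as a zone axis survives the subtraction, while slowly varying background is removed, leaving a nearly binary spot image suitable for template matching.

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian blur built from two 1-D convolutions."""
    radius = int(3 * sigma)
    t = np.arange(-radius, radius + 1)
    kernel = np.exp(-t**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    out = np.apply_along_axis(np.convolve, 1, img, kernel, mode='same')
    out = np.apply_along_axis(np.convolve, 0, out, kernel, mode='same')
    return out

def sombrero_filter(pattern, sigma=2.0):
    """Mexican-hat response approximated as a difference of Gaussians:
    narrow blur minus wide blur highlights compact bright spots."""
    return gaussian_blur(pattern, sigma) - gaussian_blur(pattern, 1.6 * sigma)

# Toy check: one bright "zone axis" sitting on a gentle background gradient.
y, x = np.mgrid[0:64, 0:64]
pattern = x / 640.0                   # smooth background
pattern[32, 32] += 5.0                # the spot to be emphasized
response = sombrero_filter(pattern)
print(np.unravel_index(response.argmax(), response.shape))  # → (32, 32)
```
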
It seemed natural to go this route at the time, because we were using David Dingley’s manual on-line indexing software which focused on the zone axes. In David’s software, an operator clicked on a zone axis and identified the <uvw> associated with the zone axis. Two zone axes needed to be identified and then the user had to choose between a set of possible solutions. (Note – it was a long time ago and I think I remember the process correctly. The EBSD system was installed on an SEM located in the botany department at BYU. Our time slot for using the instrument was between 2:00-4:00am so my memory is understandably fuzzy!)

One interesting thing of note in those early dictionary indexing experiments was that the maximum step size in the sampling grid of Euler space that would result in successful indexing was found to be 2.5°, quite similar to the maximum target misorientation for modern dictionary indexing. Of course, this crude sampling approach may have led to the lack of robustness in this early attempt at dictionary indexing. The paper proposed that the technique could be improved by weighting the zone axes by the sum of the structure factors of the bands intersecting at the zone axes.
However, we never followed up on this idea, as we abandoned the template matching approach and moved to the Burns algorithm coupled with the triplet voting scheme [5], which produced more reliable results. Using this approach, we were able to get our first set of fully automated scans. We presented the results at an MS&T symposium (Microscale Texture of Materials Symposium, Cincinnati, Ohio, October 1991), where Niels Krieger Lassen also presented his work on band detection using the Hough transform [6]. After the conference, we hurried back to the lab to try out Niels' approach for the band detection part of the indexing process [7].

Modern dictionary indexing applies an adaptive histogram filter to the experimental patterns (at left in the figure below) and the dictionary patterns (at right) prior to performing the normalized dot product used to compare patterns. The filtered patterns are nearly binary, and seeing these triggered my memory of our early dictionary work, as they reminded me of the nearly binary "Sombrero" filtered patterns – Olé!

We may not have come back full circle, but progress clearly goes in steps, and some bear an uncanny resemblance to previous ones. I doff my hat to the great work that has gone into the development of dynamic pattern simulation and its applications.

[1] A. Winkelmann, C. Trager-Cowan, F. Sweeney, A. P. Day, P. Parbrook (2007) “Many-Beam Dynamical Simulation of Electron Backscatter Diffraction Patterns” Ultramicroscopy 107: 414-421.
[2] P. G. Callahan, M. De Graef (2013) “Dynamical Electron Backscatter Diffraction Patterns. Part I: Pattern Simulations” Microscopy and Microanalysis 19: 1255-1265.
[3] Y.H. Chen, S.U. Park, D. Wei, G. Newstadt, M.A. Jackson, J.P. Simmons, M. De Graef, A.O. Hero (2015) “A dictionary approach to electron backscatter diffraction indexing” Microscopy and Microanalysis 21: 739-752.
[4] S.I. Wright, B.L. Adams, J.-Z. Zhao (1991) “Automated determination of lattice orientation from electron backscattered Kikuchi diffraction patterns” Textures and Microstructures 13: 2-3.
[5] S.I. Wright, B. L. Adams (1992) “Automatic-analysis of electron backscatter diffraction patterns” Metallurgical Transactions A 23: 759-767.
[6] N.C. Krieger Lassen, D. Juul Jensen, K. Conradsen (1992) “Image processing procedures for analysis of electron back scattering patterns” Scanning Microscopy 6: 115-121.
[7] K. Kunze, S. I. Wright, B. L. Adams, D. J. Dingley (1993) “Advances in Automatic EBSP Single Orientation Measurements.” Textures and Microstructures 20: 41-54.

Back to Basics

Dr. René de Kloe, Applications Specialist, EDAX

When you have been working with EBSD for many years it is easy to forget how little you knew when you started. EBSD patterns appear like magic on your screen, indexing and orientation determination are automatic, and you can produce colourful images or maps with a click of a mouse.

Image 1: IPF on PRIAS center EBSD map of cold-pressed iron powder sample.

All the tools to get you there are hidden in the EBSD software package that you are working with and as a user you don’t need to know exactly how all of it happens. It just works. To me, although it is my daily work, it is still amazing how easy it sometimes is to get high quality data from almost any sample even if it only produces barely recognisable patterns.

Image 2: Successful indexing of extremely noisy patterns using automatic band detection.

That capability did not just appear overnight. There is a combination of a lot of hard work, clever ideas, and more than 25 years of experience behind it that we sometimes just forget to talk about, or perhaps even worse, expect everybody to know already. And so it is that I occasionally get asked a question at a meeting or an exhibition where I think, really? For example, some years ago I got a very good question about the EBSD calibration.

Image 3: EBSD calibration is based on the point in the pattern that is not distorted by the projection. This is the point where the electrons reach the screen perpendicularly (pattern center).

As you probably suspect, EBSD calibration is not some kind of magic that ensures that you can index your patterns. It is a precise geometrical correction that distorts the displayed EBSD solution so that it fits the detected pattern. I always compare it with a video projector. That is also a point projection onto a screen at a small angle, just like the EBSD detection geometry. And when you do that, there is a distortion where the sides of the image on the screen are not parallel anymore but move away from each other. On video projectors there is a smart trick to fix that: a button labelled keystone correction, which pulls the sides of the image nicely parallel again where they belong.
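
The keystone effect itself is simple enough to demonstrate in a few lines of Python. The geometry below is deliberately simplified (a point projector, a screen tilted by a fixed angle, parameter names of my own choosing): rows further up the tilted screen sit further from the projector and are magnified more, so a square turns into a trapezoid.

```python
import numpy as np

def project(x, y, tilt_deg=20.0, throw=10.0):
    """Toy point projection onto a tilted screen: magnification grows with
    the extra distance a ray travels to reach the higher rows."""
    scale = 1.0 / (1.0 - y * np.tan(np.radians(tilt_deg)) / throw)
    return x * scale, y * scale

# Corners of a square image: bottom-left, bottom-right, top-left, top-right.
corners = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
(blx, _), (brx, _), (tlx, _), (trx, _) = [project(cx, cy) for cx, cy in corners]
top_width = trx - tlx
bottom_width = brx - blx
print(top_width > bottom_width)  # → True: the projected square is a trapezoid
```

The keystone correction (and the EBSD calibration) is the inverse of this mapping, applied so that the displayed solution lines up with the distorted pattern.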

Image 4: Trapezoid distortion before (left) and after (right) correction.

Unfortunately, we cannot tell the electrons in the SEM to move over a little bit in order to make the EBSD pattern look correct. Instead we need to distort the indexing solution just so that it matches the EBSD pattern. And now the question I got asked was, do you actually adjust this calibration when moving the beam position on the sample during a scan? Because otherwise you cannot collect large EBSD maps. Apparently not everybody was doing that at that time, and it was being presented at a conference as the invention of the century that no EBSD system could do without. It was finally possible to collect EBSD data at low magnification! So, when do you think this feature will be available in your software? I stood quiet for a moment before answering, well, eh, we actually already have such a feature that we call the pattern centre shift. And it had been in the system since the first mapping experiments in the early 90’s. We just did not talk about it as it seemed so obvious.

There are more things like that hidden in the software that are at least as important, such as smart routines to detect the bands even in extremely noisy patterns, EBSD pattern background processing, 64-bit multithreading for fast processing of large datasets, and efficient quaternion-based mathematical methods for post-processing. These tools are quietly working in the background to deliver the results that the user needs.

There are some other original ideas that date back to the 1990s that we actually do regularly talk about, such as the hexagonal scanning grid, triplet voting indexing, and the confidence index, but there is also some confusion about these. Why do we do it that way?

The common way in imaging and imaging sensors (e.g. CCD or CMOS chips) is to organise pixels on a square grid. That is easy and you can treat your data as being written in a regular table with fixed intervals. However, pixel-to-pixel distances are different horizontally and diagonally which is a drawback when you are routinely calculating average values around points. In a hexagonal grid the point-to-point distance is constant between all neighbouring pixels. Perhaps even more importantly, a hexagonal grid offers ~15% more points on the same area than a square grid, which makes it ideally suited to fill a surface.
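
The ~15% figure follows directly from the row spacing: hexagonal rows sit step × √3/2 ≈ 0.866 × step apart, so more rows fit into the same height. A quick sketch (function and variable names are my own) confirms it:

```python
import numpy as np

def grid_point_counts(width, height, step):
    """Compare how many measurement points a square and a hexagonal grid
    place in the same rectangular area for the same step size. Hexagonal
    rows are spaced step * sqrt(3)/2 apart (alternate rows offset by
    step/2, which does not change the points per row)."""
    cols = int(width / step) + 1
    square_rows = int(height / step) + 1
    hex_rows = int(height / (step * np.sqrt(3) / 2)) + 1
    return square_rows * cols, hex_rows * cols

square_pts, hex_pts = grid_point_counts(100.0, 100.0, 1.0)
print(hex_pts / square_pts)  # ≈ 1.15: ~15% more points in the same area
```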

Image 5: Scanning results for square (left) and hexagonal (right) grids using the same step size. The grain shape and small grains with few points are more clearly defined in the hexagonal scan.

This potentially allows improvements in imaging resolution, and sometimes I feel a little surprised that a hexagonal imaging mode is not yet available on SEMs.

The triplet voting indexing method also has some hidden benefits. What we do there is calculate a crystal orientation for each group of three bands detected in an EBSD pattern. For example, when you set the software to find 8 bands, you can define up to 56 different band triangles, each with a unique orientation solution.

Image 6: Indexing example based on a single set of three bands – triplet.

Image 7: Equation indicating the maximum number of triplets for a given number of bands.
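
The count in Image 7 is the binomial coefficient C(n, 3), the number of ways to choose three bands out of n. A one-liner in Python reproduces the 56 triplets quoted for 8 bands:

```python
from math import comb

def n_triplets(n_bands):
    """Number of unique band triplets (triangles) for n detected bands:
    the binomial coefficient C(n, 3) = n! / (3! * (n - 3)!)."""
    return comb(n_bands, 3)

print([n_triplets(n) for n in range(3, 9)])  # → [1, 4, 10, 20, 35, 56]
```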

This means that when a pattern is indexed, we don't just find a single orientation; we find up to 56 very similar orientations that can all be averaged to produce the final indexing solution. This averaging effectively removes small errors in the band detection and allows excellent orientation precision, even in very noisy EBSD patterns. The large number of individual solutions for each pattern has another advantage: it does not hurt too much if some of the bands are wrongly detected due to pattern noise, or when a pattern is collected directly at a grain boundary and contains bands from two different grains. In most cases the bands coming from one of the grains will dominate the solutions and produce a valid orientation measurement.

The next original parameter from the 1990s is the confidence index, which follows from the triplet voting indexing method. Why is this parameter such a big deal that it is even patented?
When an EBSD pattern is indexed, several parameters are recorded in the EBSD scan file: the orientation, the image quality (a measure of the contrast of the bands), and a fit angle. This angle indicates the angular difference between the bands that have been detected by the software and the calculated orientation solution. The fit angle can be seen as an error bar for the indexing solution. If the angle is small, the calculated orientation fits very closely with the detected bands and the solution can be considered good. However, there is a caveat. What if there are different orientation solutions that would produce virtually identical patterns? This may happen for a single phase, where it is called pseudosymmetry. The patterns are then so similar that the system cannot detect the difference. Alternatively, you can also have multiple phases in your sample that produce very similar patterns. In such cases we would typically use EDS information and ChI-Scan to discriminate the phases.

Image 8: Definition of the confidence index parameter. V1 = number of votes for the best solution, V2 = number of votes for the 2nd best solution, VMAX = maximum possible number of votes.
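
In code form, the definition in Image 8 is a one-line formula. The sketch below (the function name is my own) shows how pseudosymmetry drives the CI to zero even when both solutions fit well:

```python
def confidence_index(v1, v2, v_max):
    """Confidence index from triplet voting: (V1 - V2) / VMAX, with V1 and
    V2 the vote counts of the best and second-best orientation solutions
    and VMAX the maximum possible number of votes (56 for 8 bands)."""
    return (v1 - v2) / v_max

# A well-resolved pattern: the best solution clearly dominates the voting.
print(round(confidence_index(40, 4, 56), 2))   # → 0.64
# Pseudosymmetry: two indistinguishable solutions split the votes, so CI = 0.
print(confidence_index(25, 25, 56))            # → 0.0
```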

Image 9: EBSD pattern of silver indexed with the silver structure (left) and the copper structure (right). The fit is 0.24° in both cases; the only difference is a minor variation in the band width matching.

In both these examples, the fit value would be excellent for the selected solution. And in both cases, the solution has a high probability of being wrong. That is where the confidence index, or CI value, becomes important. The CI value is based on the number of band triangles or triplets that match each possible solution. If there are two indistinguishable solutions, these will both have the same number of triangles and the CI will be 0. This means that there are two or more apparently valid solutions that may all have a good fit angle. The system just does not know which of these solutions is the correct one, and thus the measurement is rejected. If there is a difference of only 10% in matched triangles between alternative orientation solutions, in most cases the software is capable of identifying the correct solution. The fit angle on its own cannot identify this problem.

After 25 years these tools and parameters are still indispensable and at the basis of every EBSD dataset that is collected with an EDAX system. You don’t have to talk about them. They are there for you.

The Hough Transform – An Amazing Tool

Shawn Wallace, Applications Engineer, EDAX

Part of my job is understanding and pushing the limits of each part of our systems. One of the most fundamental parts of the EBSD system is the Hough Transform, whose role is to find the lines (bands) in an EBSD pattern. This is the first step in indexing a pattern (Fig. 1). If this step is not consistent, the quality of any indexing and any derivative data is questionable. A normal user does not really need to understand all the intricacies of every part of the system, but it is still worthwhile to understand how your data and data quality can be affected.

Figure 1: On the left are the overlaid lines found via the Hough Transform. On the right is the Indexed solution overlaid based on the Hough. The quality of the indexed solution is based on the quality of the Hough.

With that in mind, I ran an experiment on a steel sample to see how far the Hough could be pushed and still give consistent indexing. For this experiment, I used our Hikari Super camera at a series of different binnings, from its native resolution of 640 × 480 pixels at 1×1 binning down to 35 × 26 pixels at 18×18 binning. All pixel resolutions are noted in Table 1. I kept my Hough settings and beam settings constant. My only other variable was the exposure time, which I adjusted so that the camera was equally saturated (around 0.85) at every binning.
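
The pixel resolutions in Table 1 follow directly from the binning factor, since each n×n block of sensor pixels is combined into one output pixel. A quick sketch (illustrative; it assumes partial blocks at the sensor edge are simply discarded, hence the integer division):

```python
def binned_resolution(binning, native=(640, 480)):
    """Pattern resolution after n x n binning of the native sensor,
    assuming partial edge blocks are discarded."""
    return native[0] // binning, native[1] // binning

for b in (1, 2, 4, 8, 10, 16, 18):
    print(f"{b}x{b}: {binned_resolution(b)}")
# 18x18 binning of the 640 x 480 sensor leaves (35, 26), i.e. the
# 35 x 26 pixel patterns quoted above.
```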

I expected the patterns at the lower binnings to index consistently, and they did (Fig. 2). All three Euler angles for the 1×1, 2×2, 4×4, and 8×8 binnings were within 0.4 degrees of each other. Pushing the camera and the Hough even further really surprised me, though.

Figure 2: The indexed patterns for the lower binnings showed remarkable consistency in indexing.

Figure 3: The indexing results still held their consistency, even for the highest binning settings used.

I expected some drop-off in the consistency of the orientation when I dropped my binning to 10×10, 16×16, and even 18×18, but it did not fully materialize (Fig. 3). The range of the Euler angles did broaden; specifically, the range of φ₂ increased to 3 degrees, but that is a change of <1%, given that the entire range for φ₂ is 360 degrees. Table 1 shows the data in its raw form. Overall, the data is great from low to high binning, with minimal loss in our indexing metrics (CI and fit) and consistency in the Euler angles, except for the 18×18 binning. That is where we found our limit, specifically when it comes to indexing metrics: we see a sharp drop-off in the CI. The pixelation of the pattern has reached a point where it is difficult to find a unique solution. This drop-off is why we tell our customers that 16×16 is the limit of binning they should use for reliable, high-quality data.

Table 1. Indexing Metrics and Euler Angles for all data points.

With all that said, most EBSD work is done not on a single orientation, but on a map. Does this hold true for a map? It does. In Figure 4 and Figure 5, we can see the mapping results for 2×2 binning and 10×10 binning. Both indexed at 99.9%, with average CIs of 0.89 and 0.84 respectively, and very little change in orientations. This level of data quality across binnings is why EDAX uses the Hough. It is an amazing little tool.

Figure 4. This map was taken at 2×2 binning. Internal deformation of the grains is visible, while the inclusions in between are relatively undeformed.

Figure 5. This map was taken at 10×10 binning in approximately the same area as Figure 4. Again, internal deformation is shown in the larger grain, while the inclusions are undeformed.