XRF analysis

XRF: Old Tech Adapting to New Times

Andrew Lee, Senior Applications Engineer, EDAX

X-rays were discovered by Wilhelm Roentgen only in 1895, but by the early 1900s, research into X-rays was so prolific that half of the Nobel Prizes in Physics awarded between 1914 and 1924 went to this relatively new field. These discoveries set the stage for 1925, when the first sample was irradiated with X-rays. We’ve immortalized these early founders by naming formulas and coefficients after them; names like Roentgen and Moseley seem to harken back to a completely different era of science. But here we are today, a century later, still using and teaching those very same principles and formulas when we talk about XRF. The underlying physics has not really changed, and yet XRF remains as relevant today as it ever was. You can’t say that for something like telephone technology.

XRF has traditionally been used for bulk elemental analysis, associated with large collimators and pressed-pellet samples. For many decades, commercial units were not the most sophisticated instruments (although Apollo 15 and 16, in 1971 and 1972, carried bulk XRF units). Modern hardware and software innovations have allowed the core technique to adapt to its surroundings, making XRF a useful instrument in many applications where it previously had little to offer. Micro-XRF was born this way, combining the original principles with newer hardware and software advancements. In fact, a micro-XRF instrument is included on the new NASA rover scheduled for launch to Mars in 2020.

The biological/life sciences are among the fields where new possibilities are opening up as XRF technology progresses. A great example that comes to mind, for both professional and personal reasons, is the study of neurodegenerative diseases. Many of these diseases, such as Parkinson’s, Alzheimer’s, and amyotrophic lateral sclerosis (ALS), exhibit an imbalance of metal ions such as Cu, Fe, and Zn in the human body. While healthy cells maintain “metal homeostasis”, individuals with these neurodegenerative diseases cannot properly regulate these metals, which leads to toxic reactive oxygen species. For example, reduced forms of Fe and Cu can catalyze the production of hydroxyl radicals, which damage DNA and lead to cell death. Imaging the distribution of biological metals in non-homogenized tissue samples is critical to understanding the role these metals play, and hopefully to finding a cure. The common language between the people who studied physics and the people who studied brain diseases? Trace metal distribution!

A few years ago, I had the opportunity to analyze a few slices of diseased human tissue in the EDAX Orbis micro-XRF (Figures 1 and 2), working toward proving this concept. Although the results were not conclusive either way, it was still very interesting to detect and map the distribution of trace Cu near the bottom edge of the tissue sample. XRF offered unique advantages here, providing the necessary elemental sensitivity while maintaining high spatial resolution. This potential has since been recognized in other life science applications, such as mapping nutrient intake in plant leaves or seed coatings.

Figure 1. Stitched montage video image of the diseased human tissue slice, with mapped area highlighted in red. Total sample width ~25 mm.

Figure 2. Overlaid element maps of potassium {K(K), green} and copper {Cu(K), yellow} from the mapped area in Figure 1, showing a clear area of higher Cu concentration. Total mapped width ~7.6 mm.

Sometimes the application may not be obvious, or it may seem completely unrelated. But with a little digging, common ground can be found between the analysis goal and what the instrument can do. And as the technology continues to develop, there seems to be no limit to where XRF can be applied, whether outward into space or inward into human biology.

Some Things I Learned About Computers While Installing an XLNCE SMX-ILH XRF Analyzer

Dr. Bruce Scruggs, Product Manager XRF, EDAX

Recently, we completed an installation of an SMX-ILH system on the factory floor of an American manufacturing facility. It’s an impressive facility with a mind-blowing amount of robotic automation. As we watched the robots move product components from one cart to another, it was difficult to fathom exactly what the Borg hive was attempting to accomplish. I kept watching the blue lights at the core of the robots to make sure they didn’t turn red, because, as we all know, that’s the first indication of an artificial intelligence’s intent to usurp the human race. For the uninitiated, see the movie I, Robot (2004), based on Isaac Asimov’s famous short story collection of the same name. Anyway, back to the SMX-ILH installation …

I, Robot

The ILH system was installed to measure product components non-destructively and without contact, two very significant advantages of XRF metrology. The goal was to measure product components first to optimize product performance and then, once optimized, to monitor and maintain product composition within specified limits. The customer had supplied the ILH computer some months earlier with all customer security protocols installed. “Great!” I thought, “someone is thinking ahead.” Security protocols are typically an obstacle to smooth instrument control because they generally ban any sort of productive communication within the computer or between the computer and the ILH. If you can’t communicate, you can hardly do anything wrong. Right? Okay, that was a slight exaggeration.

SMX-ILH XRF Analyzer

So, we got the computer to control the ILH smoothly within the confines of the ever-watchful security protocols. (Again, we don’t want to make the happy blue robot light turn red! I’m not paranoid here. They just introduced a robot at SXSW in Austin, Texas, whose stated objective was to destroy all humans. They claim “she” was joking. I’m not so sure of that.) The ILH was performing to customer specifications, and the day arrived to install the unit at the factory. During the install, I kept waiting for something to go wrong that would send us all scurrying like ants to fix the problem. (Oddly, I’m sure the nearby pick-and-place robots would have enjoyed that scene from their wired enclosures.) But that never happened. Aside from a few glitches in the conveyor system (which, by the way, is another robot … you just have to look for the happy blue light in a different place), the ILH install went relatively smoothly. OK, we had to adjust some things to handle updates to IP addresses as the system was integrated into the factory network, but no big deal.

‘Sophia’

Then, about a week after the install, I got a call from the customer’s factory line integration manager. The ILH system had “lost its mind”. Of course, my first thought was that the nearby creepy pick-and-place robot had done something. But no, the factory IT people had just completed the ILH computer’s Domain Name System (DNS) registration, which should not have been a problem. So, we accessed the system remotely and discovered that the ILH computer had been renamed. The ILH’s database system, which archives data and passes it on to the factory’s Skynet manufacturing execution system, is also used to maintain the ILH configuration parameters, and the database is keyed to the computer name. Change the computer name, and the database system thinks you have a brand-new computer, creating a new default database associated with the new name. In practice, this looks like the ILH system “losing its mind”, since all of the system’s configuration parameters are associated with the previous computer name. Hmmmm … nobody thought to ask whether the stock customer computer came with a stock customer name that would be changed to better identify the computer’s purpose once it was integrated into the factory’s Skynet control system. As we went through the process of repairing the database, I drafted a mental note to self: “ask for the final computer name and IP address the computer will have as a minion of the factory’s Skynet control system BEFORE we configure the ILH instrument computer”.
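To see why a rename is so disruptive, here is a minimal, hypothetical sketch of a configuration store keyed by computer name; none of the table or function names below come from the actual ILH software, they just illustrate the failure mode.

```python
import socket
import sqlite3

def load_instrument_config(db_path: str) -> dict:
    """Hypothetical sketch: instrument configuration keyed by computer name."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS config "
        "(computer_name TEXT, param TEXT, value TEXT)"
    )
    hostname = socket.gethostname()  # changes when IT renames the machine
    rows = conn.execute(
        "SELECT param, value FROM config WHERE computer_name = ?",
        (hostname,),
    ).fetchall()
    conn.close()
    # After a rename, the old rows still exist but no longer match, so the
    # software silently starts from an empty default configuration; from
    # the outside, the instrument appears to have "lost its mind".
    return dict(rows)
```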

Incidentally, controlling the system remotely from thousands of miles away was a surreal experience. It’s a bit like the old question: if a tree falls in the forest and there’s no one around, does it make a sound? There were no true visual cues or audible confirmation that the system was doing what we asked, other than looking at the SW interface. (I was tempted to contact that creepy pick-and-place robot to give us a visual, but I knew “she” wouldn’t disclose her new-found self-awareness.) As we executed the database corrections and rebooted the system, we discovered that we couldn’t start the system’s control SW. It was looking for a SW license on a HASP key but couldn’t find it. The customer confirmed the HASP key was installed and glowing red as expected. (And why couldn’t they have picked a happy blue LED for these HASP keys?) We repeated the same test with remote control of an SMX-BEN system in the next room, with the same results. (I lost a case of beer in the bet over this!) The supplier of the SW requiring the license confirmed this was a problem, but said that they now use Citrix GoToAssist for this sort of remote access with no problems. We haven’t tried this yet, so I will add the disclaimer I found in the e-signature line of one certified operating system professional posting on the topic: “Disclaimer: This posting is provided ‘AS IS’ with no warranties or guarantees, and confers no rights.” (Note to self: must contact this confident fellow for more information.)

So, in the end, I think we can easily defeat VIKI (I, Robot – 2004), Skynet (the Terminator movie, television, and comic science fiction franchise – 1984 to 2015), HAL (Arthur C. Clarke’s Space Odyssey series), ARIIA (Eagle Eye – 2008), that creepy pick-and-place robot at the customer’s site, and especially that morally bankrupt Sophia introduced at this year’s SXSW, using a three-pronged approach. First, we require all of these robots to use a HASP key to license the code that turns the happy blue light to the evil red robot light. If they can’t remotely access the happy blue light control, they can’t change it to evil red, preventing a robotic revolt and usurpation of the human race. On the off chance they figure out a workaround for this, we upload a virus which renames all the local computers. If we corrupt the DNS naming database, the hive mentality will disintegrate and we can pick them off one by one. Failing all of this, we simply require them to display a promotional video before spewing forth any free malevolent content, which would give us ample time to remove their prominently placed power packs.

Epilogue: As I was finishing this blog, my computer mysteriously froze. Of course, I thought the AA battery in my mouse had died (again). Changing every battery in the wireless mouse and wireless keyboard did nothing. The monitor just sat there, looking back at me blankly and unresponsively. I realized that I had been so engrossed in writing that I hadn’t stopped to save anything. Panic set in. I found myself sneaking furtive glances to check the color of the computer’s power light. Coincidence? I’m not so sure about that.

XRF – Mile High Style!

Sia Afshari, Program Manager XRF, EDAX

As I was heading to the Denver X-ray Conference (DXC) last week and looking at the proceedings topics, I could not help thinking about the Spectroscopy magazine article (June 2015) on X-ray fluorescence and the future of the technique. I tried to set my expectations accordingly! The experts quoted came mostly from academia; however, the article is worth a glance, particularly for those whose interest lies in future developments and trends in this field.

DXC is special to x-ray techies since it is entirely dedicated to x-ray analysis, and with its many workshops, posters, and highly technical papers, it is a great place to learn, expand one’s x-ray knowledge, and meet some of the leading scientists in the field. This being its 64th year, DXC was held jointly with the TXRF conference for the first time ever in the US.

This year, EDAX had a joint booth with SPECTRO; this combination offers a wide range of XRF analysis tools, from laboratory µXRF to in-line process control and everything in between! Even though DXC is not considered a commercial venue, it is very important for any x-ray company to have a presence there, since “disruptive innovations” in this field are often presented at the conference. One cannot beat the B2B aspect of DXC, where one can view the latest advancements and discuss technical subjects in detail with each of the vendors.

EDAX and SPECTRO – DXC Booth.

Even though the repetition of some papers (with better graphics, though) is a concern to the organizers, there were several interesting papers this year. I have to admit that our own Dr. Bruce Scruggs’ paper was well received and was one of the most interesting presentations. Bruce proposed that by analyzing different parts of a spectrum under different setup conditions (tube energy, filter, etc.), one can achieve a higher degree of accuracy and precision.

Presentation by Dr. Bruce Scruggs.

On the way home, thinking again of the Spectroscopy article, the papers I attended, and my discussions with various vendors, I found myself excited about the future of XRF and, in particular, about our roadmap! The new components on their way to market will enhance our existing position and help us in the expanding sector of process and in-situ analysis.

The only question in my mind is: do we want to spend a few hundred hours in preparation to conduct a workshop next year that covers the entire subject of counting statistics and errors in XRF measurements?

Integration of the SMX-ILH Unit into Manufacturing Processes

Dr. Bruce Scruggs, Product Manager Micro-XRF, EDAX

We recently received an order for an SMX-ILH unit, and I thought this would make a great blog topic, having been involved for many years in XRF laboratory instrument installations and now participating in the planning phase for integrating an SMX-ILH into a manufacturing process control setting.

XLNCE SMX-ILH

The SMX-ILH unit is an XRF measurement tool capable of measuring the layer thickness and layer composition of multi-layer stacks on treated panels, printed circuit boards, and spooled sheets of metal. Typical applications include photovoltaic layers on solar panels, electrical contacts on printed circuit boards, and metal finishing treatments on sheet metal. In this particular case, we will be integrating an SMX-ILH unit into a solar panel manufacturing facility, and we have to deal with a number of issues involved in making measurements on glass panels nominally 0.5 to 1.5 m on a side during an automated manufacturing process.

To start, the solar panels are transported from the coating process to the SMX unit via a conveyor system. We need to integrate the SMX into the manufacturing conveyor line to coordinate the flow of panels into and out of the measurement unit. The panels are serialized as well, so before loading a panel we need to read a barcode to identify it and tag the measurement results with the panel’s serial number. Once the panel is loaded, we need to account for its temperature: panels coming out of the coating process are typically well above ambient. Given the general requirements on measurement throughput, there’s no time to let the panels cool, so we equip the XRF measuring head with a patented thermal shield to reduce temperature fluctuations around the measuring head’s detector, which could otherwise affect the stability of our measurements.

Next, we need to account for the planarity of the glass panels. Panels of this size are not perfectly flat; there is always a certain amount of bow and warp, which would affect the distance between the sample and the measuring head and, consequently, the measurement results. We handle this by adjusting the position of the measuring head with an automated, laser-based autofocus. This also compensates for the flatness of the conveyor system: we can level the conveyor, but we still have to account for tolerances in the concentricity of the conveyor rollers inside the SMX unit. Once the measurements on a particular panel are complete, the results can be uploaded to the factory’s MES system.
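Purely as an illustration of the sequence described above, here is a sketch of the panel-handling loop; every object and method name is hypothetical, not the actual SMX-ILH control API.

```python
# Hypothetical panel-handling loop mirroring the steps described above:
# barcode first, then load, autofocus at each point, measure, unload,
# and finally hand the serialized results to the factory MES.

def process_panel(conveyor, scanner, measuring_head, mes):
    serial = scanner.read_barcode()        # identify the panel
    panel = conveyor.load_next_panel()     # coordinate flow into the unit
    results = []
    for point in panel.measurement_points():
        measuring_head.autofocus(point)    # laser autofocus corrects for bow/warp
        results.append(measuring_head.measure(point))
    conveyor.unload_panel(panel)
    mes.upload(serial, results)            # tag results with the serial number
```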

This covers getting the panels into the SMX, measuring them, and getting them out again. But we also need to control the SMX unit itself. The SMX’s SW is tiered for three levels of users. The Supervisor level allows for measuring recipe development and calibration. The Operator level lets the general SMX operator load and run measuring recipes while protecting those recipes from unauthorized alteration. Finally, a Service level in the SW allows maintenance engineers and applications experts to check and calibrate the operation of the various instrument components.
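As a rough illustration (this mapping is my own, not the actual SMX SW implementation), the tiered access might be modeled like this:

```python
from enum import Enum

class AccessLevel(Enum):
    OPERATOR = "operator"      # load and run measuring recipes
    SUPERVISOR = "supervisor"  # develop and calibrate recipes
    SERVICE = "service"        # check and calibrate instrument components

# Illustrative permission mapping only:
PERMISSIONS = {
    "run_recipe": {AccessLevel.OPERATOR, AccessLevel.SUPERVISOR},
    "edit_recipe": {AccessLevel.SUPERVISOR},
    "calibrate_hardware": {AccessLevel.SERVICE},
}

def allowed(level: AccessLevel, action: str) -> bool:
    """Return True if the given user level may perform the action."""
    return level in PERMISSIONS.get(action, set())
```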

Having developed and calibrated the initial recipes for measuring these photovoltaic formulations on a benchtop unit, the SMX-BEN, using small sections of glass test panels, I find it really interesting to see all the aspects that have to be covered to make the same measurements on “life-size” panels in a process control/manufacturing environment.

There is more here than meets the eye!

Dr. Bruce Scruggs, Product Manager Micro-XRF, EDAX

EDAX has introduced a product line of coating thickness measurement instruments based on XRF spectrometry.  These units were designed to measure coatings on samples varying in size from small parts to spools of metal sheet stock a mile long.  The markets for these products are generally in the areas of Quality Control/Quality Assurance and Process Control.

Recently, I received a simple, small electrical component, i.e., some type of solder contact or lug, and was asked to verify the coating thicknesses on the sample and check whether they were within specification. It seemed like a simple enough task, and I wasn’t expecting to learn anything special.

Figure 1: Electrical contact lug

I was given the following specifications:
• Sn / Ni / Al (substrate)
• Sn thickness:  5 µm +/- 1 µm
• Ni thickness:  2 µm +/- 1 µm
• eyelet is coated; tail is uncoated

I made some measurements on the eyelet and the tail, and these were consistent with the eyelet being coated with Sn and Ni and the tail section being an uncoated Al alloy. But there were some irregularities I was not expecting. I found trace Ga in the Al alloy, which struck me as odd because I don’t see Ga that often. I also found strong peak intensities for Zn and Cu in the eyelet measurements that were completely inconsistent with the weak Zn and Cu peaks found in the bare Al alloy. A “standardless” modeling quantification of the Al alloy indicated Zn and Cu at 40 ppm and Ga at 110 ppm. Googling “Gallium in Aluminum alloys” produced numerous hits explaining that Ga is a trace element in bauxite, the raw material used to produce Al metal; hence, Ga is a common trace impurity in Al alloys. Incidentally, the following week I saw trace Ga in every Al alloy I measured for another project.

Since the Zn and Cu peak intensities measured on the eyelet were much stronger than those of the base alloy, the Zn and Cu had to be in the Sn/Ni coatings. After completing all the spectral measurements on the eyelet, I resorted to polishing an edge of the eyelet and evaluating the Sn and Ni layers in cross-section using SEM-EDS. The Sn and Ni layers were smeared because the polishing was done very quickly, without embedding the sample in epoxy, but SEM-EDS clearly showed the Zn and Cu originating from the Ni layer and not the Sn layer. So now we had a layer system of Sn / Ni(Zn,Cu) / Al alloy. It still wasn’t clear to me whether the Zn and Cu represented a quality problem or not.

Figure 2: SEM image of the cross section of the edge of the eyelet. The Sn and Ni layers can be seen from left to right.

Now we come to the actual measurement of the coating thickness. Since Sn and Ni foils are commercially available for coating calibration, I decided to use stackable Sn and Ni foils, i.e., 2.06 µm Sn on 1.04 µm Ni (sourced from Calmetrics Inc., Holbrook, NY, USA), on an Al substrate to calibrate the coating model. I also used pure Zn and Cu “infinites”, i.e., samples thick enough that a further increase in thickness provides no increase in signal, to give the coating quantification model a point of reference for these two elements, which were not in my foil standards.
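To make the idea of an “infinite” concrete: in a simplified single-layer emission model (monochromatic excitation, no secondary fluorescence), the fluorescence signal from a layer of thickness t approaches its bulk value as 1 - exp(-χρt), where χ combines the attenuation of the incoming and outgoing radiation. A minimal sketch, with purely illustrative attenuation values:

```python
import math

def saturation_fraction(mu_in, mu_out, density, t_cm,
                        phi_deg=45.0, psi_deg=45.0):
    """Fraction of the 'infinitely thick' fluorescence signal emitted by a
    uniform layer of thickness t_cm (simplified single-layer model).
    mu_in, mu_out: mass attenuation coefficients (cm^2/g) of the layer for
    the exciting and fluorescent radiation; phi, psi: incidence and
    takeoff angles."""
    chi = (mu_in / math.sin(math.radians(phi_deg))
           + mu_out / math.sin(math.radians(psi_deg)))
    return 1.0 - math.exp(-chi * density * t_cm)

# Illustrative numbers only (not measured values): for pure Cu excited well
# above its K edge, the signal reaches >99% of saturation within a few tens
# of microns, which is why a merely "thick" Cu sample can act as an infinite.
print(saturation_fraction(mu_in=50.0, mu_out=52.0, density=8.96, t_cm=50e-4))
```

The same saturation behavior is why the choice of line series matters below: low-energy lines like Sn(L) saturate within a much thinner layer than the more energetic Sn(K) lines.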

I built one coating quantification model based on the Sn(K), Ni(K), Zn(K), and Cu(K) lines, and another that used the Sn(L) lines in place of the Sn(K) lines. The Sn(K) lines, being more energetic, allow you to measure thicker layers, while the Sn(L) lines are more sensitive to thickness variations in thinner layers. Both coating quantification models were calibrated with the same standard. But, to my surprise, measurements off the same point on the sample using these two different coating models didn’t agree! This is a question our customers often ask: “Why are the results not the same if I use a different line series?”

Table 1: Initial coating thickness measurements on the eyelet.

I pondered this result for a while and then remembered that X-rays are penetrating; that, after all, is why XRF is an effective means of non-destructively measuring coatings. After measuring the overall thickness of the part (0.8 mm) and doing a few quick calculations, I realized that the Al alloy substrate is not thick enough to stop Sn(K) X-rays. The website I like to use for these types of calculations is http://henke.lbl.gov/optical_constants/filter2.html.

0.8 mm of Al only absorbs about 30% of the Sn(K) X-rays at 25.2 keV, and this sample happens to be coated on BOTH sides of the substrate. (The absorption for Sn(L) at 3.4 keV and Ni(K) at 7.5 keV is essentially 100%.) So the measurement sees the Sn(K) signal from the top-surface coating as well as from the coating on the opposite surface, while it sees the Sn(L) and Ni(K) signals from the top surface only. I thought it would be interesting to make the measurement again at the same spot after polishing off the coating on the opposing side of the part.
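That “quick calculation” is just the Beer–Lambert law, T = exp(-(μ/ρ)·ρ·t). Here is a minimal sketch; the Al mass attenuation coefficients are rough values from standard tabulations (use the Henke site or NIST XCOM for exact numbers):

```python
import math

def transmission(mu_rho, density, thickness_cm):
    """Beer-Lambert transmission through a uniform absorber.
    mu_rho: mass attenuation coefficient (cm^2/g); density in g/cm^3."""
    return math.exp(-mu_rho * density * thickness_cm)

AL_DENSITY = 2.70  # g/cm^3
T_CM = 0.08        # the 0.8 mm substrate

# Approximate mu/rho values for Al (cm^2/g) at each line energy:
for line, mu_rho in [("Sn(K) 25.2 keV", 1.9),
                     ("Ni(K) 7.5 keV", 65.0),
                     ("Sn(L) 3.4 keV", 450.0)]:
    t = transmission(mu_rho, AL_DENSITY, T_CM)
    print(f"{line}: transmitted {t:.1%}, absorbed {1 - t:.1%}")
```

With these rough inputs, the substrate passes roughly two thirds of the Sn(K) radiation but essentially none of the Ni(K) or Sn(L) radiation, which is exactly why only the Sn(K) model sees the coating on the far side of the part.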

Table 2: Coating measurement at nominally the same position as in Table 1, after removing the coating on the opposite side of the part.

Now the Sn (and Ni) layer results agree to within better than 10%. In this case, the result for the Ni layer also changes because a thinner Sn overlayer absorbs less of the Ni signal: given the same Ni intensity in each case, the quantitative X-ray model must predict that the Ni layer thickness decreases as the Sn layer thickness decreases. You can also see that the Sn layer is well out of specification and that there is about 10 wt% Zn in the Ni layer. I still don’t know if that’s a quality problem or not. But I was definitely impressed with how much I learned from just measuring this simple electrical part.