
Why is There an Error in My Measurement?

Sia Afshari, Global Marketing Manager, EDAX

One interesting part of my job has been the opportunity to meet people of all technical backgrounds and engage in conversations about their interest in analytical instrumentation and ultimately what a measurement means!

I recall that several years back, on a trip to Alaska, I met a group of young graduate students heading to the Arctic to measure the “Ozone Hole.”  Being an ozone lover, I started asking questions about the methodology they were going to use to accomplish this important task, especially their approach to comparative analysis of the expansion of the ozone hole!

I learned that the analytical methods used for ozone measurement have changed over time, and the type of instrumentation utilized for this purpose has changed along with the advances in technology.  I remember asking about a common reference standard one would use across the various techniques over the years, to make sure that the readings from different instruments stay within their specified operating parameters and that the collected data represents the same fundamental criteria, even when obtained by different methods under different circumstances over time.  The puzzled looks these young scientists gave my questions about cross-comparison, errors, calibration, reference standards, statistical confidence, etc. made me feel I had better stop tormenting them and let them enjoy their journey!

Recently I had an occasion to discuss with a couple of analysts the analytical requirements for an application that included both bulk and multilayer coating samples.  We agreed that the main challenge in multi-layer analysis is the availability of a reliable type standard that truly represents the actual samples being analyzed.  We noted that where type standards are not available, the attainable accuracy for a specimen needs to be evaluated by considering the errors involved and how those errors propagate through the measurements, especially as one approaches “infinite thickness” conditions for some of the constituents!

As the demand for “turn-key” systems increases and users become more interested in simply obtaining the “numbers” from analytical tools, it is imperative for us as a manufacturer and software developer to embed the fundamental measurement principles into our data presentation, so that a measurement result is qualified and quantified with a degree of confidence that is easily observable by an operator.  This is our goal as we embark on the development of the next generation of intelligent analytical software.

The other part of our contribution as a manufacturer is training users in their understanding of the measurement principles.  It is imperative to emphasize the basics and the importance of following a set of check sheets for obtaining good results!

My goal is to use a series of blogs as a venue to highlight the parameters that influence a measurement, explain the underlying reasons for errors in general, and provide references for a better understanding of the expected performance of analytical equipment in general and x-ray analysis in particular!

So to start:

My favorite easy readings on counting statistics and errors are old ones but classics never go out of style:
• Principles and Practices of X-ray Spectrometric Analysis, by Eugene Bertin, Chapter 11, “Precision and Error: Counting Statistics.”
• Introduction to the Theory of Error, by Yardley Beers (Addison-Wesley, 1953).  Yes, it is old!

Also, Wikipedia has a very nice write-up on the basic principles of measurement uncertainty, recommended by my colleague, that can be handy for a novice and even for an experienced user.  If you don’t believe in Wikipedia, the article at least links to a number of reference documents for further review.
• https://en.wikipedia.org/wiki/Measurement_uncertainty

As food for thought on measurements, I would consider:
• What am I trying to do?
• What is my expectation of performance in terms of accuracy and precision?
• Are there reference standards that represent my samples?
• What techniques are available for measuring my samples?

With recognition of the facts that:
• There is no absolute measurement technique; all measurements are relative.  There is uncertainty in every measurement!
• The uncertainty in a measurement is a function of systematic errors, random errors, and bias.
• All measurements are comparative in nature, so there is a requirement for reference standards.
• Reference standards that represent the type of samples analyzed provide the best results.
• One cannot measure more accurately than the specified reference standard’s error range.  (Yes, reference standards have errors too!)
• Fundamental Parameters (FP) techniques are expedients when type standards are not available but have limitations.
• A stated error for a measurement needs to be qualified with a degree of confidence expressed as a number, e.g. in standard deviations.
• In counting statistics, precision is a controllable quantity: it improves as the measurement time, and hence the number of counts, is extended.
• What else is present in the sample is often as important as the targeted element in analysis.  Sample matrix does matter!
• The more complex the matrix of the specimen being measured, the more convoluted are the internal interactions between the matrix atoms.
• Systematic errors are generally referred to as accuracy, and random errors as precision.
• Uncertainties/errors add in quadrature (as the square root of the sum of the squares).
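Two of the points above, the precision attainable from counting statistics and the quadrature addition of errors, can be sketched numerically.  The snippet below is an illustration only (the function names are my own, not from any particular analysis package); it uses the standard result that a Poisson count of N events has a standard deviation of √N, so its relative precision is 1/√N.

```python
import math

def relative_precision(counts):
    """1-sigma relative uncertainty of a Poisson count: sqrt(N)/N = 1/sqrt(N)."""
    return math.sqrt(counts) / counts

def add_in_quadrature(*uncertainties):
    """Combine independent uncertainties as the square root of the sum of squares."""
    return math.sqrt(sum(u ** 2 for u in uncertainties))

# Collecting 100x more counts improves the relative precision by a factor of 10:
print(relative_precision(10_000))     # -> 0.01  (1 %)
print(relative_precision(1_000_000))  # -> 0.001 (0.1 %)

# A 3 % systematic error combined with a 4 % random error gives about 5 % total:
print(add_in_quadrature(0.03, 0.04))  # ~ 0.05  (5 %)
```

This is why "counting longer" buys precision, and why the largest single error source tends to dominate the total: in quadrature, small contributions shrink quickly relative to the biggest one.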

Till next time, when we will visit these topics and other relevant factors in more detail.  Questions, suggestions, and related input are greatly appreciated.  By the way, I am still wondering whether the ozone layer is being measured scientifically these days!