Calibrated Temperature Data Loggers
Temperature data loggers are devices that record temperature over time and store the data for later analysis. Some can also raise alarms in real time, for example when monitoring refrigerators.
Contents
- What is Calibration
- Calibration Types
- Calibration Certificates
- Calibration Standards - UKAS, NIST
- Difference between precision and accuracy
- Calibration uncertainty, measurement uncertainty
- How often to calibrate?
- What is calibration error?
- What is calibration slope?
- What is a three-point calibration?
- Why should you calibrate?
- What is a calibration procedure?
- How do you calibrate a temperature logger?
- Do data loggers need to be calibrated?
- How often should data loggers be calibrated?
- Frequently Asked Questions
What is calibration?
What is the purpose of calibration? It is essential to realise that all measurement instruments have a degree of uncertainty in the measurements they provide. It is impossible to measure any parameter with perfect precision. The question is how much variability in the measurement is acceptable; in other words, the measurement tolerance. Various standards set the acceptable level of uncertainty, and the purpose of calibration is to ensure the data provided by the instrument falls within these limits.

In the case of a temperature data logger, calibration is the process used to ensure that the temperatures recorded by the logger are accurate and conform to the specified calibration standard. Temperature data loggers are used in critical applications such as food storage and medicine, for instance, ensuring that vaccines are stored at the correct temperatures. They must therefore provide accurate readings at all times, and calibration ensures that this is the case.

The temperature data logger calibration procedure involves comparing the readings from the data logger with those provided by a reference instrument - for instance, a high accuracy thermometer that provides very precise readings and is certified regularly by an accredited laboratory. The two instruments record temperatures in the same environment, typically an environmental chamber, over a specified period. If the results differ by more than the allowed tolerance, adjustments are made to the instrument under calibration, or it is replaced.
Calibration Types
We are often asked: what are the different types of calibration? In general, there are two main types: a 'Traceable Calibration Certificate' and a 'UKAS Calibration Certificate'. While in broad terms they are similar, there are distinct differences you should be aware of.

In traceable calibration, the instrument is referenced against a pre-calibrated device, and the error is calculated. The reference instrument has previously been calibrated to UKAS standards. This is a form of second-generation calibration which guarantees that the instrument is accurate within the stipulated standards. If the instrument falls outside these, it is adjusted and retested until it conforms. The instrument is then provided with a certificate stating that it meets the required standard.

Only a UKAS accredited laboratory can carry out a UKAS calibration. Such a laboratory will be certified to ISO/IEC 17025, which specifies requirements for "the competence, impartiality and consistent operation of laboratories" in carrying out tests and calibrations. In this case, a certificate is issued if the instrument passes the calibration standards; if it fails, it is not adjusted and retested but is instead returned uncertified to the organisation that requested the calibration.
Calibration Certificates
A calibration certificate is an official record that an instrument has been calibrated and complies with the stated calibration standards. It provides traceability to a recognised standards body such as NIST (the National Institute of Standards and Technology). Calibration certificates are provided for both types of calibration – traceable calibration and UKAS calibration. They provide information on the instrument's condition and its performance under the specified test conditions and include its serial number. Where an instrument has been adjusted, the certificate will also record its condition when it was received and its condition following adjustment and recalibration. A UKAS calibration certificate will include the UKAS logo. In addition to confirming that an instrument complied with the specified standard when tested, calibration certificates also provide a historical record of the instrument's performance.
Calibration Standards - UKAS, NIST
If you are trying to discover what a calibration standard is, you may be confused by the different calibration standards that exist and, in particular, by the difference between UKAS and NIST. The distinction can be unclear, as all calibration standards can be considered links in a chain.
- UKAS stands for the United Kingdom Accreditation Service and is the UK's government-appointed accreditation body. One of its roles is to assess organisations that provide calibration services and provide accreditation to laboratories that meet ISO 17025 standards. The standard applies only to specifically accredited calibration techniques.
- NIST stands for the National Institute of Standards and Technology, a non-regulatory US government body that has a leading role in advancing measurement science, standards and technology. While US-based, NIST standards are adopted globally. NIST traceable calibration certifies that the laboratory carrying out the calibration can do so to NIST standards, and that products made by that organisation comply with NIST-maintained standards. All measurements taken with a NIST certified instrument must have an unbroken measurement chain leading back to NIST standards, with each link in that chain having known and documented uncertainties.
Difference between precision and accuracy
People often confuse the terms precision and accuracy; however, they mean very different things. To summarise:
- Accuracy refers to how close a measurement is to its correct or true value. It is a measure of systematic errors of the measuring instrument.
- Precision refers to how close multiple measurements are to each other regardless of how accurate they might be. In other words, it relates to the reproducibility of the measurements. You can think of it as a measure of random measurement errors or statistical variability.
Note that any measurement system can be accurate and not precise, or precise and not accurate. However, you can only rely on measurements that are both accurate and precise. We refer to measurements that are both accurate and precise as valid.
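As a rough illustration of the distinction, the sketch below compares two hypothetical sets of logger readings against a known true value. The readings and names are invented for the example: bias (distance of the mean from the true value) reflects accuracy, while the standard deviation of repeated readings reflects precision.

```python
import statistics

# Hypothetical repeated readings from two loggers measuring a
# reference bath held at a true value of 5.0 degrees C.
TRUE_VALUE = 5.0
logger_a = [5.8, 5.9, 5.8, 5.9, 5.8]   # precise but not accurate
logger_b = [4.6, 5.5, 4.9, 5.4, 4.7]   # roughly accurate but not precise

def accuracy_error(readings, true_value):
    """Systematic error (bias): how far the mean sits from the true value."""
    return statistics.mean(readings) - true_value

def precision(readings):
    """Random error: spread of the readings around their own mean."""
    return statistics.stdev(readings)

print(accuracy_error(logger_a, TRUE_VALUE))  # large bias
print(precision(logger_a))                   # small spread
print(accuracy_error(logger_b, TRUE_VALUE))  # small bias
print(precision(logger_b))                   # large spread
```

Logger A would fail an accuracy check despite its repeatable readings, while logger B's mean is close to the truth but individual readings cannot be trusted; only an instrument that scores well on both measures gives valid data.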
Calibration Uncertainty and measurement uncertainty
The uncertainty of measurement can have many causes. For instance, it can arise from the measuring instrument, from what is being measured, from the person carrying out the measurement, the environment, and more. Usually, we estimate the uncertainty by carrying out a statistical analysis of multiple measurements and considering the whole measurement process. A measurement is only meaningful if its degree of uncertainty is also recorded.

It is essential to differentiate between error and uncertainty. While the error is the difference between the actual and measured value, uncertainty is a quantitative statement of the doubt in the measurement result. Statistically, we use the mean value of multiple measurements to estimate the actual value and the standard deviation (spread) of these measurements as an estimate of the uncertainty.

Measurement uncertainty is the overall uncertainty of the measurement, and it includes uncertainties that arise from the causes we have already mentioned. An additional element of the general uncertainty can be attributed to the uncertainty in the calibration of your instrument - its calibration uncertainty. Calibration uncertainty is built into every measurement you make, but it is also quantifiable. Calibration laboratories will always provide information on the calibration uncertainty of their process. It combines several elements:
- The laboratory's Calibration and Measurement Capability (CMC), which is the smallest measurement uncertainty the laboratory can achieve
- The resolution and repeatability of the Unit Under Test (UUT), which is the measurement uncertainty arising from the resolution and repeatability limitations of the instrument itself
These are converted into standard deviations used to calculate the combined uncertainty, which is then expanded to the desired confidence level. While you don't need to understand the detail (it is very complicated), you can use the stated calibration uncertainty as an element in your overall measurement uncertainty.
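As a hedged sketch of how these components combine: each contribution is expressed as a standard uncertainty, combined in quadrature (root-sum-square), and then multiplied by a coverage factor (k = 2 for roughly 95% confidence). All figures below are invented for illustration, not taken from any real certificate.

```python
import math

# Illustrative standard uncertainties, all in degrees C (made-up values).
u_cmc = 0.05                          # laboratory's CMC contribution
u_resolution = 0.1 / math.sqrt(12)    # rectangular distribution for a 0.1 C display resolution
u_repeatability = 0.03                # standard deviation of repeated UUT readings

# Combined standard uncertainty: root-sum-square of the components.
u_combined = math.sqrt(u_cmc**2 + u_resolution**2 + u_repeatability**2)

# Expanded uncertainty at ~95% confidence uses a coverage factor k = 2.
k = 2
U_expanded = k * u_combined
print(round(U_expanded, 3))
```

The certificate's stated calibration uncertainty can then be carried forward as one component when you budget the overall uncertainty of your own measurements.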
How often to calibrate?
Recommended calibration frequency depends on the individual requirements of your project. The more critical your measurements, the shorter your calibration interval should be. It could be monthly, quarterly, annually or even every two years, depending on how often you use the instrument and how confident you need to be in its measured values.
What is calibration error?
Calibration error is the amount of uncertainty in the instrument's output that arises from errors during the calibration procedure. Even the highly accurate instruments used in calibration have errors, but these are known and quantifiable. When you have an instrument calibrated, you will receive a certificate that states the calibration error.
What is calibration slope?
A calibration slope plots the known values on the Y-axis against the measured values on the X-axis. A line is drawn through the data points using linear regression. The equation of the line is y = mx + b, where m is the slope of the line and b is the intercept. We use the line to adjust measurements between set temperature points during calibration.
What is a three-point calibration?
Three-point calibration provides improved accuracy compared to single-point calibration. In the case of a temperature data logger, it involves calibrating at a range of temperatures – high, middle and low - rather than at a single temperature point. The process provides an improved accuracy across the whole range.
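As an illustration, one way to apply a three-point calibration is to record the logger's error at each set point and interpolate linearly between them. The set points and error values below are invented for the example:

```python
# Errors found at three calibration points (invented values):
# (true temperature, logger error = logger reading - true value), in degrees C.
cal_points = [(-20.0, 0.5), (5.0, 0.2), (40.0, -0.3)]

def corrected(reading):
    """Subtract the error, interpolating linearly between calibration points."""
    pts = sorted(cal_points)
    # Clamp to the nearest point outside the calibrated range.
    if reading <= pts[0][0]:
        return reading - pts[0][1]
    if reading >= pts[-1][0]:
        return reading - pts[-1][1]
    for (t0, e0), (t1, e1) in zip(pts, pts[1:]):
        if t0 <= reading <= t1:
            err = e0 + (e1 - e0) * (reading - t0) / (t1 - t0)
            return reading - err

print(corrected(5.0))   # at a calibration point, the known error is removed
print(corrected(22.5))  # between points, the error is interpolated
```

Because the error is characterised at the low, middle and high points, readings anywhere in the range get a correction appropriate to that part of the scale rather than a single global offset.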
Why should you calibrate?
By calibrating an instrument against a known standard you can rely on, such as a high accuracy thermometer, you can gain confidence in the accuracy of the instrument readings. If you fail to calibrate, then you are in danger of recording inaccurate data, which could have profound implications, especially in critical sectors such as pharmaceutical and food storage.
What is a calibration procedure?
A calibration procedure is a document that sets out a certified method for testing and verifying the performance characteristics, tolerances and specifications of a measuring instrument such as a wireless temperature logger. It documents a process of verifying the performance of the instrument under test against its performance specification.
How do you calibrate a temperature logger?
All temperature loggers, including those incorporating Bluetooth for connection to a mobile app, are calibrated against a calibrated reference thermometer. The probes are placed in an environmental chamber. The temperature is adjusted and stabilised. The temperatures provided by the data logger under test are then compared with those indicated by the reference thermometer. If these measurements fall outside the specification, adjustments may be made to bring them back into specification.
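The comparison step can be sketched as follows, assuming an illustrative tolerance of ±0.5 °C and made-up readings; real acceptance limits come from the instrument's specification.

```python
# Compare logger readings with a reference thermometer at each
# stabilised chamber set point. Tolerance and data are invented.
TOLERANCE = 0.5  # degrees C, assumed specification limit

readings = [
    # (set point, reference reading, logger reading)
    (-10.0, -10.02, -9.8),
    (5.0, 5.01, 5.6),
    (25.0, 24.98, 25.3),
]

def check(reference, logger, tolerance=TOLERANCE):
    """True if the logger agrees with the reference within tolerance."""
    return abs(logger - reference) <= tolerance

for set_point, ref, log in readings:
    status = "PASS" if check(ref, log) else "FAIL - adjust or replace"
    print(f"{set_point:6.1f} C: deviation {log - ref:+.2f} C -> {status}")
```

Any set point where the deviation exceeds the tolerance flags the logger for adjustment and retest (or, for a UKAS calibration, return without a certificate).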
Do data loggers need to be calibrated?
Data loggers need to be calibrated regularly to ensure that the accuracy of their readings remains within specified tolerances.
How often should data loggers be calibrated?
Data loggers should be calibrated regularly, but the interval between calibrations will vary with circumstances. While it is usual to calibrate them annually, you may need more frequent calibration if you use them in critical environments. If a data logger has been physically stressed, for instance if it has been dropped, it is a wise precaution to have it calibrated immediately.
Frequently Asked Questions
Calibrating your temperature probes is critically important. All temperature probes can drift over time, leading to inaccurate results. If you work in a safety-critical industry, such as food or medicine, failing to calibrate your probes regularly is potentially dangerous and could even put lives at risk.
To check the accuracy of a digital thermometer, you should compare its readings with those of a NIST certified thermometer. The NIST certification will state the thermometer's accuracy. Compare the readings of both thermometers under a range of test conditions such as ice and water and boiling water. If they give the same values within the required uncertainty limits, then your digital thermometer is accurate.
You should calibrate your digital thermometer at a range of temperatures. While a rough and ready test is to check its reading in a mixture of ice and water, there is no guarantee that the actual temperature is 0 °C; any impurities in the water can make a significant difference. Ideally, you should compare its reading with a UKAS or NIST calibrated thermometer and adjust your thermometer accordingly.
You can do a rough and ready check using an ice bath made from crushed ice and water, but this could vary from 0 °C by one or two degrees, so it's not entirely reliable. If possible, you should check it against a calibrated high accuracy thermometer, especially if you use it in critical situations.
Some digital thermometers are designed to measure specific temperature ranges, such as those used in medical and veterinary settings to check body temperature. When such a thermometer reads a temperature below its range, for instance room temperature, it will display "LOW".
Over time, the temperatures displayed by a digital thermometer can drift from the actual value. This drift is why it is crucial to calibrate digital thermometers to ensure that they continue to provide accurate readings.
Calibration of instruments is very important, especially if they are being used in critical fields. In some instances, it is a legal obligation to have instruments calibrated regularly. When you have an instrument calibrated by a designated laboratory, you will be provided with a calibration certificate that states the instrument's accuracy.
Temperature calibration is carried out to check the accuracy of a digital thermometer, and in many situations, the procedure also adjusts the instrument's output to improve its accuracy. The procedure sets out how to check the instrument's reading against another device that is known to be accurate and has been validated by a certified laboratory.
Questions? Please let us know
There are a lot of temperature monitoring products on the market, and we have designed ours to be as easy to use as possible. If you have any questions about whether Smashtag is best for your application, please get in touch; we'd love to help if we can.