How often do I need to re-calibrate my nuclear gauge?

Practically never!

Modern radiometric systems offer a high degree of user-friendliness and do not require recalibration during regular operation.
Do you need to certify that your instrument is operating according to specifications? We are happy to support you and confirm the calibration with a certificate.

Contact our experts

Information and facts about calibrations

Nuclear gauges work on the principle of attenuation. Essentially, all matter interacts with γ-radiation and has an attenuating effect. From a process-control perspective, this includes not only the medium to be measured, but also the steel walls of the vessel, any internal structures, insulation, framework, etc. Hence, a factory calibration is not feasible. Typically, the system is calibrated under known process conditions, e.g., a full and an empty vessel for a continuous level measurement, or at least one known density of the fluid in the pipeline for a density system.
But despite the greatest care during calibration, a systematic error often remains, or is inherent in the calibration procedure itself.
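
To make the attenuation principle concrete, here is a minimal sketch of the exponential attenuation law, I = I0 · e^(−μρd), applied to each layer in the beam path. All numbers (source rate, mass attenuation coefficient, wall and vessel dimensions) are illustrative assumptions, not values from a real gauge; the point is simply that the vessel walls alone attenuate the beam substantially, which is why the full and empty count rates must be measured on site rather than in the factory.

```python
import math

# Exponential attenuation law: I = I0 * exp(-mu_rho * rho * d), summed over
# every layer in the beam path. All values below are illustrative assumptions.

def transmitted_rate(i0, layers):
    """Count rate after the beam passes layers of (mu_rho, rho, d):
    mass attenuation coefficient in cm^2/g, density in g/cm^3, thickness in cm."""
    exponent = sum(mu_rho * rho * d for mu_rho, rho, d in layers)
    return i0 * math.exp(-exponent)

i0 = 10_000.0               # unattenuated count rate in counts/s (assumed)
mu = 0.06                   # rough mass attenuation coefficient near 662 keV (assumed)
steel = (mu, 7.85, 1.0)     # 1 cm steel vessel wall
product = (mu, 1.0, 100.0)  # 1 m of water-like product

empty_rate = transmitted_rate(i0, [steel, steel])
full_rate = transmitted_rate(i0, [steel, product, steel])
print(f"empty vessel: {empty_rate:.0f} counts/s, full vessel: {full_rate:.1f} counts/s")
```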

In cases where a rod detector is used together with a point source, the correlation between count rate and filling level is not linear (longer radiation path and larger effective wall thicknesses at the lower end of the measurement range). Fig. 8 illustrates this effect. If the measurement device uses a straight line (blue) as its calibration, this yields systematically reduced accuracy while maintaining full reproducibility. In other words, the system will always indicate the same level under the same process conditions, but with a deviation from the real level (red). A multi-point calibration can be used to linearize the curve, reducing the systematic error and thereby improving the accuracy of the measurement (green). Nevertheless, the reduced accuracy due to the imperfect linearity of the measurement is often negligible, and the high reproducibility of the measurement is what matters.
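The following sketch contrasts the two approaches using a hypothetical calibration table (all count rates are assumed values for illustration only): a straight line between the empty and full count rates reproduces the systematic deviation, while piecewise-linear interpolation over several calibration points removes most of it.

```python
import numpy as np

# Hypothetical calibration table for a rod detector with a point source;
# the count rates are assumed values chosen to be deliberately non-linear.
levels = np.array([0.0, 25.0, 50.0, 75.0, 100.0])           # filling level in %
rates  = np.array([8000.0, 5200.0, 3100.0, 1600.0, 600.0])  # counts/s

def level_two_point(rate):
    """Straight-line calibration between empty (0 %) and full (100 %) only."""
    return 100.0 * (rates[0] - rate) / (rates[0] - rates[-1])

def level_multi_point(rate):
    """Piecewise-linear interpolation over all calibration points."""
    # np.interp expects ascending x values, hence the reversed arrays.
    return float(np.interp(rate, rates[::-1], levels[::-1]))

for r in (5200.0, 3100.0, 1600.0):
    print(f"{r:6.0f} counts/s -> two-point: {level_two_point(r):5.1f} %, "
          f"multi-point: {level_multi_point(r):5.1f} %")
```

At the same count rate, the two-point line always returns the same (wrong) level, which is exactly the reproducible systematic error described above.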


Typical sources of error are:

  • Calibration under the wrong conditions: Calibration should be performed as close as possible to real process conditions. Different temperatures, gas densities, running agitators, spray nozzles, etc. can affect the attenuation. If conditions during calibration are different from those during the actual process, this will result in a systematic error.
  • Using an incorrect reference: calibrating a system assuming an empty vessel (0 % level), although a quantity of product remains in the vessel.
  • Unrepresentative samples: for density measurements, the samples used to determine the process value in the calibration table do not represent the real conditions.
  • Too short a measuring time: recording calibration count rates over too short a measuring time increases statistical scatter; see also the section "Statistical errors" and the sketch after this list.
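
Because registered counts follow Poisson counting statistics, the relative uncertainty of a calibration point scales as 1/√N with the number of counts N. A short sketch (with an assumed count rate) shows how the measuring time drives this error down:

```python
import math

# Poisson counting statistics: with N registered counts, the relative
# standard deviation of a calibration point is 1/sqrt(N).
rate = 1000.0  # count rate in counts/s (assumed)

for t in (1, 10, 100):  # measuring time in seconds
    n = rate * t
    print(f"t = {t:3d} s -> N = {n:8.0f} counts, relative error = {1/math.sqrt(n):.1%}")
```

A hundredfold longer measuring time thus reduces the relative error tenfold, which is why calibration count rates should be recorded over a sufficiently long averaging period.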

Fig. 8: Calibration curve