Instrument calibration serves the following purposes.
1. It ensures that readings from a specific instrument are consistent with other measurements.
2. It determines the accuracy of the instrument's readings.
3. It establishes the instrument's reliability.
In most large-scale manufacturing plants, instrument calibration is performed by companies that provide calibration services. Companies such as OQ International of Illinois specialize in instrument calibration services for a wide range of industry and sector applications.
Tolerance and Calibration Intervals
The precise mechanisms used to assign tolerance values differ by country and by industry. Instrument manufacturers generally assign the measurement tolerance, suggest a calibration interval, and specify environmental ranges for storage and use. The manufacturer may also recommend the actual calibration interval, which depends on how heavily the particular measuring instrument is likely to be used.
Defining the Calibration Process
The next step is to define the calibration process. Selecting a standard or standards is the most visible part of the process. Ideally, the standard has less than a quarter (1/4) of the measurement uncertainty of the device being calibrated. When this goal is met, the accumulated measurement uncertainty of all the standards involved is considered insignificant when the final measurement is also made with a 4:1 ratio.
Maintaining a 4:1 accuracy ratio is difficult with modern equipment: the test equipment being calibrated can be just as accurate as the working standard. When the accuracy ratio falls below 4:1, the calibration tolerance is made smaller to compensate. At a 1:1 ratio, only an exact match between the device being calibrated and the standard constitutes a completely correct calibration. Another common way to deal with this capability mismatch is to reduce the stated accuracy of the device being calibrated.
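The accuracy-ratio check described above can be sketched as follows; the function and variable names are illustrative, not from any calibration standard or library:

```python
def accuracy_ratio(standard_uncertainty_pct: float, device_tolerance_pct: float) -> float:
    """Ratio of the device's tolerance to the calibration standard's uncertainty."""
    return device_tolerance_pct / standard_uncertainty_pct

# A 1% standard against a 4% device tolerance meets the 4:1 guideline.
print(accuracy_ratio(1.0, 4.0))  # 4.0
# Against a 3% device tolerance, the same standard falls short of 4:1.
print(accuracy_ratio(1.0, 3.0))  # 3.0
```

When the ratio falls below 4:1, the compensations described above (tightening the calibration tolerance, or derating the device) come into play.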
An Example of Instrument Calibration
Here is an example. A gage with a manufacturer-stated accuracy of 3% can be derated to 4% so that a 1% accuracy standard can be used at 4:1. If the gage is used in an application that requires 16% accuracy, reducing its accuracy to 4% has no effect on the accuracy of the final measurements. This is known as a limited calibration.
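The arithmetic behind this limited calibration can be checked directly (the percentages below are the example's figures, and the variable names are illustrative):

```python
standard_pct = 1.0   # uncertainty of the calibration standard
gage_pct = 4.0       # gage tolerance after derating from the stated 3%
required_pct = 16.0  # accuracy required by the application

# 4:1 between the standard and the gage, and 4:1 between the gage and the
# final measurement, so derating the gage costs nothing in the final result.
print(gage_pct / standard_pct)   # 4.0
print(required_pct / gage_pct)   # 4.0
```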
If the final measurements require 10% accuracy, then the 3% gage can never be better than 3.3:1. In that case, the better solution is to adjust the gage's calibration tolerance. If calibration is performed at 100 units, the 1% standard would actually read anywhere between 99 and 101 units. The acceptable calibration values, with the test equipment at a 4:1 accuracy ratio, would be 96 to 104 units, inclusive.
Changing the acceptable range to 97 to 103 units would remove the potential contribution of all of the standards and preserve a 3.3:1 ratio. A further change of the acceptable range, to 98 to 102 units, restores more than a 4:1 final ratio.
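The acceptance ranges in this example follow from simple percentage bands around the 100-unit calibration point. A minimal sketch, with an illustrative helper function not taken from any calibration standard:

```python
def acceptance_band(nominal: float, tolerance_pct: float) -> tuple:
    """Acceptable reading range for a nominal value and a tolerance in percent."""
    delta = nominal * tolerance_pct / 100.0
    return (nominal - delta, nominal + delta)

print(acceptance_band(100, 4))  # (96.0, 104.0), 4:1 against the 1% standard
print(acceptance_band(100, 3))  # (97.0, 103.0), standard's 1% contribution removed
print(acceptance_band(100, 2))  # (98.0, 102.0), better than 4:1 in the final result
```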
The precise connection between the device being calibrated and the standard can also affect the calibration. An example is electronic calibration involving analog phenomena, where the impedance of the cable connections can directly influence the result.
In conclusion, this is the general process of calibrating instruments, together with the three main purposes of instrument calibration.