Verification Regulation of Tester for Dial Indicator Gauges (JJG 201-2008)
Verification Regulation of Tester for Dial Indicator Gauges

Basic Information

Standard ID: JJG 201-2008

Standard Name:Verification Regulation of Tester for Dial Indicator Gauges

Chinese Name: 指示类量具检定仪检定规程

Standard category:National Metrology Standard (JJ)

Status: Abolished

Date of Release: 2008-05-23

Date of Implementation:2008-11-23

Date of Expiration:2019-06-25

standard classification number

Standard ICS number:Metrology and measurement, physical phenomena >> 17.040 Length and angle measurement

Standard Classification Number:General>>Measurement>>A52 Length Measurement

associated standards

alternative situation:Replaces JJG 201-1999

Publication information

publishing house:China Metrology Press

ISBN:155026·J-2365

other information

drafter:Zhang Weidong, Chen Yongkang, Cui Xicai, et al.

Drafting unit:Henan Institute of Metrology, China Institute of Metrology, etc.

Focal point unit:National Technical Committee on Geometry and Engineering Parameters Metrology

Publishing department:General Administration of Quality Supervision, Inspection and Quarantine of the People's Republic of China

competent authority:National Technical Committee on Geometry and Engineering Parameters Metrology

Introduction to standards:

JJG 201-2008 Verification Regulation of Testers for Dial Indicator Gauges
This procedure is applicable to the initial calibration, subsequent calibration and in-use inspection of digital indicating gauge calibrators with a resolution of no more than 0.5μm and a measuring range of no more than 50mm and mechanical indicating gauge calibrators with a graduation value of no more than 10μm and a measuring range of no more than 25mm.

This procedure refers to the following documents:
JJF1001—1998 General metrological terms and definitions
JJF1059—1999 Evaluation and expression of measurement uncertainty
JJF1094—2002 Evaluation of measuring instrument characteristics
When using this procedure, attention should be paid to the use of the current valid versions of the above-mentioned referenced documents.
1 Scope
2 References
3 Overview
4 Metrological performance requirements
4.1 Relative position of the differential cylinder end face and the fixed sleeve
4.2 Distance between the scale-line surface of the fixed sleeve and the upper edge of the differential cylinder bevel
4.3 Surface roughness of the measuring surface of the measuring rod
4.4 Flatness of the measuring surface of the measuring rod
4.5 Perpendicularity between the measuring surface of the measuring rod and the axis of rotation of the measuring rod
4.6 Perpendicularity and position of the cylinder axis relative to the measuring-rod axis
4.7 Repeatability
4.8 Indication error
4.9 Return error
5 General technical requirements
5.1 Appearance
5.2 Interaction of various parts
6 Control of measuring instruments
6.1 Verification conditions
6.2 Verification items and main verification instruments
6.3 Verification methods
6.4 Processing of verification results
6.5 Verification cycle
Appendix A Evaluation of the uncertainty of the measurement result of the indication error of an indicating gauge calibrator
Appendix B Data processing example of the indication error and return error of the mechanical dial indicator calibrator
Appendix C Data processing example of the indication error and return error of the mechanical micrometer calibrator
Appendix D Data processing example of the indication error and return error of the CNC dial indicator calibrator
Appendix E Data processing example of the indication error of the grating indicator calibrator
Appendix F Technical diagrams and requirements for the tools required for verifying indicating gauge calibrators
Appendix G Format of the inner pages of the verification certificate and the notice of verification result
Excerpt of the standard content:

National Metrology Verification Regulation of the People's Republic of China

JJG 201-2008

Verification Regulation of Tester for Dial Indicator Gauges

Replaces JJG 201-1999 and JJG 271-1996

Promulgated on 2008-05-23; implemented on 2008-11-23
Promulgated by the General Administration of Quality Supervision, Inspection and Quarantine of the People's Republic of China

This regulation was approved by the General Administration of Quality Supervision, Inspection and Quarantine of the People's Republic of China on May 23, 2008, and came into effect on November 23, 2008.

Responsible unit: National Technical Committee on Metrology of Geometric Quantities and Engineering Parameters
Main drafting units: Henan Institute of Metrology; China Institute of Metrology and Testing; Luoyang Quality and Technical Supervision Inspection and Testing Center
The National Technical Committee on Metrology of Geometric Quantities and Engineering Parameters is entrusted with the interpretation of this regulation.
Main drafters: Zhang Weidong (Henan Institute of Metrology), Chen Yongkang (China Institute of Metrology and Testing), Cui Xicai (Luoyang Quality and Technical Supervision Inspection and Testing Center)
Participating drafter: Jia Xiaojie (Henan Institute of Metrology)
1 Scope
This regulation applies to the initial verification, subsequent verification and in-use inspection of digital indicating gauge calibrators with a resolution of not more than 0.5 μm and a measuring range of not more than 50 mm, and of mechanical indicating gauge calibrators with a graduation value of not more than 10 μm and a measuring range of not more than 25 mm.
2 References
This regulation refers to the following documents:
JJF 1001-1998 General metrological terms and definitions
JJF 1059-1999 Evaluation and expression of measurement uncertainty
JJF 1094-2002 Evaluation of measuring instrument characteristics
When using this regulation, attention should be paid to using the current valid versions of the above-mentioned referenced documents.

3 Overview
An indicating gauge calibrator (hereinafter referred to as the calibrator) is a measuring instrument that uses visual aiming and a grating sensor or a precision lead screw as its measurement standard. It is mainly used to verify the indication error and return error of dial indicators (pointer and digital-display types), lever-type indicators and bore gauges, as well as large-range micrometer heads.

Calibrators are divided into vertical and horizontal types by structure; into digital types (including CNC and grating types, see Figures 1 and 2) and mechanical drum types (see Figure 3) by reading method; into manual, semi-automatic and fully automatic types by operating method; and into micrometer calibrators, dial gauge calibrators and indicating gauge calibrators by instrument function.

Figure 1 Schematic diagram of a vertical grating indicating gauge calibrator
1-base; 2-micrometer head seat; 3-gauge under test; 4-column; 5-base; 6-measuring rod; 7-handwheel

4 Metrological performance requirements
Figure 2 Schematic diagram of CNC indicating gauge calibrator
1-base; 2-gauge under test; 3-base; 4-measuring rod; 5-micrometer head seat; 6-connecting rod; 7-stepping motor; 8-display and printing controller

Figure 3 Schematic diagram of a mechanical micrometer calibrator
1-base; 2-probe; 3-gauge under test; 4-table stand; 5-differential cylinder; 6-fixed sleeve

4.1 Relative position of the differential cylinder end face and the fixed sleeve
When the zero scale line on the differential cylinder of a mechanical indicating gauge calibrator is aligned with the longitudinal scale line of the fixed sleeve, the offset of the differential cylinder end face from the millimetre scale line of the fixed sleeve shall be not more than 5 scale divisions when the line is covered, and not more than 10 scale divisions when the line is exposed.

4.2 Distance between the scale-line surface of the fixed sleeve and the upper edge of the differential cylinder bevel
The distance between the scale-line surface of the fixed sleeve and the upper edge of the differential cylinder bevel of a mechanical indicating gauge calibrator shall be not more than 0.4 mm.
4.3 Surface roughness of the measuring surface of the measuring rod
The surface roughness of the measuring surface of the measuring rod shall be not more than Ra 0.1 μm.

4.4 Flatness of the measuring surface of the measuring rod
The flatness of the measuring surface of the measuring rod shall not exceed 1 μm; only convexity at the centre is allowed, and edge fall-off is permitted within 0.2 mm of the edge.

4.5 Perpendicularity between the measuring surface of the measuring rod and the axis of rotation of the measuring rod
The perpendicularity between the measuring surface of the measuring rod of a dial indicator calibrator and the axis of rotation of the measuring rod shall not exceed 1′3″.

4.6 Perpendicularity and position of the cylinder axis relative to the measuring-rod axis
When the column V-block seat clamps cylinders of different diameters, the perpendicularity and position of the cylinder axis relative to the measuring-rod axis, within the 50 mm clamping length, shall not be greater than 0.1 mm.

4.7 Repeatability
The repeatability of the grating indicating gauge calibrator and the CNC dial indicator calibrator shall not exceed 0.1 μm.

4.8 Indication error
The indication error requirements are shown in Table 1.
Table 1 Indication error requirements
(The table gives the maximum allowable error, in μm, for the micrometer calibrator, the dial indicator calibrator (mechanical and CNC types) and the grating indicator calibrator over their measuring ranges; the limits are specified separately for the full range and for any 1 mm, 2 mm, 5 mm, 10 mm, 25 mm, 30 mm and 50 mm sub-ranges. The numerical limits are given in the complete standard.)

4.9 Return error
4.9.1 The return error of the grating indicator calibrator shall not exceed 0.5 μm.
4.9.2 The return error of the micrometer calibrator shall not exceed 0.5 μm.
4.9.3 The return error of the dial gauge calibrator shall not exceed 1 μm.

5 General technical requirements
5.1 Appearance
5.1.1 The coating and plating on each part of the instrument shall be smooth and uniform in colour, without spots or peeling. The working surfaces of the instrument and its accessories shall be free of rust, bruises, burrs and other defects.
5.1.2 The graduation lines of mechanical indicating gauge calibrators shall be clear and complete; the digital display of digital calibrators shall be complete and clear; the letters and symbols on buttons and knobs shall be complete and clear; the illumination of the image-recognition system shall be uniform in brightness, with clear optical imaging and complete images.
5.1.3 The instrument shall be marked with the manufacturer's name (or trademark), model and serial number.
5.1.4 Instruments undergoing subsequent verification or in-use inspection are permitted to have the above defects, provided they do not affect the metrological performance.

5.2 Interaction of various parts
5.2.1 The moving parts of the instrument shall move smoothly and flexibly without sticking, and the locking parts shall act reliably.
5.2.2 The display and printing systems of digital indicating gauge calibrators shall function normally.

6 Control of measuring instruments
Control of measuring instruments includes initial verification, subsequent verification and in-use inspection.

6.1 Verification conditions
6.1.1 Environmental conditions
6.1.1.1 When verifying mechanical and CNC calibrators, the laboratory temperature shall be (20±5) °C with a temperature change of not more than 1 °C/h. When verifying grating calibrators, the laboratory temperature shall be (20±2) °C with a temperature change of less than 1 °C/h.
6.1.1.2 The relative humidity in the verification room shall be not more than 70%.
6.1.1.3 The temperature-equalization time of the instrument under test and the verification equipment in the verification room shall be not less than 8 h.

6.2 Verification items and main verification instruments
The verification items and the main verification instruments are listed in Table 2.

Table 2 Verification items and main verification instruments of the calibrator
Appearance: visual inspection
Interaction of the parts: visual inspection and manual test
Relative position of the differential cylinder end face and the fixed sleeve: visual inspection and manual test
Distance between the scale-line surface of the fixed sleeve and the upper edge of the differential cylinder bevel: feeler gauge (MPE: ±12 μm)
Surface roughness of the measuring surface of the measuring rod: surface roughness comparison specimen (MPE: -17% to +12%)
Flatness of the measuring surface of the measuring rod: grade 2 optical flat
Perpendicularity between the measuring surface of the measuring rod and the axis of rotation of the measuring rod: 1″ autocollimator
Perpendicularity and position of the cylinder axis relative to the measuring-rod axis when the column V-block seat clamps cylinders of different diameters: grade 1 standard micrometer; grade 2 square; special spindles
Repeatability: inductive micrometer with a resolution of 0.1 μm
Indication error and return error: grade 3 or grade 4 gauge blocks; inductive micrometer with a resolution of 0.01 μm; torsion-spring comparator with a graduation value of 1 μm
Note: in the original table, one symbol marks the items that shall be verified at initial verification, subsequent verification or in-use inspection, and another marks the items that may be omitted.
6.3 Verification method
6.3.1 Appearance
Visual observation.
6.3.2 Interaction of various parts
Visual observation and manual test.
6.3.3 Relative position of the differential cylinder end face and the fixed sleeve
Visual observation and manual test.
6.3.4 Distance between the scale-line surface of the fixed sleeve and the upper edge of the differential cylinder bevel
Measured with a feeler gauge.
6.3.5 Surface roughness of the measuring surface of the measuring rod
Measured by the comparison method using a surface roughness comparison specimen.

6.3.6 Flatness of the measuring surface of the measuring rod
Measured by the light-wave interference method using an optical flat.

6.3.7 Perpendicularity between the measuring surface of the measuring rod and the axis of rotation of the measuring rod
Adjust the autocollimator and the measuring surface of the measuring rod so that the reflected image of the measuring surface lies in the field of view of the autocollimator. Rotate the measuring rod over the entire measuring range, find the positions of the minimum and maximum readings of the reflected image in the field of view, take the readings on the autocollimator, and take half of the difference between the two readings as the verification result.
6.3.8 Perpendicularity and position of the cylinder axis relative to the measuring-rod axis when the column V-block seat clamps cylinders of different diameters
For calibrators with a measuring-rod diameter of 6.5 mm, measure with spindles No. 1 and No. 2 of Table F.1 in Appendix F; for calibrators with a measuring-rod diameter of 8 mm, measure with spindles No. 3 and No. 4 of Table F.1 in Appendix F.
When measuring perpendicularity, clamp the special spindle in the column V-block seat and measure with a square and a feeler gauge at the upper and lower positions respectively (see Figure 4).

Figure 4 Test positions for the perpendicularity of the cylinder axis to the measuring-rod axis
1-cylinder; 2-square; 3-measuring rod of the calibrator

When measuring position, measure with the standard micrometer at the positions shown in Figure 5; the difference between the measured value and the cylinder diameter is the verification result.

Figure 5 Test positions for the position of the cylinder axis relative to the measuring-rod axis
1-cylinder; 2-standard micrometer probe; 3-measuring rod of the calibrator
6.3.9 Repeatability
Install the inductive micrometer in the hole of the base (or bracket) of the grating indicating gauge calibrator or CNC dial indicator calibrator under test and adjust its probe so that it contacts the measuring surface of the calibrator's measuring rod. Move the calibrator's measuring rod 10 times in the same direction so that the inductive micrometer displays the same value each time, record the calibrator's displayed value for each of the 10 repetitions, and take 1/3.08 of the maximum variation of these displayed values as the verification result.
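As a worked illustration of this calculation (a sketch, not part of the regulation; the readings and the function name are hypothetical), the divisor 3.08 converts the range of the 10 repeated readings into the verification result:

```python
# Illustrative sketch only; the displayed values below are hypothetical.
def repeatability(readings_um, divisor=3.08):
    """Return 1/3.08 of the maximum variation (range) of the repeated readings."""
    return (max(readings_um) - min(readings_um)) / divisor

# Ten displayed values of the calibrator, in micrometres, taken while the measuring
# rod is repeatedly returned to the same position (inductive micrometer at the same value).
readings = [10.02, 10.05, 9.98, 10.01, 10.03, 9.99, 10.04, 10.00, 10.02, 10.01]
print(f"repeatability = {repeatability(readings):.3f} um")  # (10.05 - 9.98) / 3.08 = 0.023 um
```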
6.3.10 Indication error
6.3.10.1 The micrometer calibrator and the grating indicator calibrator shall be verified with grade 3 gauge blocks and an indicator whose indication error does not exceed ±0.1 μm (for example an inductive micrometer with a resolution of 0.01 μm). The dial indicator calibrator (including mechanical and CNC types) shall be verified with grade 4 gauge blocks and an indicator whose indication error does not exceed ±1 μm (for example a torsion-spring comparator with a graduation value of 1 μm). The measuring sections and measuring intervals for the various types of indicating gauge calibrators are given in Table 3.
Table 3 Measuring sections and measuring intervals for the various types of indicating gauge calibrators
(The table specifies, for the mechanical micrometer calibrator, the mechanical dial indicator calibrator, the grating indicating gauge calibrator and the CNC dial indicator calibrator, the measuring sections to be verified (the sections 0 mm to 2 mm, 0 mm to 5 mm, 0 mm to 10 mm, 0 mm to 25 mm and 0 mm to 50 mm, together with selected 1 mm, 2 mm and 10 mm sections) and the corresponding measuring intervals (0.2 mm, 1 mm, 2 mm or 10 mm) or discrete check points (4.12 mm, 9.21 mm, 14.36 mm, 20.5 mm and 24 mm, and for the larger ranges also 29 mm, 39 mm and 49 mm). The pairing of sections and intervals for each calibrator type is given in the complete standard.)
6.3.10.2 Method for verifying the indication error of the grating indicator calibrator and the micrometer calibrator
Before measuring, install the three-ball worktable (see Figure F.2 in Appendix F) on the measuring rod, install the indicator probe in the bracket hole, and adjust the bracket so that the probe is aligned with the centre of the three-ball worktable. Set the calibrator to the starting position of the selected measuring section, and verify the measuring sections in order from large to small. For the verification of a subdivided measuring section, the part of the preceding measuring section with the larger error should be selected.
When measuring, first place a 1 mm gauge block on the three-ball worktable, adjust the indicator probe to contact the gauge block and set its indication to zero. Then replace the gauge block according to the selected measuring interval (see Table 3), rotate the handwheel or differential cylinder of the calibrator to the checked point, and read the error value of each point on the indicator. This measurement shall be carried out on both the forward and the reverse stroke of the selected measuring section. After the forward stroke of each measuring section has been measured, the calibrator shall be moved a further 10 divisions in the forward direction before measuring in the reverse direction. During the measurement no adjustment shall be made and the direction of motion of the measuring rod shall not be changed. If the available gauge-block sizes do not allow the selected measuring section to be covered in a single sequence, it shall be measured in sub-sections and the error values combined.

6.3.10.3 Method for verifying the indication error of the dial indicator calibrator (mechanical and CNC types)
Before measuring, install the indicator probe in the hole of the dial base, set the calibrator to the starting position of the selected measuring section, and verify the measuring sections in order from large to small. For the verification of a subdivided measuring section, the part of the preceding measuring section with the larger error should be selected.
When measuring, clamp a 1 mm gauge block between the indicator probe and the measuring surface of the calibrator's measuring rod, set the indicator to zero, change the gauge blocks in turn according to the measuring interval (see Table 3), rotate the calibrator's handwheel or differential cylinder to the checked point, and read the error value of each point on the indicator. The measurement shall be carried out on both the forward and the reverse stroke of the selected measuring section. After the forward stroke of each measuring section has been measured, the calibrator shall be moved a further 10 divisions in the forward direction before measuring in the reverse direction. During the measurement no adjustment shall be made and the direction of motion of the measuring rod shall not be changed.

Processing of the verification data
The indication error of the calibrator is the difference between the maximum and the minimum of the indication errors of the checked points on the forward and reverse strokes of each measuring section.
In the calculation, the indication error of each checked point is calculated by the following formula:
e = a + (ΔL0 - ΔL)
where:
e is the indication error of the checked point on the given stroke, μm;
a is the reading of the indicator at the checked point, μm;
ΔL0 is the deviation of the centre length of the gauge block used for zero setting, μm;
ΔL is the deviation of the centre length of the gauge block used at the checked point, μm.
For details of the data processing, see Appendices B, C, D and E.
Other methods whose measurement uncertainty does not exceed 1/3 of the maximum permissible indication error of the calibrator may also be used.
6.3.11 Return error
From the data obtained in 6.3.10, take for each checked point in each measuring section the difference between the two readings on the forward and reverse strokes; its absolute value is the return error of that checked point. The maximum of the return errors of all checked points is taken as the verification result.
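To make the data processing of 6.3.10 and 6.3.11 concrete, the sketch below uses hypothetical readings and gauge-block deviations (the data and names are illustrative, not taken from the standard): each per-point error is e = a + (ΔL0 - ΔL), the indication error of the measuring section is the maximum minus the minimum of these errors over the forward and reverse strokes, and the return error is the largest absolute forward/reverse difference.

```python
# Illustrative sketch of the data processing in 6.3.10 and 6.3.11; all values are hypothetical.
dL0 = 0.05  # deviation of the centre length of the zero-setting gauge block, um

# For each checked point: (reading on the forward stroke, reading on the reverse stroke,
#                          deviation of the centre length of the gauge block used), all in um.
points = [
    (0.3, 0.5, 0.02),
    (0.6, 0.9, -0.03),
    (0.2, 0.4, 0.01),
]

forward = [a + (dL0 - dL) for a, _, dL in points]  # e = a + (dL0 - dL) on the forward stroke
reverse = [a + (dL0 - dL) for _, a, dL in points]  # the same formula on the reverse stroke

indication_error = max(forward + reverse) - min(forward + reverse)
return_error = max(abs(f - r) for f, r in zip(forward, reverse))

print(f"indication error of the section = {indication_error:.2f} um")  # 0.74 um
print(f"return error = {return_error:.2f} um")                          # 0.30 um
```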
6.4 Processing of verification results
A calibrator that meets the requirements of this regulation after verification shall be issued a verification certificate. A calibrator that does not meet the requirements shall be issued a notice of verification result in which the non-conforming items are indicated.

6.5 Verification cycle
The verification cycle of the calibrator shall generally not exceed 1 year.

Appendix A
Evaluation of the uncertainty of the measurement result of the indication error of an indicating gauge calibrator

A.1 Measurement method
In accordance with this regulation, the indication error of a grating indicating gauge calibrator is verified by the comparison method, using an indicator (such as an inductive micrometer with a resolution of 0.01 μm) and grade 3 gauge blocks. The result is expressed as the difference between the maximum and the minimum indication error over the entire measuring range of the instrument.

A.2 Mathematical model
y = e_max - e_min    (A.1)
where:
y is the indication error of the instrument under test, μm;
e_max, e_min are the maximum and the minimum of the indication errors of the checked points, μm.

e_i = a_i + (ΔL_0 - ΔL_i)    (A.2)
where:
e_i is the indication error of the i-th checked point on the given stroke, μm;
a_i is the reading of the indicator at the i-th checked point, μm;
ΔL_0 is the deviation of the centre length of the gauge block used for zero setting, μm;
ΔL_i is the deviation of the centre length of the gauge block used at the i-th checked point, μm.

Taking the influence of the laboratory temperature into account, formula (A.2) can be rewritten as
e = L_m - (a + L_b - L_0) + L_m·α_m·Δt_m - (L_b - L_0)·α_b·Δt_b
where:
L_m is the indication of the instrument under test at the checked point, mm;
a is the reading of the inductive micrometer at the checked point, μm;
L_b, L_0 are the actual sizes of the gauge blocks used at the checked point and for zero setting, mm;
α_m, α_b are the coefficients of thermal expansion of the instrument under test and of the gauge blocks, °C⁻¹;
Δt_m, Δt_b are the deviations of the temperatures of the instrument under test and of the gauge blocks from 20 °C, °C.

Setting δ_α = α_m - α_b, δ_t = Δt_m - Δt_b, Δt ≈ Δt_m ≈ Δt_b, α ≈ α_b and L = L_b - L_0 ≈ L_m, the model becomes
e = L_m - (a + L_b - L_0) + L·Δt·δ_α + L·α·δ_t

A.3 Variance and sensitivity coefficients
u_c²(y) = u²(L_m) + u²(a) + u²(L_b) + u²(L_0) + c_δα²·u²(δ_α) + c_δt²·u²(δ_t)
where the sensitivity coefficients of the temperature terms are
c_δα = L·Δt and c_δt = L·α

A.4 Calculation of the standard uncertainties
A.4.1 Uncertainty component u(L_m) introduced by the quantization error of the instrument under test
Since the resolution of the instrument under test is larger than the measurement repeatability, the quantization error of the instrument under test is taken as the uncertainty component it introduces. The resolution of the instrument under test is 0.2 μm, so its quantization error is 0.1 μm. Assuming a uniform distribution over the interval -0.1 μm to +0.1 μm:
u(L_m) = 0.1 μm / √3 ≈ 0.058 μm
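As a quick numerical check of this component (a sketch, not part of the appendix; it simply applies the uniform-distribution formula u = a/√3 to the half-width a = 0.1 μm):

```python
# Sketch: standard uncertainty of a quantization error assumed uniform over +/-0.1 um.
import math

half_width_um = 0.1                      # half of the 0.2 um resolution of the instrument under test
u_Lm = half_width_um / math.sqrt(3)      # standard uncertainty of a uniform distribution
print(f"u(Lm) = {u_Lm:.3f} um")          # prints ~0.058 um
```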
Note: only part of the complete standard is reproduced above; the full text is available in the official publication.