Calibration Specification for Length Measuring Machine

Basic Information

Standard ID: JJF 1066-2000

Standard Name: Calibration Specification for Length Measuring Machine

Chinese Name: 测长机校准规范

Standard Category: National Metrology Standard (JJ)

Status: In force

Date of Release: 2000-05-08

Date of Implementation:2001-01-01

Standard Classification Number

Standard Classification Number: General >> Measurement >> A52 Length Measurement

Associated Standards

Replaces: JJG 54-1984

Publication Information

Publishing House: China Metrology Press

ISBN:155026-1114

Publication date:2004-04-22

Other Information

Drafters: Zhang Yuwen, Chen Zhaoju, et al.

Drafting unit:Aviation Industry Research Institute No. 304

Focal point unit:National Technical Committee on Geometric Length Metrology

Publishing department:State Administration of Quality and Technical Supervision

Standard introduction:

This specification is applicable to the calibration of length measuring machines with a measuring range of up to 6000 mm.


Excerpted standard content:

National Metrology Technical Specification of the People's Republic of China
JJF 1066—2000
Calibration Specification for Length Measuring Machine
Issued 2000-05-08    Implemented 2000-10-01
Issued by the State Administration of Quality and Technical Supervision
JJF 1066—2000
Calibration Specification for Length Measuring Machine
Replaces JJG 54—1984
This specification was approved by the State Administration of Quality and Technical Supervision on May 8, 2000, and came into effect on October 1, 2000.
Responsible unit: National Technical Committee for Geometric Length Metrology
Drafting unit: Aviation Industry No. 304 Research Institute
The National Technical Committee for Geometric Length Metrology is entrusted with the interpretation of this specification.
Main drafters:
Zhang Yuwen (Aviation Industry No. 304 Research Institute)
Chen Zhaoju (Aviation Industry No. 304 Research Institute)
Participating drafter:
Zhou Zili (Aviation Industry No. 304 Research Institute)
Contents
1 Scope
2 Referenced documents
3 Overview
4 Metrological characteristics
5 Calibration conditions
6 Calibration items and calibration methods
7 Handling of calibration results
8 Recalibration time interval
Appendix A  Format of the inner pages of the calibration certificate
1 Scope
This specification is applicable to the calibration of length measuring machines with a measuring range of up to 6000 mm.
2 Cited documents
The provisions contained in the following documents constitute provisions of this specification through reference. At the time of publication of this specification, the editions indicated were valid. All documents are subject to revision, and parties using this specification should explore the possibility of applying the latest editions of the documents listed below.
JB/T 7400—1994 Length measuring machines
JJG 894—1995 Verification regulation of standard ring gauges
3 Overview
The length measuring machine is an optical-mechanical instrument for length measurement. It is used to measure precision measuring tools (such as calibration rods) directly, to calibrate lower-class gauge blocks against higher-class gauge blocks, and to measure other precision parts by the comparison method. The graduation value of the length measuring machine is 0.001 mm, and the measuring ranges are (0~1000), (0~2000), (0~3000) and (0~6000) mm.
4 Metrological characteristics
4.1 Appearance and interaction of parts
4.1.1 The manufacturer's name or trademark, the serial number and the MC mark should be marked on the instrument.
4.1.2 There should be no defects such as rust, bruises or obvious scratches on the working surfaces; there should be no defects affecting appearance or quality on the non-working surfaces.
4.1.3 All moving parts should work smoothly, without jamming or jumping; the tightening screws should act effectively.
4.1.4 All scales and index lines in the instrument's field of view should be parallel and free of parallax, and the millimeter scale should be parallel to the moving direction of the measuring seat without obvious deflection.
4.1.5 The scale lines should not be broken, blurred or thickened by more than half the line width.
4.1.6 An instrument in use should have no appearance defects that affect the accuracy of use.
4.2 The angular runout of the measuring seat or tailstock moving along the base guide rail shall not exceed 10″ in both the horizontal and vertical planes.
4.3 The coaxiality of the measuring axis and the tail tube measuring axis shall not exceed 0.2 mm.
4.4 The adjustment error of the tail tube measuring rod adjustment component shall not exceed 0.3 μm.
4.5 Reliability of workbench adjustment
Using a spherical measuring cap, the variation shall not exceed 0.2 μm; using a φ8 mm flat measuring cap, it shall not exceed 0.3 μm.
4.6 The indication error caused by inconsistency of the magnification of the measuring seat and tailstock objective lenses shall not exceed (1 + L/1000) μm, where L is the calibrated length in mm.
4.7 Indication error of the scales
The indication error shall not exceed the provisions of Table 1.

Table 1
Calibrated part      Maximum permissible error/μm
Micrometer scale     ±0.2; ±0.25
Millimeter scale     ±(0.6 + L/200)
Decimeter scale      ±(0.5 + L/100)
Note: L is the calibrated length in mm.

4.8 Indication variability: for outer dimension measurement, not more than 0.1 μm; for inner dimension measurement, not more than 0.5 μm.
4.9 Reliability of internal measurement accessories: not more than 0.8 μm.
4.10 Calibration of ring gauge bore diameter: measurement uncertainty U = 0.5 μm (k = 3).
4.11 Repeatability of V-shaped bracket adjustment: not more than 0.3 μm.
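The length-dependent limits above can be evaluated mechanically. The sketch below is illustrative only: the function names are hypothetical, and the millimeter- and decimeter-scale limits are the Table 1 values as recovered from the damaged source, so they should be verified against the official text.

```python
def mpe_um(scale: str, length_mm: float = 0.0) -> float:
    """Maximum permissible indication error in micrometres per Table 1
    (values as recovered from the source; verify against the official
    text). length_mm is the calibrated length L in mm."""
    if scale == "millimeter":
        return 0.6 + length_mm / 200.0
    if scale == "decimeter":
        return 0.5 + length_mm / 100.0
    raise ValueError(f"unknown scale: {scale!r}")

def conforms(error_um: float, scale: str, length_mm: float = 0.0) -> bool:
    """True if a measured indication error lies within +/- the MPE."""
    return abs(error_um) <= mpe_um(scale, length_mm)
```

For example, a +3.2 μm error on the decimeter scale at L = 500 mm is within the ±(0.5 + 500/100) = ±5.5 μm limit.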
5 Calibration conditions
5.1 Environmental conditions
Temperature: (20 ± 1) °C;
Room temperature change: not more than 0.3 °C/h;
Relative humidity: not more than 75%;
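The tight temperature limits above matter because a steel length changes measurably with temperature. The sketch below is illustrative: the expansion coefficient is a typical value for steel assumed here, not a value stated in the specification.

```python
ALPHA_STEEL_PER_C = 11.5e-6  # 1/degC, assumed typical value for steel

def thermal_error_um(length_mm: float, dev_from_20_c: float,
                     alpha_per_c: float = ALPHA_STEEL_PER_C) -> float:
    """Length change in micrometres for a part of length length_mm
    at dev_from_20_c degrees away from the 20 degC reference."""
    return length_mm * alpha_per_c * dev_from_20_c * 1000.0
```

At the full 6000 mm range, even a 0.3 °C difference between instrument and standard corresponds to 6000 × 11.5×10⁻⁶ × 0.3 ≈ 20.7 μm if the two expansions do not cancel, which is why both the equalization times and the 0.3 °C limit below are imposed.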
The time for the calibrated instrument to equalize with the room temperature shall be no less than 24 h for measuring ranges up to 3000 mm and no less than 48 h for measuring ranges up to 6000 mm. The temperature difference between the calibrated instrument and the standard shall not exceed 0.3 °C.
5.2 Calibration standards and other equipment
Second-class gauge blocks of sizes 1, 1.02, 1.03, 1.04, 1.06, 1.08 and 1.10 mm; third-class gauge blocks of sizes 20, 40, 60, 80, 100, 200, 300, 400, 500, 600, 700, 800, 900 and 1000 mm;
Dual-frequency laser interferometer;
Standard ring gauges of 20 mm and 50 mm.
6 Calibration items and calibration methods
6.1 Appearance and interaction of each part
Checked by observation and trial.
6.2 Angular runout of the measuring seat or tailstock moving along the base guide rail
Mirror reading.
6.3 Coaxiality of the measuring axis and the tail tube measuring axis
Place the micrometer bracket with the positioning block on the guide surface of the base of the length measuring machine. With the horizontal surface of the guide as the reference, measure the height difference Δ₁ between the tail tube and the optical meter tube; with the vertical surface of the guide as the reference, measure the front-to-rear offset Δ₂ between them. The coaxiality Δ of the optical meter tube and the tail tube is calculated according to formula (1):
Δ = √(Δ₁² + Δ₂²)   (1)
The coaxiality may also be checked as follows: remove the optical meter tube from the measuring seat; if the tail tube of the moving tailstock passes smoothly through the inner hole of the measuring seat, the requirement is met.
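Assuming formula (1) combines the two perpendicular offsets as a root sum of squares (a reading of the damaged formula, to be verified against the official text), the computation is a one-liner; the function name is hypothetical.

```python
import math

def coaxiality_mm(delta1_mm: float, delta2_mm: float) -> float:
    """Coaxiality per formula (1), assuming it combines the vertical
    offset delta1 and the horizontal offset delta2 as a root sum of
    squares (both offsets and the result in mm)."""
    return math.hypot(delta1_mm, delta2_mm)
```

For example, offsets of 0.12 mm and 0.10 mm combine to about 0.156 mm, within the 0.2 mm limit of 4.3.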
6.4 Adjustment error of the tail tube measuring rod adjustment component
Install flat measuring caps of 8 mm diameter on the measuring rods of the optical meter tube and the tail tube. Move the tailstock and the measuring seat until the two flat measuring caps are in contact, and turn the fine-motion screw of the tail tube measuring rod to align the zero line of the micrometer scale, or an adjacent scale line, with the index line. Then operate the two adjustment screws of the tail tube measuring rod in turn to find the minimum value. Restore the state before adjustment and repeat the adjustment twice as described above; the minimum value should not exceed the requirement of 4.4. Remove the flat measuring caps, install the spherical measuring caps, and calibrate again by the same method.
6.5 Reliability of workbench adjustment
6.6 Indication error caused by inconsistency of the magnification of the measuring seat and tailstock objective lenses
6.6.1 Direct calibration
Adjust the spherical measuring caps installed on the measuring rods of the optical meter tube and the tail tube to the correct state, then move the tailstock so that the double line of the decimeter scale is 0.7 mm or 0.5 mm to the left of the centre of the field of view (projection-screen reading mechanism). Move the measuring seat to align the zero line of the millimeter scale with the zero line of the decimeter scale, and record the reading of the micrometer scale. Move the tailstock again so that the zero line of the decimeter scale is 0.7 mm or 0.5 mm to the right of the centre of the field of view, align the zero line of the millimeter scale with the zero line of the decimeter scale in the same way, and record the reading of the micrometer scale. The difference between the two readings should not exceed the requirement of 4.6.
When calibrating at each decimeter scale position below 1000 mm, use a gauge block of the corresponding size. When calibrating at each decimeter scale position above 1000 mm, use the segmented method with gauge blocks of appropriate size.
6.6.2 Segmented calibration
For length measuring machines with a measuring range greater than 1000 mm, calibrate by the segmented method. Move the tailstock to the decimeter scale position to be inspected so that the double line is 0.7 mm or 0.5 mm to the left of the centre of the field of view. Move the measuring seat to the zero position, and place the positioning bracket and the auxiliary optical meter bracket on the corresponding sides of the measuring seat and the tailstock. Adjust the auxiliary optical meter so that the spherical measuring cap on its measuring rod contacts the spherical measuring cap on the tail tube measuring rod, adjust both to the correct state, and slightly turn the fine-motion screw of the tail tube to bring the indication of the auxiliary optical meter to zero. Fine-move the measuring seat to align the zero line of the millimeter scale with the double line of the decimeter scale, adjust the positioning bracket so that its positioning surface contacts the spherical measuring cap of the optical meter tube, and use the fine-motion screw to bring the indication of the micrometer scale to zero.
6.6.3 Calibration with a laser interferometer
Install the laser interferometer as shown in Figure 1. Align the tailstock with any decimeter scale position so that the image of the decimeter double line is 0.7 mm to the left of the centre of the field of view of the measuring seat, move the measuring seat to align the zero line of the millimeter scale, and zero the reading of the laser interferometer. Then move the tailstock so that the image of the decimeter double line is 0.7 mm to the right of the centre of the field of view, move the measuring seat to re-align the zero line of the millimeter scale with the decimeter double line, and read the laser interferometer. The difference between the two readings should not exceed the requirement of 4.6.
Figure 1 Laser interferometer calibration device
1—dual-frequency laser interferometer; 2—measuring seat; 3—interferometer; 4—reflector
6.7 Scale indication error
6.7.1 Micrometer scale indication error
The micrometer scale indication error is calibrated either directly with second-class gauge blocks or with third-class gauge blocks by the pairing method. The calibration points should at least be distributed at the ±30, ±60 and ±90 μm positions. Before calibration, adjust the spherical measuring caps installed on the optical meter tube and tail tube measuring rods to the correct state. The gauge blocks selected for each calibration point are shown in Table 2. During calibration, align the first gauge block to the zero position and calibrate the indication error of the point with the second gauge block; then align the second gauge block to the zero position and calibrate the point with the third gauge block, and so on, until the last gauge block of the required pairing is reached. When calibrating the positive part of the scale, the gauge block sizes are taken in ascending order; when calibrating the negative part, in descending order.
Table 2
Calibration point/μm    Gauge block sizes/mm
±30                     1, 1.03, 1.06, 1.09, 1.12
±60                     1, 1.06, 1.12, 1.18, 1.24
±90                     1, 1.09, 1.18, 1.27

The indication error of each calibration point is calculated according to formula (2):
δ = a − (L − L₀) × 1000   (2)
where: a — the reading at the calibration point (μm); L — the actual size of the gauge block used at the calibration point (mm); L₀ — the actual size of the gauge block used to align the zero position (mm); δ — the indication error (μm).
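Formula (2) simply converts the gauge-block size difference from millimetres to micrometres and subtracts it from the reading. A minimal sketch, with a hypothetical function name:

```python
def indication_error_um(reading_um: float, block_mm: float,
                        zero_block_mm: float) -> float:
    """Formula (2): error = a - (L - L0) * 1000, with the reading a in
    micrometres and the actual block sizes L, L0 in millimetres."""
    return reading_um - (block_mm - zero_block_mm) * 1000.0
```

For example, zeroing on a block of actual size 1.00000 mm and reading 30.2 μm on a block of actual size 1.03001 mm gives 30.2 − 30.01 = 0.19 μm at the +30 μm point.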
For micrometer scales of the non-optical-meter-tube type, use four gauge blocks of sizes 1, 1.03, 1.06 and 1.09 mm to calibrate the micrometer scale at the 0, 30, 60 and 90 μm positions.
6.7.2 Indication error of the millimeter scale
6.7.3 Error of indication of decimeter scale
6.7.3.1 Direct calibration
Using third-class gauge blocks, calibrate points at intervals of 100 mm. Within the range (0~1000) mm, calibrate directly with gauge blocks; for the range exceeding 1000 mm, calibrate with gauge blocks in sections. Install the spherical measuring caps on the measuring rods, move the tailstock and the measuring seat so that the two spherical measuring caps are in contact, and align the zero position of the decimeter scale with the zero position of the micrometer scale. Use the adjustment screws on the tail tube to bring the spherical measuring cap to the correct position, and turn the fine-motion screw to align the micrometer scale to zero. Move the tailstock back, place the 100 mm gauge block on the workbench, then move the tailstock to the calibration position so that its spherical measuring cap contacts the working surface of the gauge block, and adjust the workbench rotation in the horizontal and vertical directions to find the minimum value of the gauge block. Fine-move the measuring seat so that the required position of the millimeter scale is aligned with the double line of the calibrated decimeter scale, and record the reading from the micrometer reading device. The difference between the reading at the calibration point and the deviation of the gauge block used is the indication error of the calibration point. Calibrate each decimeter position below 1000 mm in turn in this way.
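The rule at the end of 6.7.3.1 — the indication error of a decimeter point is the machine reading minus the calibrated deviation of the gauge block used — is a one-line computation. The function name below is illustrative:

```python
def direct_point_error_um(reading_um: float,
                          block_deviation_um: float) -> float:
    """6.7.3.1 direct calibration: indication error of a decimeter point
    is the reading minus the known deviation of the gauge block used
    (both in micrometres)."""
    return reading_um - block_deviation_um
```

For example, a reading of +1.8 μm with a block deviation of +0.3 μm yields an indication error of +1.5 μm, to be compared against the decimeter-scale limit of Table 1.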
6.7.3.2 Segment calibration
The indication error of the calibration point is calculated according to formula (3):
δ = δ₀ + [(b₁ + a₁) + (b₂ + a₂)]/2   (3)
where: δ₀ — the indication error of the decimeter scale line taken as the zero position of this section; b₁, b₂ — the deviations of the gauge blocks used; a₁, a₂ — the corresponding readings.
Calibrate the indication error of the other decimeter scale lines by the same method.
Each calibration point of the decimeter scale should be calibrated at least twice. The difference between the two results should not be greater than (0.5 + 3×10⁻³L) μm, and their average value is taken as the indication error of the point (L — nominal size of the calibration point, in mm).
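The two-run acceptance rule can be sketched as follows. This is illustrative only: the function name is hypothetical, and the tolerance expression (0.5 + 3×10⁻³L) μm is a reading of a damaged value in the source that should be verified against the official text.

```python
def decimeter_point_result(run1_um: float, run2_um: float,
                           length_mm: float):
    """Two repeated calibrations of one decimeter point: accept if their
    difference is within (0.5 + 3e-3 * L) um (limit as recovered from
    the source; verify before use), and report the mean as the result."""
    limit_um = 0.5 + 3e-3 * length_mm
    accepted = abs(run1_um - run2_um) <= limit_um
    return (run1_um + run2_um) / 2.0, accepted
```

At L = 1000 mm the limit works out to 3.5 μm; runs of 2.1 μm and 2.9 μm differ by 0.8 μm, so the mean 2.5 μm is taken as the point's indication error.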
Figure 2 Segmented calibration device
1—optical meter tube; 2—measuring seat; 3—positioning bracket; 4—auxiliary tail tube; 5—auxiliary optical meter; 6—auxiliary optical meter bracket; 7—gauge block; 8—workbench; 9—tail tube; 10—tailstock
6.7.3.3 Calibration with a laser interferometer
6.8 Indication variability
6.8.1 External dimensions
Install the spherical measuring caps on the measuring rods, move the tailstock and the measuring seat so that the two spherical measuring caps contact each other, and use the adjustment screws on the tail tube to bring the spherical measuring cap to the correct position. Then place the 50 mm gauge block on the workbench, bring the spherical measuring cap into contact with the working surface of the gauge block, and adjust the workbench to find the minimum value of the gauge block. Retract and release the measuring rod at least 10 times, observing the change in the indication; the difference between the maximum and minimum values shall not exceed the requirement of 4.8.
6.8.2 Inner dimensions
Install the small measuring hooks on the measuring shaft and the special tail tube so that the measuring heads of the hooks contact the inner hole of the 20 mm diameter ring gauge placed on the workbench. Adjust the workbench to find the maximum diameter of the ring gauge, retract and release the measuring hook 10 times, and observe the change in the indication. The difference between the maximum and minimum values is the indication variability.
6.9 Verification of standard ring gauges
The standard ring gauges shall be verified in accordance with the verification regulation for standard ring gauges, JJG 894—1995.
6.10 Repeatability of V-shaped bracket adjustment
For this calibration, first adjust the spherical measuring caps installed on the optical meter tube and tail tube measuring rods to the correct state, and align the zero line of the millimeter scale with the zero line of the decimeter scale; turn the tail tube fine-motion screw to align the micrometer scale to zero. After moving the tailstock to the 500 mm scale position, install the two V-shaped brackets on the guide at suitable positions between the measuring seat and the tailstock. Place the gauge block on the V-shaped brackets and raise the brackets so that the gauge block lies on the measuring axis. Move the tailstock, and use the fine-motion mechanism of the measuring seat to align the zero line of the micrometer scale with the double line of the decimeter scale; then use the nuts of the V-shaped brackets to lower the brackets and find the minimum value of the gauge block. Change this state, raise and lower the brackets again, and find the minimum value of the gauge block once more. This calibration should be performed at least 3 times, and the difference between the minimum values should not exceed the requirement of 4.11.
6.11 Reliability of internal measurement accessories
The small internal measuring hook is calibrated with a 20 mm diameter ring gauge, and the internal measuring hook with a 50 mm diameter ring gauge. Install the internal measurement accessories on the optical meter tube and the tail tube and adjust them to the correct position. Install the ring gauge on the workbench so that the measuring hooks contact the inner hole of the ring gauge, with the micrometer scale at approximately the zero position. Move the workbench horizontally to find the maximum value of the ring gauge bore diameter, then rotate the workbench about the horizontal axis to find the minimum value of the bore diameter. Repeat this procedure and write down the reading after the indication stabilizes. Then change the state of the workbench, operate again in the same way, and write down the reading after the indication stabilizes. The difference between the two readings should not exceed the requirement of 4.9.
7 Handling of calibration results
A calibration certificate is issued for the calibrated length measuring machine. The certificate should give the calibration results and the measurement uncertainty of the millimeter and decimeter scales.
8 Recalibration time interval
The recommended recalibration time interval is 1 year.
Figure 1 Laser interferometer calibration device
1-Dual frequency laser interferometer; 2-Measuring seat; 3~~Interferometer; 4 Reflector 6.7 Scale indication error
6.7.1 Micrometer scale indication error
The micrometer scale indication error is calibrated directly with a second-level block or calibrated with a third-level block by the "pairing method". The calibration should be distributed at least at ±30, ±60 and ±901m16 positions. Before calibration, adjust the spherical measuring cap installed on the optical tube and tail tube measuring rod to the correct state. The size of the gauge block selected for each inspection point and the number of pairs are shown in Table 2. During calibration, the first block is aligned to the zero position, the second block is used to calibrate the indication error of the inspection point, and then the second block is aligned to the zero position, the third block is used to calibrate the indication error of the inspection point, and so on, until the last block of the required paired gauge blocks. When calibrating the positive scale, the size of the gauge block is increased in an ascending manner; when calibrating the negative scale, the size of the gauge block is increased and decreased in an ascending and descending manner according to 4
.
Accuracy point/um
Number of pairs
JJF1066-2000Www.bzxZ.net
The indication error of each calibration point is calculated according to (2). Gauge block size/mm
1, 1.03, 1.06, 1.09,1.12
1,1.06,1.12,1.18.1.24
1,1.09,1.18,1.27
(L, - L).1 000
Wherein: ——the reading at the calibration point (um); L the actual size of the gauge block used at the calibration point (mm); L, one by one to align the actual size of the zero gauge block (mm) (um)
For the opponent's non-optical meter tube type micron scale, use 4 blocks with sizes of 1, 1.03, 1.06, and 1.09mm to calibrate the micron scale at 0, 30, 60, and 90m positions. 6.7.2 Error of indication of millimeter scale
6.7.3 Error of indication of decimeter scale
6.7.3.1 Direct calibration
Use three equal gauge blocks to calibrate points at intervals of 100mm. Use the gauge blocks to calibrate directly within the range of (0-1000) mm. For the range exceeding 1.000mm, use the gauge block to calibrate in sections. Install the spherical measuring cap under the measuring rod, move the housing seat and the measuring seat, so that the two spherical measuring caps are in contact, and align the zero position of the decimeter scale with the zero position of the micrometer scale. Use the adjustment screw on the tail tube to adjust the spherical measuring cap to the correct position. Move the micrometer scale fine screw to align the micrometer scale to zero. Remove the tailstock, install the 100mm gauge block on the workbench, then move the tailstock on the calibration position so that its spherical surface contacts the working surface of the gauge block, adjust the workbench to rotate in the horizontal and vertical directions to find the minimum value of the gauge block. The micro-motion measuring seat makes the required position of the millimeter scale align with the double scale lines of the calibrated decimeter scale. At this time, record the reading from the micrometer reading device. The difference between the reading of the calibration point and the deviation value of the block used is the indication error of the calibration point. In this way, calibrate each decimeter position below 1000mm in turn.
6.7.3.2 Segment calibration
The indication error of the calibration point is calculated according to formula (3): 8a + [(h + a) (b + ao1 1 boz t a2) -2 where: the indication error of the decimeter scale as the zero position of this section; the deviation of the gauge block.
Use the same method as above to calibrate the accuracy error of other decimeter scales. (3)
Each calibration point of the decimeter scale should be calibrated at least twice. The difference between the two calibration results should not be greater than ± (0.53×103L) μm. The average value is taken as the accuracy error of the point. (L—nominal size of the variable calibration point in mm)
Figure 2 Segmented calibration device
1—optical tube: 2-head seat; 3-positioning bracket; 4-~auxiliary tail tube; 5—auxiliary optical meter 6-auxiliary optical meter bracket; 7-block; 8-working table: 9-tower tube: 10 tailstock 6.7.3.3
Laser interferometer calibration
6.8 Indication variability
6.8.1, external dimensions
JJF 1066--2000
Install the spherical measuring cap on the measuring rod, move the tailstock and the measuring seat to make the two spherical measuring caps contact each other, use the adjusting screw on the tail tube to adjust the spherical measuring cap to the correct position, then install the 50mm gauge block on the workbench, make the spherical measuring cap contact with the working surface of the gauge block, adjust the workbench to find the minimum value of the gauge block. Then turn the dial for at least 10 times, observe the change of the indicator, and the difference between the maximum and minimum values ​​shall not exceed the requirement of 4.8. 6.8.2 Inner dimension
Install a small measuring hook on the measuring shaft and the special tail pipe, so that the measuring head of the measuring hook contacts the inner hole of the 20mm diameter ring gauge placed on the workbench, adjust the workbench to find the maximum diameter of the ring gauge, turn the measuring hook 10 times, and observe the change of the indicator. The difference between the maximum and minimum values ​​is the calibration value. 6.9 Verification of standard ring gauges
The standard ring gauges shall be verified in accordance with the verification procedures for standard ring gauges in JJ(894-1995. 6.10 When the V-shaped bracket T is used for repeatability
calibration, first adjust the spherical measuring increments installed on the light tube and the tail tube measuring rod to the correct state, and then align the zero line of the millimeter scale with the zero line of the decimeter scale. Turn the tail tube fine-motion screw to align the micrometer scale to zero. After moving the tailstock to the 500mm scale position, install two V-shaped brackets on the guide at the appropriate position between the measuring seat and the seat. Place the gauge block on the V-shaped bracket and lift the bracket so that the gauge block is on the measuring axis. Move the measuring tailstock, and use the fine-motion mechanism of the measuring seat again to align the micrometer scale. Align the zero line with the double lines of the decimeter scale, then use the nut of the rib-shaped bracket to lower the bracket and find the minimum value of the gauge block. Change this state, raise and lower the bracket again, and find the minimum value of the gauge block. This calibration should be performed at least 3 times, and the difference in the minimum value should not exceed 4, meeting the requirements of 11. 6.11 Reliability of internal measurement accessories
The small internal measuring hook is calibrated with a 20mm diameter ring gauge, and the internal measuring hook is calibrated with a 50mm diameter gauge. The internal measurement accessories are installed on the light tube and the tail tube and adjusted to the correct position. Install the ring gauge on the workbench so that the measuring hook contacts the inner hole of the ring gauge, and at the same time, the micrometer scale is at about zero position, and the workbench is moved horizontally to find ring A
JJF 1066-2000
The maximum value of the hole diameter of the ring gauge, rotate the workbench around the horizontal axis, and find the minimum value of the hole diameter of the ring gauge. Repeat the above method, and write down the reading after the indication stabilizes. Then change the state of the workbench, operate again according to the above method, and write down the reading after the indication stabilizes. The difference between the two readings should not exceed the requirement of 4.9. 7 Processing of calibration results
The calibrated length measuring machine is issued a calibration certificate. The calibration certificate should give the calibration results and measurement uncertainty of the millimeter and decimeter scales.
Recalibration time interval
The recommended recalibration time interval is 1 year.7mm, move the measuring seat to align the zero position of the millimeter scale, and then the laser interferometer reading is cleared: move the tailstock to make the image of the decimeter double scale 0.7mII to the right of the center of the field of view of the measuring seat, move the measuring seat to aim the zero position of the millimeter scale with the decimeter double scale again, and read the laser interferometer. The difference between the two readings should not exceed the requirement of 4.6.
Figure 1 Laser interferometer calibration device
1-Dual frequency laser interferometer; 2-Measuring seat; 3~~Interferometer; 4 Reflector 6.7 Scale indication error
6.7.1 Micrometer scale indication error
The micrometer scale indication error is calibrated directly with a second-level block or calibrated with a third-level block by the "pairing method". The calibration should be distributed at least at ±30, ±60 and ±901m16 positions. Before calibration, adjust the spherical measuring cap installed on the optical tube and tail tube measuring rod to the correct state. The size of the gauge block selected for each inspection point and the number of pairs are shown in Table 2. During calibration, the first block is aligned to the zero position, the second block is used to calibrate the indication error of the inspection point, and then the second block is aligned to the zero position, the third block is used to calibrate the indication error of the inspection point, and so on, until the last block of the required paired gauge blocks. When calibrating the positive scale, the size of the gauge block is increased in an ascending manner; when calibrating the negative scale, the size of the gauge block is increased and decreased in an ascending and descending manner according to 4
.
Accuracy point/um
Number of pairs
JJF1066-2000
The indication error of each calibration point is calculated according to (2). Gauge block size/mm
1, 1.03, 1.06, 1.09,1.12
1,1.06,1.12,1.18.1.24
1,1.09,1.18,1.27
(L, - L).1 000
Wherein: ——the reading at the calibration point (um); L the actual size of the gauge block used at the calibration point (mm); L, one by one to align the actual size of the zero gauge block (mm) (um)
For the opponent's non-optical meter tube type micron scale, use 4 blocks with sizes of 1, 1.03, 1.06, and 1.09mm to calibrate the micron scale at 0, 30, 60, and 90m positions. 6.7.2 Error of indication of millimeter scale
6.7.3 Error of indication of decimeter scale
6.7.3.1 Direct calibration
Use three equal gauge blocks to calibrate points at intervals of 100mm. Use the gauge blocks to calibrate directly within the range of (0-1000) mm. For the range exceeding 1.000mm, use the gauge block to calibrate in sections. Install the spherical measuring cap under the measuring rod, move the housing seat and the measuring seat, so that the two spherical measuring caps are in contact, and align the zero position of the decimeter scale with the zero position of the micrometer scale. Use the adjustment screw on the tail tube to adjust the spherical measuring cap to the correct position. Move the micrometer scale fine screw to align the micrometer scale to zero. Remove the tailstock, install the 100mm gauge block on the workbench, then move the tailstock on the calibration position so that its spherical surface contacts the working surface of the gauge block, adjust the workbench to rotate in the horizontal and vertical directions to find the minimum value of the gauge block. The micro-motion measuring seat makes the required position of the millimeter scale align with the double scale lines of the calibrated decimeter scale. At this time, record the reading from the micrometer reading device. The difference between the reading of the calibration point and the deviation value of the block used is the indication error of the calibration point. In this way, calibrate each decimeter position below 1000mm in turn.
6.7.3.2 Segment calibration
The indication error of the calibration point is calculated according to formula (3):

Δ = δ₀ + [(b₁ + a₁) + (b₂ + a₂)]/2 + δ    (3)

where δ₀ is the indication error of the decimeter scale line taken as the zero position of this section, δ is the deviation of the gauge block, and a₁, b₁, a₂, b₂ are the readings of the auxiliary optical meter.
Use the same method as above to calibrate the indication error of the other decimeter scale lines.
Each calibration point of the decimeter scale should be calibrated at least twice. The difference between the two calibration results should not exceed (0.5 + 3×10⁻³L) µm, where L is the nominal size of the calibration point in mm. The average value is taken as the indication error of the point.
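The repeat-agreement check and averaging above can be sketched as follows (a hypothetical helper; the tolerance formula (0.5 + 3×10⁻³L) µm is as read from the text):

```python
def repeat_tolerance_um(nominal_mm: float) -> float:
    # Permitted difference between two calibrations of the same point:
    # (0.5 + 3e-3 * L) um, with L the nominal size of the point in mm
    return 0.5 + 3e-3 * nominal_mm

def point_result_um(first_um: float, second_um: float, nominal_mm: float) -> float:
    """Average of two calibration runs, after checking their agreement."""
    diff = abs(first_um - second_um)
    if diff > repeat_tolerance_um(nominal_mm):
        raise ValueError(f"repeat difference {diff:.2f} um exceeds tolerance; recalibrate")
    return (first_um + second_um) / 2.0

# Example: 500 mm point, runs of 1.1 um and 1.5 um;
# tolerance is 0.5 + 1.5 = 2.0 um, so the result is their mean
print(point_result_um(1.1, 1.5, 500.0))
```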
Figure 2 Segmented calibration device
1—optical tube; 2—headstock; 3—positioning bracket; 4—auxiliary tail tube; 5—auxiliary optical meter; 6—auxiliary optical meter bracket; 7—gauge block; 8—worktable; 9—tail tube; 10—tailstock
6.7.3.3 Laser interferometer calibration
6.8 Indication variability
6.8.1 External dimensions
Install the spherical measuring caps on the measuring rods, move the tailstock and the measuring seat so that the two spherical measuring caps contact each other, and use the adjusting screw on the tail tube to adjust the spherical measuring cap to the correct position. Then install the 50 mm gauge block on the workbench, make the spherical measuring cap contact the working surface of the gauge block, and adjust the workbench to find the minimum value of the gauge block. Repeat the measurement at least 10 times and observe the change of the indication; the difference between the maximum and minimum values shall not exceed the requirement of 4.8.
6.8.2 Internal dimensions
Install the small measuring hooks on the measuring shaft and the special tail pipe so that the measuring heads of the hooks contact the inner bore of a 20 mm diameter ring gauge placed on the workbench. Adjust the workbench to find the maximum diameter of the ring gauge, repeat the measurement 10 times, and observe the change of the indication; the difference between the maximum and minimum values is the indication variability.
6.9 Verification of standard ring gauges
The standard ring gauges shall be verified in accordance with the verification regulation for standard ring gauges, JJG 894-1995.
6.10 Repeatability when using the V-shaped brackets
For this calibration, first adjust the spherical measuring caps installed on the optical tube and tail tube measuring rods to the correct state, and align the zero line of the millimeter scale with the zero line of the decimeter scale. Turn the tail tube fine-motion screw to align the micrometer scale to zero. After moving the tailstock to the 500 mm scale position, install two V-shaped brackets on the guideway at appropriate positions between the measuring seat and the tailstock. Place the gauge block on the V-shaped brackets and raise the brackets so that the gauge block lies on the measuring axis. Move the tailstock and, using the fine-motion mechanism of the measuring seat, again align the zero line of the micrometer scale with the double lines of the decimeter scale; then use the nuts of the V-shaped brackets to lower the brackets and find the minimum value of the gauge block. Change this state, raise and lower the brackets again, and find the minimum value of the gauge block. This calibration shall be performed at least 3 times, and the difference between the minimum values shall not exceed the requirement of 4.11.
6.11 Reliability of internal measurement accessories
The small internal measuring hooks are calibrated with a 20 mm diameter ring gauge, and the internal measuring hooks with a 50 mm diameter ring gauge. Install the internal measurement accessories on the optical tube and the tail tube and adjust them to the correct position. Install the ring gauge on the workbench so that the measuring hooks contact the inner bore of the ring gauge with the micrometer scale at about the zero position. Move the workbench horizontally to find the maximum value of the bore diameter of the ring gauge, then rotate the workbench around the horizontal axis to find the minimum value of the bore diameter, and write down the reading after the indication stabilizes. Then change the state of the workbench, operate again according to the above method, and write down the reading after the indication stabilizes. The difference between the two readings shall not exceed the requirement of 4.9.
7 Processing of calibration results
A calibration certificate is issued for the calibrated length measuring machine. The certificate shall give the calibration results and the measurement uncertainty for the millimeter and decimeter scales.
8 Recalibration time interval
The recommended recalibration time interval is 1 year.