Trial Verification Procedure of DC Digital Ammeters
Verification Regulation of DC Digital Ammeters (Trial)
JJG 598—1989
This verification regulation was approved by the State Bureau of Technical Supervision on April 6, 1989, and came into effect on February 6, 1990.
Responsible unit: China Institute of Metrology
Drafting unit: China Institute of Metrology
The drafting unit is responsible for interpreting the technical provisions of this regulation. The main drafters of this regulation are from the China Institute of Metrology.
Contents
Two Technical requirements
Three Verification conditions
(I) Standard equipment and requirements
(II) Environmental conditions for verification
Four Verification items and verification methods
(I) Verification items
(II) Determination of the basic error
(III) Methods of error verification
(IV) Verification of other items
Five Processing of verification results and verification period
(I) Processing of verification results
(II) Determination of verification results
(III) Verification period
Trial Verification Procedure of DC Digital Ammeters
This regulation applies to newly manufactured, in-service and repaired DC digital ammeters (DC-DIM), to the DC current measurement function of digital multimeters (DMM) and of digital panel meters (DPM), and to the verification of devices that convert another quantity into a DC current for measurement. The main body of a DC digital ammeter is a DC digital voltmeter (DC-DVM): the measured DC current is first converted into a DC voltage by a current-to-voltage (I/V) converter and then measured by the DC-DVM, the display unit being A or mA. For digital multimeters, the DC voltage function is generally verified first, followed by the DC current function.
Two Technical Requirements
1 Verification requirements
In order to use DC-DIMs correctly and to guarantee the accuracy of measurement results, every DC-DIM must be verified. DC-DIM verification is generally of three kinds: periodic verification, initial verification and sample verification. The DC digital ammeter under test shall meet the technical requirements specified in this regulation.
2 Appearance and power-on inspection
To ensure safe and correct operation of the instrument, an appearance and power-on inspection shall be carried out before verification.
2.1 The appearance and structure shall be intact.
2.2 The panel display and digits shall be normal.
2.3 The adjustable mechanisms of the instrument shall work properly, the accessories and connecting parts shall be complete, and the power supply voltage and frequency markings of the instrument shall be correct.
2.4 After the appearance inspection, the instrument shall be powered on and its functions checked as specified in its instruction manual.
2.5 According to the ranges and measurement span of the meter under test, apply suitable DC current signals from low to high, and check whether range switching is normal and whether manual and automatic operation work properly.
3 Selection of verification points
Before verification, the meter under test shall be kept in the verification (constant-temperature) room for more than 24 h; its main technical characteristics are then verified.
3.1 The basic range is the key to assessing the performance of a DC-DIM and shall be verified in detail.
3.2 If the meter under test has linearity error, the verification points with the largest error should be chosen.
3.3 The verification points of the non-basic ranges should take both the higher and the lower ranges into account.
3.4 In general, the basic range takes 5 to 10 verification points and each non-basic range takes 3 to 5 verification points.
3.5 For positive and negative polarity, a relatively complete set of verification points may be selected, or only the full-scale point may be verified.
4 Error expression and accuracy levels
4.1 The error is expressed in one of the following forms.
4.1.1 Absolute error, expressed as the sum of two terms:
Δ = ±(a%·Ix + b%·Im)
where:
Ix — the displayed value (reading) of the meter under test;
Im — the full-scale value of the meter under test;
a — the error coefficient related to the reading;
b — the error coefficient related to the full-scale value.
4.1.2 Relative error, expressed as the ratio of the absolute error to the reading of the meter under test:
γ = Δ/Ix × 100% = ±(a% + b%·Im/Ix)
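As an illustration only (not part of the regulation), the following Python sketch evaluates the two error expressions of 4.1.1 and 4.1.2; the coefficient and reading values used here are hypothetical examples.

```python
def allowable_absolute_error(ix, im, a, b):
    """Clause 4.1.1 tolerance: delta = +/-(a% * Ix + b% * Im), in the unit of Ix."""
    return (a / 100.0) * ix + (b / 100.0) * im

def allowable_relative_error(ix, im, a, b):
    """Clause 4.1.2 tolerance: gamma = +/-(a% + b% * Im/Ix), returned as a fraction."""
    return (a / 100.0) + (b / 100.0) * (im / ix)

# Hypothetical example: coefficients a = 0.02, b = 0.005,
# reading Ix = 1.5000 A on a range with Im = 2 A.
ix, im, a, b = 1.5, 2.0, 0.02, 0.005
print(allowable_absolute_error(ix, im, a, b))   # tolerance in A
print(allowable_relative_error(ix, im, a, b))   # tolerance as a fraction of the reading
```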
4.2 Accuracy level
The accuracy levels of DC digital ammeters are 0.001, 0.002, 0.005, 0.01, 0.02, 0.05, 0.1, 0.2, 0.5 and 1.0, ten levels in all; Table 1 gives the index requirements for each level. A DC-DIM is a multi-range instrument, and its different ranges may have different indices.
Table 1 Accuracy levels of DC digital ammeters
Note: a is the annual stability error coefficient of the basic range of the meter under test.
4.3 Determination of the accuracy level
The accuracy level is determined mainly by the size of the basic-range error coefficients of the DC-DIM and by its annual stability error; grading is based on the basic range. The grading criteria are as follows:
The verification data of the basic error shall be compared with the technical indices of the meter under test, calculated by the formula
Δ = ±(a%·Ix + b%·Im)
where a and b are the 24 h basic error coefficients of the meter under test.
4.3.1 Grading verification is carried out once a year. The DC-DIM to be graded shall be preheated and pre-adjusted, but not recalibrated, under standard conditions, and its annual stability error shall be verified; this error shall be less than ±(a%·Ix + b%·Im) for the corresponding level. If the meter under test has no one-year stability error index, the index may be determined by the metrology department according to the actual measurement results.
4.3.3 After the annual stability error has been verified, the basic error of the DC-DIM may be verified; these data shall also meet the specified technical indices. A DC-DIM meeting the requirements of Table 1 is assigned the corresponding level. Data from the previous verification cycle are required when a level is first assigned.
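Purely as an illustration of the grading logic of 4.3 (the coefficient and error values below are hypothetical, not taken from Table 1), a sketch in Python:

```python
def within(limit, *errors):
    """True if every measured error lies within +/-limit."""
    return all(abs(e) <= limit for e in errors)

def meets_level(ix, im, a, b, basic_error_24h, annual_stability_error):
    """Check the two conditions of clause 4.3 for one verification point:
    the 24 h basic error and the annual stability error must both lie
    within +/-(a% * Ix + b% * Im) for the candidate level."""
    limit = (a / 100.0) * ix + (b / 100.0) * im
    return within(limit, basic_error_24h, annual_stability_error)

# Hypothetical data for one point of the basic range.
print(meets_level(ix=1.0, im=2.0, a=0.02, b=0.005,
                  basic_error_24h=0.0002, annual_stability_error=0.0003))
```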
Three Verification Conditions
(I) Standard equipment and requirements
5 The standard equipment for DC-DIM verification shall include:
5.1 Temperature-controlled standard cells (cell group);
5.2 Standard resistor or standard resistance box;
5.3 DC potentiometer and standard digital voltmeter;
5.4 DC standard current source;
5.5 Standard digital ammeter;
5.6 Standard current calibrator (or multi-function standard source) and other auxiliary equipment.
6 Requirements for the verification device
The combined uncertainty of the whole verification device shall be less than 1/3 of the allowable error of the DC-DIM under test.
6.1 The short-term instability and the fineness of adjustment of the DC stabilized current supply shall be 1/5 to 1/10 of the allowable error of the DC-DIM under test.
6.2 The output shall be continuously adjustable over the required range.
6.3 The resolution of the monitoring device shall be 1/5 to 1/10 of the allowable error of the DC-DIM under test.
6.4 The standard measuring equipment used shall be verified regularly and be within its period of validity.
6.5 Self-calibration, automatic verification and automatic data processing should be adopted as far as possible in order to improve working efficiency.
6.6 When the accuracy of the verification device is not high enough and the result of the determination falls in the doubtful (marginal) region, a stricter verification shall be carried out.
6.7 When necessary, the error correction values of the standard equipment may be applied, or a higher-level measurement standard may be used to determine the correct value.
6.8 The verification device and its wiring shall have good shielding and grounding, and shall be kept away from strong electric and magnetic fields so as to avoid external interference.
(II) Environmental conditions for verification
7 Since the basic error and stability error of a DC-DIM are small, verification shall be carried out under the standard conditions specified in Table 2; the rated operating conditions are specified in Table 3. According to the operating environment, Table 3 is divided into groups A and B: group A applies to instruments used in a good environment and group B to instruments used in an ordinary or poorer environment. Instruments are tested, verified and used under the rated operating conditions of the corresponding group (see Table 3).
Table 2 Standard conditions for DC digital ammeters
Influence quantities: ambient temperature; relative humidity; atmospheric pressure (altitude); power supply voltage; power supply frequency; AC supply waveform distortion.
Table 3 Rated operating conditions for DC digital ammeters
Influence quantities: ambient temperature; relative humidity; atmospheric pressure (altitude); power supply voltage (±10% of rated value); power supply frequency; AC supply waveform distortion.
Four Verification Items and Verification Methods
(I) Verification items
8 The verification of a DC-DIM mainly includes:
verification of the basic error, verification of the stability error, verification of the linearity error, verification of the display capability, verification of the resolution, determination of the temperature coefficient, and the influence of power supply voltage variation, etc.
(II) Determination of the basic error
9 The basic error of a DC-DIM shall be determined according to the following provisions.
9.1 Under standard conditions, after the meter under test has been preheated, it may be adjusted and calibrated before the measurements begin.
9.2 Select the verification points: the basic range is verified completely, and the higher and lower ranges are then spot-checked.
9.3 The basic error refers to the error of the meter within 24 h. Within the 24 h period the instrument may be operated at reduced duty and may be switched off and on again (it shall then be warmed up again for the specified time), but it shall not be readjusted after the initial calibration.
9.4 The number of measurements at each verification point within the 24 h period shall be not less than three, and the measured data with the largest error shall be taken as the basic error of the meter.
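A minimal sketch, assuming the readings have already been collected, of the 9.4 rule that at least three measurements are made at each point within 24 h and the largest error is taken as the basic error (the function and data names are illustrative, not from the regulation):

```python
def basic_error(readings_by_point):
    """readings_by_point maps each verification point's actual (standard) value
    to a list of at least three indications taken within 24 h (clause 9.4).
    Returns the signed error of largest magnitude over all points."""
    worst = 0.0
    for actual, indications in readings_by_point.items():
        assert len(indications) >= 3, "clause 9.4: at least three measurements per point"
        for ix in indications:
            err = ix - actual
            if abs(err) > abs(worst):
                worst = err
    return worst

# Hypothetical 24 h data for two points of a 2 A basic range.
data = {1.0000: [1.0002, 1.0001, 1.0003], 2.0000: [1.9998, 1.9997, 1.9999]}
print(basic_error(data))
```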
(III) Methods of error verification
10 The error of a DC-DIM may be verified by the following methods:
DC standard current source method;
direct comparison method (standard digital ammeter method);
standard instrument (potentiometer) method;
standard digital voltmeter method.
Appendix 1 gives the block diagram of the DC-DIM verification system. A specific scheme should be chosen to be economical, reliable and simple, according to the level of the available standard equipment and of the meter under test.
10.1 DC standard current source method
This method is shown in Figure 1. Let the output of the DC standard current source, i.e. the actual value, be IN, and let the displayed value of the meter under test be Ix; then the absolute error of the meter is:
Δ = Ix − IN
The relative error, expressed as a percentage, is:
γ = (Ix − IN)/IN × 100%
This method is fast and convenient and is suitable for large-scale verification of DC-DIMs in the factory. Its verification error depends mainly on the DC standard current source.
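As a sketch only (the data values are hypothetical), applying the error expressions of 10.1 to a set of verification points:

```python
# Each pair is (I_N set on the DC standard current source, I_x displayed by the meter).
points = [(0.5000, 0.5001), (1.0000, 1.0002), (1.5000, 1.5001), (2.0000, 1.9998)]

for i_n, i_x in points:
    delta = i_x - i_n                     # absolute error of clause 10.1
    gamma = delta / i_n * 100.0           # relative error, in percent
    print(f"IN={i_n:.4f} A  Ix={i_x:.4f} A  delta={delta:+.4f} A  gamma={gamma:+.3f} %")
```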
When no standard current source meeting the requirements is available, or when the current is relatively large, an ordinary stabilized current source together with a standard digital ammeter may be used and the verification carried out by the comparison method.
10.2 Direct comparison method
The connection is shown in Figure 2: a DC standard digital ammeter (or a standard DMM with a DC current function) and the meter under test are connected to the output of a sufficiently stable DC current source. The indication of the standard meter, i.e. the actual value, is IN, and the displayed value of the meter under test is Ix.
Figure 2 Connection for the direct comparison method (DC stabilized current source, standard digital ammeter, DC-DIM under test)
The absolute error of the meter under test is:
Δ = Ix − IN
Similarly, the relative error, expressed as a percentage, is:
γ = Δ/Ix × 100% ≈ Δ/IN × 100%
In general, the standard meter has more display digits than the meter under test. When the ranges of the two are not compatible, a standard shunt may be used. When this method is used, the zero of the standard meter must be checked to ensure its accuracy, and the standard meter must be verified regularly and be within its period of validity.
10.3 Standard instrument method
In principle, any method in which a standard resistor converts the current into a voltage that is then measured with a standard voltage-measuring instrument can be used to verify a digital ammeter. In the wiring diagram of Figure 3, the standard resistor RN is connected in series with the digital ammeter under test, and a DC potentiometer measures the voltage across the potential terminals of RN; the ratio of the potentiometer indication to the value of the standard resistance is the actual value of the loop current. If the measured voltage across the potential terminals of the standard resistor is UN, the actual value of the standard resistance is RN, and the indication of the meter under test is Ix, then the actual value of the loop current is:
IN = UN / RN
The absolute error of the meter under test is:
Δ = Ix − UN/RN
Similarly, the relative error of the meter under test, expressed as a percentage, is:
γ = Δ/Ix × 100% ≈ Δ/IN × 100%
When this method is used, not only must the error of the actual value of the standard resistance RN meet the requirements, but a suitable resistance value must also be chosen so that the current through it does not exceed its rated current and a voltage that can be measured accurately is obtained. Care must be taken that the voltage drop across RN is below the measurement upper limit of the potentiometer used, and that the voltage to be measured lies within the effective measuring range of the potentiometer.
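A sketch of the 10.3 computation and checks, using assumed (hypothetical) values for the standard resistor and the potentiometer:

```python
def standard_instrument_method(u_n, r_n, i_x, resistor_rated_current, pot_upper_limit):
    """Clause 10.3: the loop current is IN = UN / RN; the error of the meter
    under test is Ix - IN.  Also check the two precautions of the clause."""
    i_n = u_n / r_n
    if i_n > resistor_rated_current:
        raise ValueError("current exceeds the rated current of the standard resistor")
    if u_n > pot_upper_limit:
        raise ValueError("voltage drop exceeds the upper limit of the potentiometer")
    return i_n, i_x - i_n

# Hypothetical example: RN = 0.1 ohm rated for 3 A, potentiometer upper limit 1.9 V,
# measured UN = 0.10002 V, indication of the meter under test Ix = 1.0004 A.
i_n, delta = standard_instrument_method(0.10002, 0.1, 1.0004, 3.0, 1.9)
print(i_n, delta)
```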
10.4 Standard digital voltmeter method
When a standard DC-DVM is available whose allowable error is 1/3 to 1/5 of that of the DC-DIM under test, it may be used in place of the null-type potentiometer, which makes the verification of the DC-DIM convenient; the connection is shown in Figure 4. With this method, attention must likewise be paid to the value of the standard resistor RN. The main points are: the value of RN should be chosen so that its influence on the loop current is as small as possible while the reading of the standard DC-DVM is as close to its full-scale value as possible; at the same time, the additional error caused by the input resistance of the DC-DVM must be small compared with the allowable error of the meter under test. To satisfy the requirements of 10.3 and 10.4, a set of standard resistors or a standard resistance box is needed.
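A rough illustration, not from the regulation, of the two competing considerations in 10.4; the DVM full scale, DVM input resistance and candidate resistor values are assumptions:

```python
def choose_rn(i_point, dvm_full_scale, dvm_input_resistance, candidates):
    """Pick, from candidate standard-resistor values, the largest one whose
    voltage drop stays within the DVM full scale, and report the relative
    loading error caused by the DVM input resistance shunting RN."""
    usable = [r for r in candidates if i_point * r <= dvm_full_scale]
    r_n = max(usable)                         # largest drop -> reading nearest full scale
    loading_error = r_n / (r_n + dvm_input_resistance)   # fractional error, roughly RN/Rin
    return r_n, i_point * r_n, loading_error

# Hypothetical: 1 A verification point, 2 V DVM full scale, 10 Mohm input resistance.
r_n, u_drop, eps = choose_rn(1.0, 2.0, 10e6, [0.1, 1.0, 10.0])
print(r_n, u_drop, eps)
```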
When larger currents are to be verified, a standard shunt kept in a constant-temperature oil bath may be used; DC-DIMs of 10 A (or 20 A) and above are verified together with their shunt accessories.
11 Verification of the stability error
11.1 The stability error is the error with which the meter maintains its indication (or its zero point) within a specified period of time. It contains two components: fluctuation and drift. According to the length of the period, a distinction is made between the short-term stability error and the long-term stability error.
Figure 4 Standard DC-DVM method (DC stabilized current source, standard DC-DVM, DC-DIM under test)
11.2 The stability error shall be tested under the standard conditions listed in Table 2. The test method is the same as any of the methods described in 10.1 to 10.4; any one of them may be chosen.
11.3 After the DC-DIM has been preheated, first observe the zero drift with the input terminals short-circuited and without any further adjustment of the instrument; then apply a standard current signal and measure the stability error of the indication at the selected point.
11.4 The fluctuation component may be measured by automatic testing, using automatic recording.
11.5 The stability error within a specified period may be measured according to the provisions of the instruction manual and the requirements placed on the meter under test. Any DC-DIM for which a level is to be assigned must be tested for the 24 h stability error and for the one-year long-term stability error.
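Purely illustrative, under the assumption that indications have been logged automatically at regular intervals (clause 11.4): a sketch separating the drift and fluctuation components named in 11.1.

```python
def stability_components(readings):
    """readings: indications of the meter under test logged over the observation
    period at a fixed input.  Drift is taken here as the overall change from the
    first to the last reading; fluctuation as the largest residual about a
    straight-line trend between them."""
    n = len(readings)
    drift = readings[-1] - readings[0]
    trend = [readings[0] + drift * k / (n - 1) for k in range(n)]
    fluctuation = max(abs(r - t) for r, t in zip(readings, trend))
    return drift, fluctuation

# Hypothetical 24 h log (one value per interval) at a fixed 1 A input.
log = [1.00020, 1.00021, 1.00019, 1.00022, 1.00023, 1.00021, 1.00024, 1.00025]
print(stability_components(log))
```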
12 Verification of the linearity error
The property of a DC-DIM of reflecting the measured current uniformly over the measuring range is called linearity; the deviation of the actual transfer curve from the ideal straight line (the reference line) is called the linearity error. The linearity error is generally verified within the basic range, together with the verification of the basic error, and the methods for determining the basic error of a DC-DIM also apply to the verification of the linearity error. For a digital multimeter whose DCV linearity error has already been verified, the linearity of its DC current function need not be verified again; it is sufficient to verify the full-scale point of each range.
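As an illustration only (the choice of a reference line through zero and the full-scale point is an assumption of this sketch, and the data are hypothetical):

```python
def linearity_errors(points, full_scale_actual, full_scale_indicated):
    """points: list of (actual value, indicated value) within the basic range.
    The reference line maps an actual value IN to IN * (full_scale_indicated /
    full_scale_actual); the linearity error at each point is the deviation of
    the indication from that line."""
    slope = full_scale_indicated / full_scale_actual
    return [ix - slope * i_n for i_n, ix in points]

# Hypothetical basic-range data, with the 2 A full-scale point measured separately.
pts = [(0.5000, 0.5002), (1.0000, 1.0003), (1.5000, 1.5003)]
print(linearity_errors(pts, full_scale_actual=2.0000, full_scale_indicated=2.0002))
```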
(IV) Verification of other items
13 Determination of the display capability
The display capability of the digital ammeter under test may be checked during the power-on inspection. Vary the DC current signal from the source and observe whether the displayed value of the meter under test changes continuously. Taking a range with a full-scale display of 1.9999 as an example, the displayed value is stepped through points such as 0.0000, 0.0001, 0.0002, …, 0.0009, 0.0010, 0.0019, 0.0020, …, 0.0089, 0.0090, …, 0.9999, 1.0000, 1.1000, …, 1.9999. If every digit changes in turn without any count being skipped, the display capability meets the requirements. At the same time, check whether any digits of the display are inverted, duplicated or missing segments, and check the decimal point, the polarity indication, the unit symbol, and the full-scale and over-range display, etc.
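A small sketch, assuming a 4½-digit display (1.9999 full scale), that generates a check sequence of the kind listed in clause 13, exercising every value of every digit position; the exact sequence in the regulation may differ:

```python
def display_check_sequence(digits=4, max_leading=1):
    """Yield display codes (in counts of the least significant digit) that step
    every digit position through 0..9, plus the leading digit up to max_leading,
    similar to the check sequence given in clause 13."""
    codes = []
    for position in range(digits):                     # 1, 10, 100, 1000 counts
        step = 10 ** position
        codes.extend(d * step for d in range(10))
    codes.extend(m * 10 ** digits for m in range(max_leading + 1))   # 0.0000, 1.0000
    codes.append((max_leading + 1) * 10 ** digits - 1)               # 1.9999
    return sorted(set(codes))

print([c / 10_000 for c in display_check_sequence()])   # values in A on a 2 A range
```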
14 Determination of the resolution
The resolution is generally determined only at the lowest range, where it is highest, using a standard measuring device of sufficiently high resolution. Make the DC current source output a signal; when the meter under test displays a certain value (which may be zero or any other value), measure the actual (standard) value of this point with the standard device. Then adjust the signal source finely so that the last digit of the meter under test changes by one count, and read the standard indication again. The difference between the two standard indications is the resolution of the meter under test.
15 Determination of the temperature coefficient
For newly manufactured DC-DIMs the temperature coefficient shall be determined when required. After the meter under test has been preheated and adjusted, it is placed in a temperature-controlled chamber with a constant input applied (for example a value near full scale), and the chamber is brought to the specified upper-limit temperature and held for the specified time. Standard equipment meeting the requirements is then used to measure the change of the indication and of the zero point of the meter under test. In accordance with the requirements of the temperature rise-and-fall test, the chamber is first returned to the reference temperature and then gradually lowered to the specified lower-limit temperature, the input being kept constant, and the change of the indication and of the zero point is again recorded with the standard equipment. From the above steps, the difference between the indication of the meter under test at the upper (or lower) limit temperature and its indication at the reference temperature is calculated (where necessary the drift with time during the test is subtracted); this difference is divided by the temperature difference between the upper (or lower) limit and the reference temperature, and the larger value is taken as the temperature coefficient of the meter under test.
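A sketch of the clause-15 arithmetic; the temperature and indication values below are hypothetical:

```python
def temperature_coefficient(ind_ref, ind_upper, ind_lower,
                            t_ref, t_upper, t_lower, drift=0.0):
    """Clause 15: divide the indication change at each temperature limit
    (corrected for drift with time, if necessary) by the temperature
    difference from the reference temperature, and take the larger value."""
    tc_upper = abs(ind_upper - ind_ref - drift) / abs(t_upper - t_ref)
    tc_lower = abs(ind_lower - ind_ref - drift) / abs(t_lower - t_ref)
    return max(tc_upper, tc_lower)

# Hypothetical: reference 20 deg C, limits 40 deg C and 0 deg C, constant 1 A input.
print(temperature_coefficient(ind_ref=1.00020, ind_upper=1.00060, ind_lower=0.99990,
                              t_ref=20.0, t_upper=40.0, t_lower=0.0))
```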
16 Influence of power supply voltage variation
For the first verification and for verification after repair, the influence of power supply voltage variation must be verified. The error caused by power supply voltage variation is an additional error: with the mains supply voltage varied by ±10% of its rated value and all other influence quantities kept unchanged, the change of the indication of the meter shall not exceed the index specified in the technical standard for the meter.
The measurement is carried out at the highest range. An input current of about 0.8 Im is applied to the meter under test, and the supply voltage is adjusted to +10% and −10% of its rated value. After a specified interval at each setting (for example 15 min), the deviation of the indication of the DC-DIM from its indication at the rated supply voltage is calculated, and the maximum deviation is taken as the measurement result.
The above clauses give the verification items and test methods for general conditions. In special cases, other measurement items must be added according to the user's requirements or to the instrument's instruction manual, such as the test of the input characteristics, the test of the interference-rejection capability, the test of the response time, the influence of AC supply frequency variation and of supply waveform distortion, and the insulation resistance and withstand-voltage tests; these shall be specified separately.
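A minimal sketch of the clause-16 computation described above (the readings are hypothetical):

```python
def supply_voltage_influence(reading_nominal, reading_plus10, reading_minus10):
    """Deviation of the indication at +10 % and -10 % supply voltage from the
    indication at rated supply voltage; the larger deviation is the result."""
    deviations = (reading_plus10 - reading_nominal, reading_minus10 - reading_nominal)
    return max(deviations, key=abs)

# Hypothetical indications at 0.8 Im input, read after about 15 min at each setting.
print(supply_voltage_influence(1.60020, 1.60026, 1.60015))
```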
Five Processing of Verification Results and Verification Period
(I) Processing of verification results
17 The verification of a DC-DIM shall have complete original records, and accurate calculations or the necessary mathematical processing shall be performed on the original data (for example, applying the correction values or temperature coefficients of the standard equipment used). Verification records are generally kept for one year.
18 The number of significant digits of the verification data is generally one more than that corresponding to the accuracy level of the meter under test.
19 The data in the verification record shall be rounded according to the rounding rules; the error introduced by rounding shall not exceed 1/5 to 1/10 of the allowable error of the meter under test.
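An illustrative sketch of the clause-19 rounding requirement; the choice of a decimal rounding step is an assumption made here so that the rounding error stays below 1/10 of the allowable error:

```python
import math

def round_record_value(value, allowable_error):
    """Round a recorded value to a decimal step no coarser than 1/10 of the
    allowable error, keeping the rounding error within the clause-19 limit."""
    step = 10 ** math.floor(math.log10(allowable_error / 10))
    return round(value / step) * step

# Hypothetical: allowable error 0.0005 A -> rounding step 0.00001 A.
print(round_record_value(1.000234567, 0.0005))
```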
(II) Determination of verification results
20 Whether the meter under test exceeds its allowable error is judged from the rounded data; the rounded verification data are used to decide whether the meter under test is qualified.
21 From the verification data, calculate the absolute error of the indication, find the point with the largest error, and use the absolute error formula
Δ = ±(a%·Ix + b%·Im)
to judge whether the verification data of the meter under test are qualified. The relative error formula for the reading may also be used, namely:
γ = ±(a% + b%·Im/Ix)
22 For a DC-DIM for which a level is to be assigned, the verification certificate shall give the 24 h basic error and the annual stability error for the same period; at the request of the unit submitting the meter, the errors for other time intervals may also be given.
23 A DC-DIM for which a level is to be assigned shall be graded according to the grading criteria of this regulation, and the accuracy level determined shall be noted on the verification certificate.
24 For a DC-DIM for which no level is to be assigned, such as a meter submitted for verification against ministerial metrological standards or the manufacturer's technical specifications, or an imported meter submitted for acceptance verification, the metrology department shall judge whether the verification data are qualified according to the corresponding technical specifications.
25 Besides the basic error data, the measured results of the other verified technical indicators shall be given on the verification certificate; Appendix 2 gives a format for the verification record.
26 For such DC-DIMs the verification data are generally reported as measured, and no level is assigned.
27 An instrument found to be qualified shall be issued a verification certificate, which shall state the period of validity and bear the official seal.
28 An instrument that is unqualified, or that cannot be verified completely within the specified period, shall be issued a notice of verification results (or a certificate of unqualified verification), which shall give the actual verification results and bear the official seal; such an instrument shall not be used as a standard measuring instrument.
(III) Verification period
29 The verification period of a DC-DIM shall be determined according to the nature of its use.
30 A newly purchased or newly repaired DC-DIM shall be verified before being put into use.
31 A DC-DIM used as a measuring instrument shall be verified periodically. The verification period shall generally not exceed one year; in special circumstances it may be appropriately shortened or extended.
Appendix 1
Block diagram of the verification system for DC digital ammeters
(The diagram shows the standard cells, the DC standard source, the standard resistor, the DC potentiometer, the standard digital voltmeter, the standard digital ammeter and the DC-DIM under test, with the accuracy ratios between them.)
Note: according to the grade of the DC-DIM under test, the verification may use the standard instrument method, the direct comparison method or the DC standard current source method.
Appendix 2
Verification record format for DC digital ammeters
Items of the record: verification source; accuracy level; (I) basic error — actual (standard) value, indication of the meter under test, error; (II) stability error; (III) linearity error; (IV) display capability; (V) resolution; (VI) temperature coefficient; influence of power supply voltage variation.