National Standard of the People's Republic of China
Stereophonic broadcast magnetic recorder
UDC 681.84
GB/T 5440—1985
Published on September 29, 1985 by the State Bureau of Standards
Implemented on June 1, 1986
This standard applies to two-track, two-channel, unidirectional stereophonic broadcast recorders using 6.3 mm wide magnetic tape.

1 Definition of terms
1.1 Short-circuit tape flux* (abbreviated as tape flux)
The magnetic flux that flows through the core of a reproducing head of zero magnetic reluctance and infinite length in close contact with the surface of the magnetic tape.
1.2 Magnetic level**
The short-circuit tape flux per unit track width is called the magnetic level; its unit is nWb/m (r.m.s. value).
1.3 Working magnetic level*
The magnetic level at which the volume meter indicates the 0 VU scale. This standard stipulates that, when recording with the reference frequency signal at tape speeds of 38.1 cm/s and 19.05 cm/s, 255 nWb/m is used as the working magnetic level.
1.4 Reference frequency*
The specific test frequency used when conducting electroacoustic performance tests. This standard stipulates that, at tape speeds of 38.1 cm/s and 19.05 cm/s, 1000 Hz is used as the reference frequency.
1.5 Track*
The trace of magnetization left on the magnetic tape by the recording head.
1.6 Channel*
The path for transmitting electrical signals is called a channel.
The channel that transmits the audio signal to be recorded from the signal source to the tape is called a recording channel. The channel that transmits the recorded audio signal from the tape to the output end of the playback system is called the playback channel.
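As a worked illustration of clauses 1.2 and 1.3 (not part of the standard), the sketch below converts the 255 nWb/m working magnetic level into short-circuit tape flux for an assumed track width; the 2.0 mm track width is only an illustrative figure, since the actual track dimensions are defined in Figure 1.

```python
# Illustrative sketch for clauses 1.2 and 1.3 (not part of the standard).
# Short-circuit tape flux = magnetic level (flux per unit track width) x track width.
working_magnetic_level = 255e-9   # Wb/m (255 nWb/m, working magnetic level, clause 1.3)
assumed_track_width = 2.0e-3      # m -- assumed example value; real dimensions are in Figure 1

track_flux = working_magnetic_level * assumed_track_width
print(f"Short-circuit tape flux at the working level: {track_flux * 1e9:.2f} nWb")
# -> 0.51 nWb for the assumed 2.0 mm track width
```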
2 Classification
2.1 According to mechanical and electroacoustic performance, recorders are divided into two classes: Class A and Class B.
2.2 According to structural form, recorders are divided into two types: fixed and portable (including fixed/portable dual-purpose).

3 Technical requirements
3.1 Reel direction: the tape reel rotates counterclockwise in the playback state.
3.2 Position of the working surface of the tape: facing the centre of the reel.
3.3 Track format
3.3.1 On 6.3 mm wide tape, two-track stereo is recorded in one direction; the guard band width is 2.0 mm or 0.75 mm, with 2.0 mm recommended.
3.3.2 The tape runs from left to right. Viewed from the back of the tape, the upper track (the first track) records the left-channel signal and the lower track (the second track) records the right-channel signal.
* See GB 4013-83 "Terms for sound and video recording".
** See GB 5439-85 "Recording and exchange of stereophonic broadcast programmes (tapes)".
3.3.3 The dimensions, positions, and form and position tolerances of the recording tracks are shown in Figure 1.
[Figure 1: Track dimensions and position of the tracks on the tape]
3.4 Power supply:
AC: 220 V, 50 Hz;
DC: 6 V, 9 V, 12 V, 18 V, 24 V.
3.5 The continuous working time shall be not less than 8 h.

4 Basic parameters
4.1 Recorders are divided into two classes, A and B. Their mechanical properties shall comply with the provisions of Table 1, and their electroacoustic properties shall comply with the provisions of Table 2.

Table 1 Mechanical properties (limits specified for Class A and Class B)
Nominal tape speed (cm/s)
Tape speed error (not worse than, %)
Wow and flutter (not worse than, %, unweighted)
Start time (s)
Rewind time (min)
Stop time in fast forward and rewind (s)
Note: During the test, a 1000 m reel of tape is used for the fixed recorder and a 360 m reel for the portable recorder.

Table 2 Electroacoustic properties (limits specified for Class A and Class B)
Nominal tape speed (cm/s)
Signal-to-noise ratio (dB) (each channel, unweighted): playback channel; recording-playback channel
Total harmonic distortion (not worse than, %) (each channel): playback channel; recording-playback channel
Two-track recording magnetic level, and frequencies used in the amplitude-frequency characteristic bands of the playback and recording-playback channels (characteristics and tolerances are shown in Figure 4)*: playback channel f1, f2, f3 (Hz); recording-playback channel f1, f2, f3, f4 (Hz)
Channel balance (dB) (less than)
Phase difference between channels (tolerance is shown in Figure 6)
Channel crosstalk (dB), 100 Hz~10 kHz
Muting effect (dB) (each channel)
Line input impedance (kΩ)
Line load impedance (Ω)
Maximum line input level (dBm)
Minimum line input level (dBm)
Maximum output level (dBm)
* The frequencies f1, f2, f3 and f4 are the frequencies specified for the amplitude-frequency characteristics.
Note 1: The signal-to-noise ratio of a portable recorder is allowed to be 4 dB worse than the value listed in the table.
Note 2: The maximum output level is reduced by 6 dB.

5 Test method
5.1 Test conditions:
Temperature: 15~35 °C;
Relative humidity: 45%~80%;
Power supply frequency tolerance: ±0.5 Hz;
Power supply voltage tolerance: AC ±3%, DC ±4%.
5.1.1 Magnetic tape used: shall comply with the provisions of the relevant national standards.
5.1.2 Standard test tape: shall comply with the provisions of the relevant national standards.
5.1.3 Test instruments: shall comply with the requirements of Table 3.

Table 3 Test instrument requirements
Audio signal generator:
a. Frequency range: 20 Hz~20 kHz
b. Frequency error: ≤±2%, ±1 Hz
c. Amplitude error: ≤±0.5 dB
d. Harmonic distortion: ≤0.1%
e. Output impedance: 600 Ω
Electronic millivoltmeter:
a. Range: 1 mV~300 V (-60~+50 dBm)
b. Frequency characteristics: 20 Hz~20 kHz ±2%; 20~500 kHz ±8%
c. Measurement error: <±2%
d. Input impedance: ≥500 kΩ
e. Input capacitance: <40 pF
Distortion meter:
a. Range: 0.1%~100%
b. Frequency range: 20 Hz~20 kHz
c. Measurement error: ≤±5%
Wow and flutter meter:
a. Wow and flutter range: 0.03%~3%
b. Start-time range: 1~10 s
c. Measuring frequency: 3150 Hz
d. Indication mode: peak
e. Weighting network frequency characteristic: see Table 4 and Figure 3 of this standard
Bandpass filter:
a. Centre frequency: 1000 Hz
b. Selectivity: attenuation at 1 octave from the centre frequency greater than 30 dB
c. Impedance: 600 Ω
Digital frequency meter:
a. Accuracy: 3×10⁻⁵ ±1 digit
b. Maximum gate time: 10 s
Oscilloscope:
a. Two-trace oscilloscope with electronic switch
b. Frequency range: 0~1 MHz
Tensiometer:
Range: 0~500 g
5.2 Mechanical performance test method
The tests shall be carried out on a full reel of tape. The tape speed error and wow and flutter tests shall be carried out at the head and at the tail of the tape respectively, and the difference between the two values taken. When measuring the tape speed error, the gate time of the digital frequency meter shall be 10 s.
5.2.1 Tape speed error
Definition: the error of the average tape speed over a period of time, in the playback state, relative to the nominal tape speed, expressed as a percentage.
Method: splice a tape speed test tape of not less than 5 m in length to the head and to the tail of a full reel of tape, play it back, measure the output signal frequency f, and calculate the error according to the following formula:
5.2.2 Wow and flutter rate*
Tape speed error = 150×100%
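As a minimal numerical sketch of the calculation in 5.2.1 (not part of the standard), the following assumes a test tape recorded at 3150 Hz; substitute the frequency actually recorded on the tape speed test tape in use.

```python
# Sketch of the tape speed error calculation in 5.2.1.
# Assumption: the tape speed test tape carries a 3150 Hz signal.
f0 = 3150.0          # Hz, frequency recorded on the tape speed test tape (assumed)
f_measured = 3156.3  # Hz, playback frequency read from the digital frequency meter (example)

tape_speed_error = (f_measured - f0) / f0 * 100.0  # percent
print(f"Tape speed error: {tape_speed_error:+.2f}%")  # +0.20%
```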
5.2.2 Wow and flutter*
Definition: the parasitic frequency deviation of a recorded signal caused by irregular movement of the tape, expressed as a percentage of the recorded signal frequency.
Method: for a recorder, the recording-playback wow and flutter is measured; for a machine used exclusively for playback, the playback wow and flutter is measured. During the test, the wow and flutter test tape is spliced to the head and to the tail of a full reel of tape.
5.2.2.1 Playback wow and flutter*
Play the wow and flutter test tape, connect the output to the wow and flutter meter, and read the wow and flutter value. Take the smaller of the values obtained at the head and at the tail of the tape.
5.2.2.2 Recording-playback wow and flutter*
Record a 3150 Hz signal for several minutes at the head and at the tail of the tape, then play each recorded section back three times, measure the wow and flutter, and take the average value; then take the smaller of the two averages.

Table 4 Frequency characteristic of the weighting network for wow and flutter measurement
Frequency (Hz) / Attenuation (dB) / Tolerance (dB):
0.315~0.5 Hz: ±4 dB
0.5~<4 Hz: ±2 dB
4 Hz: ±0 dB
>4~50 Hz: ±2 dB
50~200 Hz: ±4 dB
* See GB 1778-79 "Broadcast recorder" for the definition.
[Figure 3: Weighting network response]
5.2.3 Start-up time*
Definition: the time from operating the start control to the point at which the wow and flutter falls to twice the rated (factory specification) value. Method: record a 3150 Hz signal at the beginning of the tape for about 30 s, then play back the second half of this recording and measure the start-up time with the wow and flutter meter.
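A rough illustration of the criterion in 5.2.3 (not part of the standard): given a series of wow and flutter readings taken at a fixed interval after the start control is operated, the sketch below finds the first time the reading drops to twice an assumed rated value. Both the 0.1% rated value and the sample readings are made-up figures.

```python
# Sketch: determine the start-up time per 5.2.3 from periodic wow-and-flutter
# readings. The rated value (0.1 %) and the readings are assumed examples.
rated_wow_flutter = 0.1             # percent, assumed rated (factory) value
threshold = 2 * rated_wow_flutter   # start-up ends when readings reach twice the rated value

interval_s = 0.1                    # seconds between successive meter readings
readings = [3.0, 1.8, 0.9, 0.45, 0.25, 0.19, 0.15, 0.12, 0.11]  # percent (example data)

start_up_time = next(
    (i * interval_s for i, w in enumerate(readings) if w <= threshold),
    None,  # None if the wow and flutter never reaches the threshold
)
print(f"Start-up time: {start_up_time} s")  # 0.5 s with the sample data
```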
5.2.4 Rewind time*
Definition: the time taken to rewind a specified length of tape at the highest speed. Method: start the rewind mechanism and the timer at the same time; when the tape has been completely rewound, stop the timer.
5.2.5 Stop time in fast forward and rewind
Definition: the time from operating the "stop" control to the complete stop of the tape in the fast forward or rewind state. Method: when the tape has been fast forwarded or rewound to half a reel, operate the "stop" control and start the timer at the same time; when the tape stops, stop the timer.
5.3 Preparation conditions for electroacoustic performance test
5.3.1 Clean the parts in contact with the tape and demagnetize them*.
5.3.2 Connect a load resistor to each channel.
5.3.3 Preset the playback channel (preset order: first the first channel, then the second channel).
5.3.3.1 Calibrate the playback head height.
5.3.3.2 Preset the playback channel with the playback calibration tape. Play the azimuth calibration part, calibrate the playback head position, and calibrate the phase difference between the two channels to the minimum value.
5.3.3.3 Play the reference magnetic level part and adjust the playback volume to an output level of +10 dBm.
5.3.4 Preset the recording-playback channel (preset order: first the first channel, then the second channel).
5.3.4.1 Calibrate the recording head height.
5.3.4.2 Preset the recording-playback channel with the recording-playback channel test tape: using the same frequencies and magnetic level as the azimuth calibration part of the playback calibration tape, calibrate the recording head position and adjust the phase difference between the two channels to the minimum value.
5.3.4.3 Adjust the bias current* according to the recommendations of the tape manufacturer.
5.3.4.4 Record a reference-frequency signal at +10 dBm and adjust the recording volume until the signal recorded on the tape reaches the reference magnetic level.
5.4 Electroacoustic performance test method (test order: first the first channel, then the second channel)
If it is necessary to change the above preset state for a measurement, it shall be restored to the original state after the measurement.
* See the definition in GB 1778-79.
5.4.1 Channel amplitude-frequency characteristics
The amplitude-frequency characteristic measured for each channel shall comply with the tolerances given in Figure 4.
[Figure 4: Amplitude-frequency characteristic and tolerance]
5.4.1.1 Playback channel amplitude-frequency characteristics*
Play the frequency characteristic part of the playback calibration tape and measure the amplitude-frequency characteristic curve.
5.4.1.2 Recording-playback channel amplitude-frequency characteristics
Using an input level of -10 dBm, record and play back the frequencies of the amplitude-frequency characteristic part of the playback calibration tape, and measure the amplitude-frequency characteristic curve.
5.4.2 Signal-to-noise ratio*
Definition: The difference between the playback output signal level and the noise level, expressed in dB. The signal level refers to the output level of the playback calibration tape with the reference magnetic level.
Method:
a. Playback channel signal-to-noise ratio: remove the tape from the playback head, or play a non-magnetic tape, and measure the output noise level. The difference between the playback output signal level defined above and this noise level is the playback channel signal-to-noise ratio.
b. Recording-playback channel signal-to-noise ratio: record the reference frequency at the reference magnetic level for several minutes, then connect a well-shielded 600 Ω resistor to the input and mute (re-record over) part of the recorded section. The difference in output level between the unmuted and muted parts is the recording-playback channel signal-to-noise ratio.
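As an illustrative aid to 5.4.2 (not part of the standard), the sketch below converts two electronic millivoltmeter readings into dBm referred to 0.775 V across 600 Ω and takes their difference; the two voltage readings are invented example values.

```python
import math

# Sketch of the signal-to-noise calculation in 5.4.2.
# 0 dBm corresponds to 0.775 V across 600 ohms; the readings below are examples.
def dbm_600(volts_rms: float) -> float:
    """Convert an r.m.s. voltage across 600 ohms to dBm (0 dBm = 0.775 V)."""
    return 20.0 * math.log10(volts_rms / 0.775)

v_signal = 2.45    # V, playback output while reproducing the reference level part (example)
v_noise = 0.0031   # V, output with the tape removed from the playback head (example)

snr_db = dbm_600(v_signal) - dbm_600(v_noise)   # equals 20*log10(v_signal / v_noise)
print(f"Signal-to-noise ratio: {snr_db:.1f} dB")  # about 58 dB
```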
5.4.3 Total harmonic distortion*
5.4.3.1 Total harmonic distortion of the playback channel
Use the line induction method to add a reference frequency signal to the playback head, so that the output level of the playback amplifier is +10dBm, and use a distortion meter to measure the total harmonic distortion.
5.4.3.2 Total harmonic distortion of recording and playback channel
Record the reference frequency at the reference magnetic level for several minutes, then replay the recorded section and measure the total harmonic distortion with the distortion meter.
5.4.4 Balance of recording-playback channels (level tolerance)
Definition: the difference in output level between the two channels when the left and right channels are recorded and played back at the specified frequency and level, expressed in dB.
Method: feed a 10 dBm signal from the same audio signal generator to both channels at the same time and record the frequencies of the amplitude-frequency characteristic part of the playback calibration tape. Rewind the recorded tape and replay it, measure the output level of each track at each frequency, and calculate the level difference at each point; the differences shall meet the requirements of Table 2.
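The sketch below illustrates the bookkeeping in 5.4.4 (it is not part of the standard): per-frequency output levels of the two tracks are compared and the largest difference checked against a balance limit. The frequencies, levels, and the 2 dB limit are all invented example figures; the actual limit is given in Table 2.

```python
# Sketch of the channel balance evaluation in 5.4.4. All figures below are
# invented examples; the real limit comes from Table 2 of the standard.
balance_limit_db = 2.0  # dB, assumed limit for illustration

# Measured playback output levels (dBm) per test frequency for each track (examples).
levels_track1 = {63: -1.2, 1000: 0.0, 10000: -0.8}   # Hz -> dBm
levels_track2 = {63: -0.4, 1000: 0.3, 10000: -1.9}   # Hz -> dBm

differences = {f: abs(levels_track1[f] - levels_track2[f]) for f in levels_track1}
worst = max(differences.values())

for f, d in differences.items():
    print(f"{f:>6} Hz: level difference {d:.1f} dB")
print("Balance", "meets" if worst <= balance_limit_db else "exceeds",
      f"the assumed {balance_limit_db} dB limit (worst {worst:.1f} dB)")
```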
5.4.5 Phase difference between channels
Definition: when the two channels are fed with in-phase signals, the phase difference between the two channel outputs, expressed in degrees.
5.4.5.1 Phase difference of the playback channel
Play the frequency calibration part of the playback calibration tape, feed the two outputs to the YA and YB inputs of the two-trace oscilloscope, and observe the waveforms at the various frequencies of the calibration tape (see Figure 5). The phase difference between the two channels, expressed in degrees, is calculated according to the following formula:
φ = 360° × a/T
where a is the time displacement between corresponding points of the two waveforms and T is the period of the signal.
*See the provisions of GB1778-79 for the definition.
The test results should comply with the provisions of Figure 6.
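A minimal sketch of the formula in 5.4.5.1 (not from the standard); the time displacement and period below are invented oscilloscope readings.

```python
# Sketch of the phase difference calculation in 5.4.5.1.
# a and T are example oscilloscope readings, not values from the standard.
a = 28e-6   # s, time displacement between corresponding points of the two traces
T = 1e-3    # s, period of the test signal (1 kHz in this example)

phase_difference_deg = 360.0 * a / T
print(f"Phase difference: {phase_difference_deg:.1f} degrees")  # 10.1 degrees
```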
5.4.5.2 Phase difference of recording and playback channels
Feed the same in-phase signal from one audio signal generator to the inputs of both channels, record and play back at the frequencies of the frequency characteristic part of the playback calibration tape, and feed the two outputs to the YA and YB inputs of the two-trace oscilloscope to observe the phase difference at each frequency. The calculation method and the specified tolerance are the same as in 5.4.5.1.
5.4.6 Crosstalk between channels
Definition: the ratio of the level of the test signal at the output of one channel to the level of the crosstalk signal at the output of the other channel, expressed in dB.
Method: with both channels in the recording state, feed a signal to one channel so that its output level reaches +10 dBm, and connect a well-shielded 600 Ω resistor to the input of the other channel. After recording for several minutes, rewind the tape and, on playback, measure the playback output level of the channel to which no signal was applied. The ratio of the levels measured at the two channel outputs shall comply with the provisions of Table 2.
5.4.7 Muting effect*
Record a reference-frequency signal at a level 6 dB above the reference level for several minutes, then mute a portion of the recorded signal. Through a bandpass filter whose centre frequency coincides with the reference frequency, measure the difference in level between the unmuted portion and the muted portion, expressed in dB.
5.4.8 Minimum input level*
Definition: the minimum input level that can be recorded to the reference level. Method: set the volume control of the recording amplifier to the maximum position, record at the reference frequency to the reference level, and measure the input level.
5.4.9 Maximum line input level*
Definition: the maximum input level that the line input of the recording amplifier can accept at the specified total harmonic distortion. Method: set the volume control of the recording amplifier to the maximum position and record the reference frequency signal so that the recorded level is 6 dB below the reference level; measure the total harmonic distortion. Then gradually turn down the volume control and increase the input signal, keeping the output level constant, until the total harmonic distortion is twice the original value; measure the input level.
Note: for recorders in which the volume control is located before the preamplifier stage, this characteristic need not be assessed.
5.4.10 Maximum output level*
Definition: the output level at which the total harmonic distortion at the output of the playback amplifier is 1%. Method: using the coil induction method, apply a reference-frequency signal to the playback head and gradually increase the signal until the total harmonic distortion at the output reaches 1%; measure the output level.
* See the definition in GB 1778-79.
6 Inspection rules
In accordance with the provisions of Part 6 of GB 1778-79.

7 Marking, packaging, transportation and storage
In accordance with the provisions of Part 7 of GB 1778-79.
Additional notes:
This standard was proposed by the Ministry of Radio and Television of the People's Republic of China.
This standard was drafted by the Video Recorder Factory of the Ministry of Radio and Television.