JB/T 10389-2002 Reliability Design Review of Fieldbus Intelligent Instruments

Basic Information

Standard ID: JB/T 10389-2002

Standard Name: Reliability Design Review of Fieldbus Intelligent Instruments

Chinese Name: 现场总线智能仪表 可靠性设计评审

Standard category: Machinery Industry Standard (JB)

Status: In force

Date of release: 2002-12-27

Date of implementation: 2003-04-01

Standard classification number

Standard ICS number: Mechanical Manufacturing >> Industrial Automation Systems >> 25.040.40 Measurement and Control of Industrial Processes

Standard Classification Number: Instruments and meters >> Instruments and meters >> N04 Basic standards and general methods

associated standards

Publication information

Publishing house: Mechanical Industry Press

Other information

Focal point unit: National Technical Committee for Industrial Process Measurement and Control Standardization

Publishing department: State Economic and Trade Commission

Introduction to standards:

This standard specifies the procedures, methods and main review contents for reviewing the reliability work items of fieldbus intelligent instruments that comply with the HART, FF, PROFIBUS, LONWORKS and other protocols during the development and production process.


ICS 25.040.40
Machinery Industry Standard of the People's Republic of China
JB/T 10389—2002
Reliability Design Review of Fieldbus Intelligent Instruments
Issued 2002-12-27    Implemented 2003-04-01
Issued by the State Economic and Trade Commission of the People's Republic of China

Contents
Foreword
1 Scope
2 Normative references
3 Terms and definitions
4 General requirements
4.1 Division of the review stages
4.2 Review organization
5 Specific requirements
5.1 Review content and material requirements in the review stages
5.2 Review activity procedures
5.3 Review project content and requirements
6 Review methods
6.1 Review items and scoring standards
6.2 Calculation of evaluation results
6.3 Conclusion
7 Review report content and requirements
7.1 Review report content
7.2 Reliability design analysis report content and requirements
Appendix A (informative appendix) Reliability review report (format)
A.1 Reliability review report cover
A.2 Reliability review report related forms
Table 1 Reliability review team personnel and responsibilities
Table 2 Instrument reliability review items and scoring standards

Foreword
The structure and expression method of this standard are compiled in accordance with the provisions of GB/T1.1-2000. Appendix A of this standard is an informative appendix.
This standard was proposed by the China Machinery Industry Federation.
This standard is under the jurisdiction of the National Technical Committee for Industrial Process Measurement and Control Standardization.
This standard was drafted by Chongqing Industrial Automation Instrumentation Research Institute; Shanghai Industrial Automation Instrumentation Research Institute, Shanghai Automation Instrumentation Co., Ltd., Beijing Yuandong Instrument Co., Ltd., and Chongqing Chuanyi General Factory Co., Ltd. participated in the drafting.
The main drafters of this standard are: Sun Huaiyi, Li Jiajia, Zhu Yinxing, Dong Guoqin, Liu Yuxin, Liu Qin, and Fu Yanzhen.

Reliability Design Review of Fieldbus Intelligent Instruments

1 Scope
This standard specifies the procedures, methods, and main review contents for reviewing the reliability work items of fieldbus intelligent instruments (hereinafter referred to as instruments) that comply with protocols such as HART, FF, PROFIBUS, and LONWORKS during the development and production process.
This standard is applicable to the reliability design review of instruments at the scheme design, design finalization, and production finalization stages. It may also be used as a reference for the reliability review of other instruments.

2 Normative references
The clauses in the following documents become clauses of this standard through reference. For dated references, subsequent amendments (excluding corrigenda) and revisions do not apply to this standard; however, parties to agreements based on this standard are encouraged to investigate whether the latest versions of these documents may be used. For undated references, the latest version applies to this standard.
GB/T 3187 Reliability and maintainability terms
GB/T7828 Reliability design review
GJB 1391 Failure mode, effects and criticality analysis procedures

3 Terms and definitions
The terms and definitions established in GB/T 3187 and the following apply to this standard.
3.1
Software reliability
Software reliability refers to the degree to which the software can provide available services in accordance with the specified design specifications within the specified time and under the specified operating conditions.
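As an illustration only (the standard gives no formula for this definition), one common way to quantify software reliability from operational testing is the Nelson estimate, computed from test runs drawn from the instrument's operational profile; the function name and the numbers below are hypothetical:

```python
# Hedged sketch, not part of JB/T 10389: the Nelson estimate of software
# reliability -- R ~= 1 - f/n, where n test runs drawn from the operational
# profile produced f failures.

def nelson_reliability(runs, failures):
    """Estimate reliability as the observed fraction of failure-free runs."""
    if runs <= 0 or failures < 0 or failures > runs:
        raise ValueError("need runs > 0 and 0 <= failures <= runs")
    return 1.0 - failures / runs

print(nelson_reliability(1000, 3))  # 3 failures in 1000 runs -> 0.997
```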
3.2
Reliability design review
Reliability design review (hereinafter referred to as design review) is an activity in which, to ensure that the design meets the reliability requirements, a review organization composed of experts from relevant fields (design, production, use, etc.) who are not directly involved in the design reviews the product design plan from the perspective of reliability, according to a design and review checklist method determined in advance.
3.3
Reliability work items
Reliability work items are the reliability design report (including prediction and allocation, failure analysis and improvement measures, and the application of reliability design technology in specific circuit and structural design), performance and function test report, environmental test report, reliability growth test report, reliability identification test report, and field trial operation report completed during the instrument development process.
3.4
Reliability Growth Plan
The reliability growth plan specifies the reliability level to be achieved at each main stage of instrument development in order to reach the reliability target value specified in the contract, and is used to supervise and manage reliability growth.

4 General requirements
4.1 Division of review stages
The reliability review of instruments is generally divided into three stages, namely the scheme design stage, the design finalization stage and the production finalization stage.
4.2 Review organization
The management department of the product design unit is responsible for organizing relevant personnel to form a review team for the reliability review of instruments (see Table 1 for review team members and responsibilities). The review team shall put forward evaluation opinions and improvement suggestions according to the review procedures and the review contents of the various reliability work items, and issue a review report.
Table 1 Reliability review team members (5-9 people) and responsibilities
Team leader: make plans; preside over the preliminary review and the review meetings; put forward review conclusions; sign the reliability design review report.
Deputy team leader: assist the team leader in organizing and planning the review; organize the review activities according to the review procedures.
Designer (product): present the design; collect relevant information to complete the reliability design report and failure analysis report; demonstrate the correctness of reliability decisions based on calculation and test data.
Designer (peer): review the performance of the design; analyze and calculate whether the product meets all requirements of the technical agreement and technical task book, and put forward review opinions.
Quality control engineer: check whether the design and test work meets quality control requirements, and ensure the effective implementation of inspection, control and related functions.
Reliability engineer: review the reliability design plan, the reliability design report, and the failure analysis of key parts and the measures taken; ensure that the reliability index meets the requirements of the technical agreement or technical task book.
Standardization engineer: review whether the design drawings and documents meet the standards and relevant specifications.
Production process engineer: review whether the design can be processed and produced economically and reasonably.
Sales and procurement engineer: estimate the market sales prospects of the product, and ensure that purchased parts and materials meet quality control and production schedule requirements.
Metrology and testing engineer: review the correctness of the selection of measuring and test instruments and the rationality of their use.
For other requirements, refer to GB/T 7828.

5 Specific requirements
5.1 Review content and material requirements in the review stages
The reliability review of the instrument may be combined with its technical performance review.
5.1.1 Review at the scheme design stage
Conducted after the overall design scheme of the instrument is completed.
5.1.1.1 Review content
Focus on the overall design concept; the advancement, rationality and feasibility of the design scheme; and the assurance that the design meets the reliability indicators in the contract (or design task book). Review the correctness and effectiveness of the technical measures taken to solve key technologies (i.e. new technologies, new processes, new materials) and weak links, and analyze and review the reliability assurance measures for software functions, interfaces, data structures and other links.
5.1.1.2 Review materials
Design task book; rough reliability prediction and allocation report for the scheme; preliminary research report on key technologies; reliability growth plan; list of key raw materials and components; software requirements analysis manual.
5.1.2 Review at the design finalization stage
Conducted after the prototype of the instrument has been trial-produced and its reliability growth test completed. Through the review, the design-finalization prototype is comprehensively evaluated and its reliability is countersigned.
5.1.2.1 Review Content
Focus on reviewing the rationality of instrument design and the application effect of reliability design technology. Review the test situation and the analysis, treatment and improvement measures of the failures that occurred during the test.
5.1.2.2 Review Materials
Reliability design report (including detailed reliability prediction and allocation, failure analysis and improvement measures, application analysis of software and hardware reliability design technology, etc.); performance and function test report of the whole machine or functional units; environmental adaptability test report; reliability growth test report; on-site trial operation report; quality certification reports for key raw materials and components; and the complete set of design documents and key process documents, etc.
5.1.3 Review at the production finalization stage
Conducted after trial production ends and before the finalized production of the instrument. Through the review, the engineering level achieved by the instrument is confirmed, and its reliability level is certified and countersigned.
5.1.3.1 Review content
Focus on the manufacturability and operability of the instrument. Review the analysis of the test results of small-batch trial production; the correctness and completeness of process documents; quality control and quality inspection methods; and the effectiveness of the design and process improvements proposed during trial production.
5.1.3.2 Review materials
Trial production summary report; full performance test report; environmental test report; reliability verification test report; quality analysis and improvement-measure effectiveness report; and the process documents required for formal production, etc.
5.2 Review activity procedures
5.2.1 Preparation
The management department shall propose review requirements according to the stage of instrument development, organize the review team, and formulate the activity plan and the reliability review item inspection score sheet (see Table 2).
The person in charge of instrument development shall prepare various materials, documents, test reports, prototypes, etc. related to the review content and submit them to the review team 15 days before the formal review.
5.2.2 Pre-review
Based on their division of labor and scope of responsibility, the review team members pre-review the submitted software and hardware materials, record the problems found, and feed them back to the project leader.
5.2.3 Formal review meeting agenda
a) The main designer of the instrument makes a reliability design analysis report;
b) The review team raises questions, discusses issues of disagreement, and points out deficiencies and directions for improvement;
c) Evaluate and score the review content item by item;
d) Study and discuss the review conclusions.
5.2.4 Follow-up management
a) When the evaluation results meet the requirements, write the review report and, after approval by the technical leader in charge, proceed to the next stage; when the evaluation results do not meet the requirements, make improvements and re-review after the improvements are completed;
b) Countermeasures shall be formulated for the problems raised in the review, assigned to responsible persons, and resolved within a time limit; the quality control department shall track and manage them.
5.3 Review project content and requirements
5.3.1 Investigation and analysis report
Clearly define the target requirements, collect user opinions, understand and analyze the technical solutions, reliability levels and reliability countermeasures of similar instruments at home and abroad.
5.3.2 Design Task Book
a) Whether there is a clear reliability design target value;
b) Whether there are countermeasures and measures for achieving the reliability design target value.
5.3.3 Overall design scheme
a) The inheritance and advancement of the scheme; the use of past experience; whether there are innovative breakthroughs; and the degree of standardization and serialization;
b) The rationality of the structure and its environmental adaptability;
c) Whether the overall layout is easy to assemble, simple to operate, and safe to use;
d) The rationality of the circuit design scheme, e.g. whether integrated, adjustment-free and low-connection-count circuits are selected.
5.3.4 Reliability allocation and prediction
a) Draw the functional block diagram and reliability block diagram of the instrument, and establish the reliability mathematical model of the instrument.
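For an instrument whose units must all work, the reliability mathematical model referred to here is typically a series model, in which unit failure rates add and the system MTBF is the reciprocal of the system failure rate. The sketch below is an illustration only; the unit names and failure rates are invented, not taken from the standard:

```python
# Hypothetical series reliability model (illustrative, not prescribed by
# the standard). Failure rates are in failures per 10^6 hours.

unit_lambdas = {
    "sensor": 2.0,
    "ad_converter": 1.5,
    "mcu": 1.0,
    "fieldbus_interface": 1.25,
    "power_supply": 2.25,
}

# In a series model the system failure rate is the sum of the unit rates.
lambda_sys = sum(unit_lambdas.values())   # per 10^6 h
mtbf_hours = 1e6 / lambda_sys             # system MTBF in hours

print(lambda_sys, mtbf_hours)  # 8.0 125000.0
```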
b) At which level has reliability allocation been carried out; whether the allocation method and procedure are correct, reasonable and feasible; whether the design was modified if problems were found during allocation or the index requirements could not be met; whether the reliability prediction results meet the reliability design target value in the contract;
c) Whether the prediction for each functional unit meets its allocated value; whether a rough reliability prediction was carried out using the rapid prediction method at the scheme stage; whether a detailed reliability prediction was carried out using the stress analysis method at the design finalization stage; the source of the failure-rate data used for prediction (supplier or handbook) and the correctness of the prediction method.
5.3.5 Failure analysis
For specific analysis procedures, please refer to GJB1391.
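GJB 1391 follows the FMECA approach of MIL-STD-1629-type procedures, in which a quantitative analysis can assign each failure mode a criticality number Cm = beta * alpha * lambda_p * t. The sketch below is a hedged illustration of that computation; all modes and values are invented:

```python
# Hedged illustration of a quantitative FMECA criticality computation in
# the style of MIL-STD-1629-type procedures. All values are hypothetical.

failure_modes = [
    # (mode, beta = probability the failure effect occurs,
    #  alpha = failure mode ratio,
    #  lambda_p = part failure rate per 10^6 h,
    #  t = operating time in hours)
    ("output stuck high", 1.0, 0.4, 2.0, 1000.0),
    ("output drift",      0.5, 0.6, 2.0, 1000.0),
]

# Criticality number per mode: Cm = beta * alpha * lambda_p * t;
# the item criticality is the sum over its failure modes.
mode_criticalities = [b * a * lam * t for _, b, a, lam, t in failure_modes]
item_criticality = sum(mode_criticalities)
print(item_criticality)  # 1400.0
```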
Mainly review the methods and procedures of the failure analysis, the depth of the analysis, and the analysis charts and tables used:
a) Failure Mode, Effects and Criticality Analysis (FMECA);
b) Fault Tree Analysis (FTA).
Whether the analysis is quantitative or qualitative; whether it is conducted at the component level or at the functional unit level; whether a failure analysis has been conducted on the key circuits; whether there is a report on improvement measures resulting from the failure analysis.
5.3.6 Application of reliability design technology (i.e. measures to improve reliability in the design)
5.3.6.1 Component selection and management
Whether components work in a critical state; whether they meet the failure rate requirements allocated from the reliability index; whether there are control methods for component quality.
Whether preferred components are used, selected according to the components listed in the relevant standards, or selected according to historical data or usage information from similar applications.
5.3.6.2 Derating design
The rationality of the derating; the derating policy for key components; whether other protection measures or local redundancy design are considered for parts with high failure rates.
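A derating review of this kind can be automated as a simple stress-versus-rating check. The parts, ratings and derating factors below are hypothetical; the standard does not prescribe specific factors:

```python
# Hypothetical derating check: each component's applied stress must not
# exceed its rating times a derating factor. All values are invented.

parts = [
    # (name, applied stress, rated value, derating factor)
    ("R12", 0.10, 0.25, 0.5),   # resistor power, W
    ("C3",  12.0, 16.0, 0.8),   # capacitor voltage, V
    ("Q1",  0.45, 0.50, 0.7),   # transistor power, W
]

def derating_ok(applied, rated, factor):
    """True if the applied stress is within the derated limit."""
    return applied <= rated * factor

for name, applied, rated, factor in parts:
    status = "OK" if derating_ok(applied, rated, factor) else "OVERSTRESSED"
    print(name, status)
```

In this example R12 and C3 pass, while Q1 exceeds its derated limit and would be flagged in the review.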
5.3.6.3 Thermal design
Whether measures are taken to reduce power consumption and heat sources; whether venting and convection are reasonable; whether thermal resistance is reduced, heat conduction enhanced, and heat sinks used; whether insulation measures are taken for heat-sensitive components, and whether heat sources are reasonably distributed and dispersed.
5.3.6.4 Tolerance and drift design
In the circuit design, whether statistical circuit analysis of component characteristic changes is considered; analysis of the impact of component deviation, aging and drift; tolerance analysis; whether compensation is made for parameter changes to achieve stable performance; whether the matching of mechanical structural parts and the accuracy of transmission parts have been predicted; etc.
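A worst-case tolerance analysis of the kind asked for here can be sketched for a simple resistor divider; the component values and tolerances below are hypothetical:

```python
# Hypothetical worst-case tolerance analysis for a resistor divider,
# illustrating the tolerance/drift check. Values are invented.

def divider_ratio(r1, r2):
    """Output ratio of a divider: Vout/Vin = R2 / (R1 + R2)."""
    return r2 / (r1 + r2)

tol = 0.01  # 1 % resistors
nominal = divider_ratio(10_000.0, 10_000.0)
# Worst cases: one resistor at +tol, the other at -tol.
worst_low  = divider_ratio(10_000.0 * (1 + tol), 10_000.0 * (1 - tol))
worst_high = divider_ratio(10_000.0 * (1 - tol), 10_000.0 * (1 + tol))

print(nominal)                                    # 0.5
print(round(worst_low, 4), round(worst_high, 4))  # 0.495 0.505
```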
5.3.6.5 Electromagnetic compatibility design
Whether analysis of interference sources and interference transmission paths has been carried out; whether measures to eliminate or reduce interference have been taken; whether measures to improve anti-interference capability have been taken; etc.
Whether a reasonable grounding system has been designed; whether the circuit layout, printed circuit board arrangement and whole-machine wiring are reasonable; whether shielding, filtering, current limiting, and transient and overstress protection measures are correctly used.
5.3.7 Software reliability design
Mainly review the reliability activities carried out at each stage of the software life cycle:
a) Requirements/specification stage: whether the requirements of the instrument have been fully and accurately understood, and whether all compiled requirements are consistent with the system description, clear, correct, complete and maintainable;
b) Design stage: focus on checking the functions of the software, the module interfaces, and the degree to which the software is assured to achieve the system requirements;
c) Coding stage: check and analyze whether the code correctly implements the design;
d) Testing stage: which level of testing has been carried out (component-level and/or system-level); through testing, confirm whether the system can ultimately meet all requirements of the specification, and whether it meets the functional and design requirements specified for the components and the system;
e) Maintenance stage: after the software is put into operation, continue reliability activities, correct the errors found, and ensure the operation of the system.
5.3.8 Mechanical structure and packaging design
a) Whether the structural layout is reasonable, firm, stable in its center of gravity, and resistant to vibration and impact;
b) Whether the connection, assembly and interchange of mechanical and electronic parts are easy, and whether they suit assembly-line production and repair;
c) Whether the parts are easy to assemble, firmly installed, convenient to debug, and consistent.
5.3.9 Environmental resistance design
a) Whether effective environmental resistance design has been carried out; whether components and materials with high environmental resistance have been selected;
b) Whether effective environmental protection measures have been adopted to reduce the environmental stress on components to a level at which higher reliability can be achieved.
5.3.10 Human-machine engineering design
a) Whether there are interlocks to prevent misoperation, structures to prevent misassembly, and alarm displays when faults occur;
b) Whether the signs are concise, eye-catching and easy to understand;
c) Whether the product manual meets the specified requirements.
5.3.11 Reliability assessment tests and evaluation
Qualification certificates meeting the acceptable reliability indicators shall be provided in accordance with the requirements of the reliability growth plan at each stage of instrument development.
a) Whether the performance and function test results meet the requirements of the relevant standards;
b) Reliability growth test analysis (applicable to the prototype identification stage):
1) whether the results meet the target value of the reliability growth plan for this stage;
2) whether the design defects exposed in the growth test are used to continuously improve the design and the reliability of the instrument during the development stage;
c) Reliability verification test (applicable to the batch production stage):
1) verify whether the reliability index of the instrument meets the requirements specified in the contract;
2) whether the test is carried out in accordance with the specified reliability verification test method;
3) whether there is a reliability data collection method (including data record tables, fault reports, analysis requirements, etc.);
d) On-site trial operation test (applicable to the prototype and batch production stages):
1) whether there are fault recording, analysis, and statistical calculation methods for on-site operation;
2) whether there are on-site user evaluation opinions.
5.3.12 Comprehensive evaluation
A comprehensive evaluation is made of the design concept of the instrument; the methods, means, depth, breadth and effect of the reliability design; the adequacy of the reliability testing and the degree to which faults have been eliminated; and the reliability assurance of the product during design and manufacturing.

6 Review methods
6.1 Review items and scoring criteria
The review of the instrument is conducted by scoring the contents of the review checklist item by item. The review items and scoring criteria are shown in Table 2.
The review team reviews each item in the table and scores it according to the following requirements:
a) For items 1 to 8, according to the requirements of each review content:
full marks for complete implementation;
80% for implementation of most of the content;
60% for partial implementation;
40% for implementation of only a small part;
0 marks for no implementation at all.
b) For item 9:
1) Performance and function test:
full marks for fully meeting the specification requirements;
80% when the major performance and functions meet the requirements;
60% when part of the performance and functions meet the requirements;
40% when most of the performance and functions do not meet the requirements;
0 marks for meeting none of the requirements.
2) Reliability qualitative requirements: the reliability technical report shall include reliability prediction, allocation, failure analysis, and the application of reliability design technology (covering both software and hardware).
3) Reliability growth test:
full marks for exceeding the reliability growth plan requirements;
80% for basically meeting the reliability growth plan requirements;
40% for failing to meet the reliability growth plan requirements;
0 marks for not conducting reliability growth analysis.
4) On-site operation test:
full marks for fully meeting the requirements of on-site users;
80% for basically meeting the requirements of on-site users;
40% for failing to meet the requirements of on-site users;
0 marks for not conducting on-site operation.
5) Reliability verification test: applicable only to products in continuous batch production, evaluated according to the provisions of the contract or factory standards.
Note: If the review is conducted at the design finalization stage of the instrument, no reliability verification test is required, and the full score for the reliability growth test is 10 marks.

Table 2 Instrument reliability review items and scoring standards
Investigation and research
Design task book (or contract)
Overall design plan
Reliability design report: reliability block diagram; reliability model; reliability allocation; reliability prediction
Failure analysis
Component selection; derating design; thermal design; tolerance and drift design; electromagnetic compatibility design
Software reliability
Mechanical structure and packaging design
Environmental resistance design
Human-machine engineering design
Reliability assessment and evaluation: performance and function test; reliability growth test; on-site operation test; reliability verification test
Comprehensive evaluation

6.2 Calculation of evaluation results
a) A total score of 80 points or more is excellent;
b) a total score of 70 points or more is good;
c) a total score of 60 points or more is medium;
d) a total score below 60 points is poor.
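The grade thresholds of 6.2 can be applied mechanically once the Table 2 items have been scored. In the sketch below the thresholds and grade labels follow 6.2, while the per-item scores are hypothetical:

```python
# Grading rule from 6.2 of the standard; per-item scores are hypothetical.

def grade(total_score):
    """Map a total review score to the grade defined in 6.2."""
    if total_score >= 80:
        return "a) excellent"
    elif total_score >= 70:
        return "b) good"
    elif total_score >= 60:
        return "c) medium"
    return "d) poor"

item_scores = [8, 6, 9, 12, 7, 10, 6, 5, 14, 6]  # hypothetical Table 2 scores
total = sum(item_scores)
print(total, grade(total))  # 83 a) excellent
```

Per 6.3, results a), b) and c) pass the review, while d) fails and requires rework and re-review.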
6.3 Conclusion
If the evaluation result is a), b) or c), the instrument is considered to meet the requirements and passes the review; improvements raised in the review must be made on schedule.
If the evaluation result is d), the instrument is considered unqualified and fails the review. Reliability work that has not been done shall be supplemented, or the design modified. For the problems raised in the review, countermeasures shall be formulated, assigned to responsible persons, and resolved within a time limit. After the problems are solved, the review shall be repeated until it is passed.

7 Review report content and requirements
7.1 Review report content
Based on the review results, the review team shall prepare a review report, which shall include:
a) list of review team members and their division of labor;
b) design objectives and the levels to be achieved;
c) reviewed items and inspection conclusions;
d) review conclusions or suggestions;
e) memorandum of differing opinions.
7.2 Contents and requirements of the reliability design analysis report
The reliability design analysis report shall be prepared by the project leader and shall include:
a) design basis, objectives and the levels to be achieved;
b) main features of the design and descriptions of improvements;
c) reliability analysis and test reports completed at this stage;
d) analysis of and countermeasures for important issues and weak links;
e) reliability design analysis conclusions;
Appendix: design and test data and relevant original data submitted for review.
Appendix A
(Informative Appendix)
Reliability review report (format)

A.1 Reliability review report cover

Reliability review report
Project number:
Project name:
Model specification:
Development unit:
Review date:
Industrial Process Control System Product Reliability Technology Center
Year Month Day
A.2 Reliability review report related forms
Project number
Product name
Reliability target value
Suggestions and opinions
Signature of reviewer
Development unit
Model specification
Seal of review unit
JB/T10389—20021 Review Items and Scoring Criteria
The review of the instrument is conducted by scoring the review checklist item by item. The review items and scoring criteria are shown in Table 2.
The review team reviews each item in the table and scores it according to the following requirements: a) For the evaluation of items 1 to 8, according to the requirements of each review content: full marks for complete implementation;
80% for only a small part of the implementation;
60% for partial implementation;
40% for most of the implementation;
0 points for complete implementation.
b) Evaluation of Item 9:
1) Performance and functional test;
Full marks for fully meeting the requirements of the specification;
80% for meeting the requirements for major performance and functions; 60% for meeting the requirements for part of the performance and functions; 40% for not meeting the requirements for most of the performance and functions; 0 marks for not meeting the requirements at all.
2) Qualitative requirements for reliability: The reliability technical report should include reliability prediction, allocation, failure analysis, and application of reliability design technology (including software and hardware). 3) Reliability growth test:
Full marks for exceeding the requirements of the reliability growth plan; 80% for basically meeting the requirements of the reliability growth plan; 40% for not meeting the requirements of the reliability growth plan: 0 marks for not conducting reliability growth analysis.
4) On-site operation test:
Full score for fully meeting the requirements of on-site users; 80% for basically meeting the requirements of on-site users; 40% for failing to meet the requirements of on-site users; 0 score for not being put into operation on site.
5) Reliability verification test: Applicable only to continuous batch products, and evaluated according to the provisions of the contract or factory standards. Note: If the review is conducted at the design finalization stage of the instrument, the reliability verification test is not required, and the reliability growth test has a full score of 10 points. Table 2 Instrument reliability review items and scoring standards Serial number
Investigation and research
Design task book (or contract)
Overall design plan
Reliability design report
Reliability block diagram
Reliability model
Reliability allocation
Reliability prediction
JB/T 10389—2002
6.1 Review Items and Scoring Criteria
The review of the instrument is conducted by scoring the review checklist item by item. The review items and scoring criteria are shown in Table 2.
The review team reviews each item in the table and scores it according to the following requirements:
a) Items 1 to 8 are evaluated against the requirements of each review content:
full marks for complete implementation;
80% for implementation of most of the content;
60% for partial implementation;
40% for implementation of only a small part;
0 points for no implementation.
b) Evaluation of item 9:
1) Performance and function test:
full marks for fully meeting the requirements of the specification;
80% for meeting the requirements for the main performance and functions;
60% for meeting the requirements for part of the performance and functions;
40% for failing to meet the requirements for most of the performance and functions;
0 marks for not meeting the requirements at all.
2) Qualitative requirements for reliability: the reliability technical report should include reliability prediction, reliability allocation, failure analysis, and the application of reliability design techniques (covering both software and hardware).
3) Reliability growth test:
full marks for exceeding the requirements of the reliability growth plan;
80% for basically meeting the requirements of the reliability growth plan;
40% for failing to meet the requirements of the reliability growth plan;
0 marks if no reliability growth test was conducted.
4) On-site operation test:
full score for fully meeting the requirements of on-site users;
80% for basically meeting the requirements of on-site users;
40% for failing to meet the requirements of on-site users;
0 marks if the instrument was not put into operation on site.
5) Reliability verification test: applicable only to continuous batch products; evaluated according to the provisions of the contract or factory standards.
Note: If the review is conducted at the design finalization stage of the instrument, the reliability verification test is not required, and the reliability growth test carries a full score of 10 points.
Table 2 Instrument reliability review items and scoring standards
Review items:
Investigation and research;
Design task book (or contract);
Overall design plan;
Reliability design report: reliability block diagram, reliability model, reliability allocation, reliability prediction, failure analysis;
Component selection and derating design, thermal design, tolerance and drift design, electromagnetic compatibility design;
Software reliability;
Mechanical structure and packaging design;
Environmental resistance design;
Human-machine engineering design;
Reliability assessment and evaluation: performance and function test, reliability growth test, on-site operation test, reliability verification test;
Comprehensive evaluation.
Each item is scored and entered in the "Actual score" column of the table.
6.2 Calculation of evaluation results
a) A total score of 80 points or more is excellent;
b) a total score of 70 points or more is good;
c) a total score of 60 points or more is medium;
d) a total score below 60 points is poor.
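The scoring arithmetic above (per-item percentages of full marks, then a total score mapped to a grade by 6.2) can be sketched as follows. This is a minimal illustration only: the item names, full-mark weights, and implementation fractions below are hypothetical examples, since Table 2 in the complete standard defines the actual weights.

```python
# Hedged sketch of the review scoring procedure; all numeric weights
# and the chosen implementation fractions are hypothetical examples.

def grade(total_score: float) -> str:
    """Map a total score to the grade defined in clause 6.2."""
    if total_score >= 80:
        return "a) excellent"
    if total_score >= 70:
        return "b) good"
    if total_score >= 60:
        return "c) medium"
    return "d) poor"

# Each review item: (full marks, fraction awarded per the criteria,
# e.g. 1.0 = complete implementation, 0.8 = most, 0.6 = partial,
# 0.4 = only a small part, 0.0 = none).
items = {
    "Reliability design report": (20, 0.8),       # hypothetical weight
    "Software reliability": (15, 1.0),            # hypothetical weight
    "Performance and function test": (25, 0.6),   # hypothetical weight
    "Reliability growth test": (10, 1.0),         # full 10 points, per the note
    "Comprehensive evaluation": (30, 0.8),        # hypothetical weight
}

# "Actual score" per item = full marks x awarded fraction; sum and grade.
total = sum(full * frac for full, frac in items.values())
print(total, grade(total))  # → 80.0 a) excellent
```

A review conducted at the design finalization stage would simply drop the reliability verification test row and score the growth test out of its full 10 points, as the note above prescribes.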
6.3 Conclusion
If the evaluation result is a), b), or c), the instrument is considered to meet the requirements and passes the review; improvements raised in the review must be made on schedule.
If the evaluation result is d), the instrument is considered unqualified and fails the review. Reliability work that has not been done should be supplemented, or the design should be modified. For the problems raised in the review, countermeasures should be formulated, responsibility assigned to specific persons, and the problems solved within a time limit. After the problems are solved, the review shall be repeated until it is passed.
7 Contents and requirements of the review report
7.1 Contents of the review report
Based on the review results, the review team shall prepare a review report, which shall include the following:
a) list of review team members and division of labor;
b) design objectives and the level to be achieved;
c) reviewed items and inspection conclusions;
d) review conclusions or suggestions;
e) memorandum of differing opinions.
7.2 Contents and requirements of the reliability design analysis report
The reliability design analysis report shall be prepared by the project leader and shall include the following:
a) design basis, objectives and the level to be achieved;
b) main features of the design and notes on improvements;
c) reliability analysis and test reports completed at this stage;
d) analysis of and countermeasures for important problems and weak links;
e) reliability design analysis conclusions;
f) appendix: design and test data and relevant original data submitted for review.
Appendix A
(Informative Appendix)
Reliability review report (format)
A.1 Cover of the reliability review report
Reliability review report
Project number:
Project name:
Model specification:
Development unit:
Review date: Year Month Day
Industrial Process Control System Product Reliability Technology Center
A.2 Forms related to the reliability review report
Form fields: project number; product name; model specification; development unit; reliability target value; suggestions and opinions; signature of reviewer; seal of the review unit.