
10.72. Verification should include the following techniques:

  1. Manual examinations such as reviews, walk-throughs, inspections, and audits,

  2. Static analysis of the source code, and

  3. Dynamic testing.

10.73. Static analysis should be performed on the final version of the software.

10.74. The static analysis techniques used will differ according to the system's importance to safety and may include verification of compliance with design and coding standards and constraints; control, data, and information flow analysis; symbolic execution; and formal code verification.
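
As an illustration of one such technique, the following is a minimal sketch of a coding-standard compliance check performed by static analysis, here over Python source; the rule enforced ("no bare except clauses") is a hypothetical project coding constraint, not a requirement of this guide.

    import ast

    def find_bare_excepts(source: str, filename: str = "<module>"):
        """Scan source statically (without executing it) and report
        every bare 'except:' clause, which the hypothetical coding
        standard forbids because it silently swallows all errors."""
        tree = ast.parse(source, filename=filename)
        return [(filename, node.lineno)
                for node in ast.walk(tree)
                if isinstance(node, ast.ExceptHandler) and node.type is None]

    fragment = "try:\n    read_sensor()\nexcept:\n    pass\n"
    print(find_bare_excepts(fragment))  # [('<module>', 3)]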

10.75. All non-functional requirements implemented in software (e.g., numerical precision, timing, and performance) should be tested.
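
A minimal sketch of testing two such non-functional requirements follows; the function compute_setpoint, the 1e-9 precision tolerance, and the 10 ms timing budget are all hypothetical placeholders for project-specific values.

    import math
    import time

    def compute_setpoint(reading: float) -> float:
        # Hypothetical stand-in for the computation under test.
        return reading * 1.5 + 0.25

    def test_numerical_precision():
        # Hypothetical requirement: result within 1e-9 of the reference value.
        assert math.isclose(compute_setpoint(2.0), 3.25, rel_tol=0.0, abs_tol=1e-9)

    def test_response_time():
        # Hypothetical requirement: the computation completes within 10 ms.
        start = time.perf_counter()
        compute_setpoint(2.0)
        elapsed = time.perf_counter() - start
        assert elapsed < 0.010, f"took {elapsed * 1000:.3f} ms"

    test_numerical_precision()
    test_response_time()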

10.76. Operating experience should be used to identify anomalies for correction, and to provide further confidence in the software and its dependability.

10.77. Operating experience can supplement, but cannot replace, other verification techniques.

10.78. See paragraphs 8.152 to 8.168 for guidance relevant to the use of tools for software verification and analysis.

10.79. For software implementation, a test strategy (e.g., bottom-up or top-down) should be determined.
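
A minimal sketch of the difference between the two strategies, assuming a hypothetical two-layer design in which trip_decision (upper layer) calls read_channel (lower layer):

    # Hypothetical two-layer design: the upper layer calls the lower layer.
    def read_channel(channel: int) -> float:
        raise NotImplementedError  # lower layer not yet integrated

    def trip_decision(channel: int, limit: float, read=read_channel) -> bool:
        return read(channel) > limit  # upper layer under test

    # Top-down: test the upper layer first, stubbing the lower layer.
    def stub_read(channel: int) -> float:
        return 120.0  # canned value standing in for the real channel

    assert trip_decision(0, limit=100.0, read=stub_read) is True

    # Bottom-up would instead test read_channel first through a test
    # driver, then integrate trip_decision on top of the verified layer.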

10.80. The test case specifications should ensure adequate testing of:

  1. Interfaces (such as module-module, software-hardware, system boundary);

  2. Data passing mechanisms and interface protocols;

  3. Exception conditions;

  4. All ranges of input variables (using techniques such as equivalence class partitioning and boundary value analysis, as illustrated in the sketch after this list); and

  5. All modes of system operation.
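
The following is a minimal sketch of item 4 above; the validator accept_pressure and its valid range of 0 to 150 are hypothetical.

    def accept_pressure(value: float) -> bool:
        # Hypothetical input validator: valid range is 0..150 inclusive.
        return 0.0 <= value <= 150.0

    # Equivalence classes: below range, within range, above range.
    # Boundary values: just outside, exactly on, and just inside each limit.
    cases = [
        (-0.1, False), (0.0, True), (0.1, True),      # lower boundary
        (75.0, True),                                 # interior representative
        (149.9, True), (150.0, True), (150.1, False), # upper boundary
    ]
    for value, expected in cases:
        assert accept_pressure(value) == expected, value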

10.81. It should be possible to trace each test case using a traceability matrix showing the linkage between software requirements, design, implementation, and testing.
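
A minimal sketch of such a matrix as a checkable data structure; all requirement, design, module, and test-case identifiers are hypothetical:

    # Each requirement is linked to its design element, implementing
    # module, and the test cases that exercise it.
    traceability = {
        "SRS-001": {"design": "SDD-4.2", "module": "trip_logic.c", "tests": ["TC-101", "TC-102"]},
        "SRS-002": {"design": "SDD-4.3", "module": "voter.c",      "tests": ["TC-103"]},
        "SRS-003": {"design": "SDD-5.1", "module": "display.c",    "tests": []},
    }

    # A gap in the matrix is immediately visible as an empty test list.
    untested = [req for req, row in traceability.items() if not row["tests"]]
    print("Requirements with no linked test case:", untested)  # ['SRS-003']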

10.82. To facilitate regression testing, test plans should ensure that tests are repeatable.

10.83. It is also desirable to minimize the human intervention required for repeated tests.
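
A minimal sketch of both points: randomness is pinned by an explicit seed so the test is repeatable, and the expected result is checked by the script itself rather than by a human reading the output (the function and values are hypothetical):

    import random

    def noisy_average(samples, rng):
        # Hypothetical function under test; the random source is injected
        # so the test, not the environment, controls it.
        return sum(s + rng.gauss(0, 0.01) for s in samples) / len(samples)

    def test_noisy_average_repeatable():
        rng = random.Random(42)          # fixed seed: identical runs every time
        result = noisy_average([1.0, 2.0, 3.0], rng)
        assert abs(result - 2.0) < 0.05  # verified by the script, not by eye

    test_noisy_average_repeatable()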

10.84. GS-G-3.1, Ref. [5], provides guidance for ensuring the suitability of measuring and test equipment used for testing.

10.85. The test case specifications and effectiveness should be reviewed and any shortfalls against the targets in the verification plan should be resolved or justified.

10.86. Testing personnel should be independent from the development team.

10.87. For systems other than safety systems, the independence requirement applies to at least one member of the testing team, rather than to all members.

10.88. All I&C system outputs should be monitored during the verification and any variation from the expected results should be investigated. The results of the investigation should be documented.
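
A minimal sketch of comparing monitored outputs against expected results and turning each variation into an investigation record; the signal names and values are hypothetical:

    expected = {"trip_signal": 1, "alarm": 0, "setpoint_out": 3.25}
    actual   = {"trip_signal": 1, "alarm": 1, "setpoint_out": 3.25}

    # Any variation from the expected results becomes a documented
    # investigation item rather than being silently discarded.
    for name, want in expected.items():
        got = actual.get(name)
        if got != want:
            print(f"INVESTIGATE {name}: expected {want}, observed {got}")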

10.89. Any shortfall in the verification results against the verification plan (e.g., in terms of coverage achieved) should be resolved or justified.

10.90. Detected errors should be analyzed for cause and corrected under the control of agreed modification procedures and regression tested as appropriate.

10.91. Records of the numbers and types of anomalies discovered should be maintained, reviewed for their insight into the development process, and used to implement appropriate process improvements for the benefit of the current and future projects. (See GS-G-3.1, Ref. [5] paragraphs 6.50-6.77 and GS-G-3.5, Ref. [6] paragraphs 6.42-6.69.)

10.92. Verification and analysis documentation should provide a coherent set of evidence that the products of the development are complete, correct, and consistent.

10.93. The verification results, including test records, should be documented, maintained, and made available for quality assurance audits and third party assessments.

10.94. For safety systems, the documentation of the verification results should be traceable to and from the test plans and indicate which results failed to meet expectations and how these were resolved.

10.95. Test coverage should be clearly documented.
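
A minimal sketch of producing documented coverage evidence with the coverage.py package (assuming it is installed; the test module name trip_logic_tests is hypothetical):

    import coverage

    cov = coverage.Coverage()
    cov.start()
    import trip_logic_tests   # hypothetical module whose import runs the tests
    cov.stop()
    cov.save()
    cov.report()                                   # per-file statement coverage
    cov.html_report(directory="coverage_report")   # browsable record for audits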

10.96. For safety systems, traceability between each test and the related functional requirement should be documented.

10.97. Test documentation should be sufficient to enable the testing process to be repeated with confidence of achieving the same results.

PRE-DEVELOPED SOFTWARE

10.98. Pre-developed software (PDS) used in I&C systems should have the same level of qualification as software written specifically for the application. See paragraphs 7.67-7.81 for guidance on qualifying any item important to safety, including pre-developed software.

10.99. Commercial-off-the-shelf (COTS) software is PDS that was not developed under a nuclear quality assurance program.

10.100. Because COTS software was not designed and developed under a nuclear quality assurance program, an alternative approach to that described in this chapter may be used to verify that the software is acceptable for use in any application that is important to safety.

10.101. The objective of this alternative approach is to verify that the suitability and correctness of COTS software are equivalent to those of software components developed under an acceptable nuclear quality assurance program, thereby providing reasonable assurance that the commercial grade software will perform the intended safety function or functions.
