Assurance Process for Testing Phase

The assurance role during the testing phase is to ensure that the tests are adequate to verify the requirements. Beyond that, good assurance engineers help the project define tests that exercise the system, or system element, in various situations, including fault and failure conditions. The real world is never perfect, and systems need to be able to respond appropriately to the environment they are operating in.

Use the tailoring table below to determine which activities or analyses are required for a particular assurance classification. Activities that are not required may still be performed, if desired.

The table maps each assurance activity for the testing phase, and the depth to which it is performed, against the Complex Electronics Classification. This allows the assurance activities to be tailored easily to the device complexity and assurance level.

Tailoring Guidance for Assurance Activities - Testing Phase

Test Verification
  Low:      Review/approve procedure. Witness test or review test results.
  Moderate: Review/approve procedure. Witness test and review test results.
  High:     Review/approve procedure. Witness test and review test results. Ensure rigorous testing.

Problem Trend Analysis
  Low:      Not performed.
  Moderate: Review problem reports occasionally.
  High:     Formal trend analysis.

Process Verification
  Low:      Informal.
  Moderate: Moderately formal.
  High:     Formal audits.

FCA/PCA
  Low:      Performed.
  Moderate: Performed.
  High:     Performed.

Risk Analysis
  Low:      Informal.
  Moderate: Informal.
  High:     Formal.

Traceability Analysis
  Low:      Ensure all requirements trace to test, analysis, or inspection.
  Moderate: Ensure all requirements and design elements trace to test, analysis, or inspection.
  High:     Ensure all requirements and design elements trace to test, analysis, or inspection. Perform backward trace.
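
Where a project automates its assurance checklists, this tailoring can be captured as a simple lookup. The Python sketch below is one possible encoding, assuming a project-defined classification of Low, Moderate, or High; the structure and the checklist helper are illustrative, not part of the published process.

    # Hypothetical encoding of the tailoring table above as a lookup so that a
    # per-project checklist can be generated from the assurance classification.
    # The entries paraphrase the table text; nothing here is normative.
    TAILORING = {
        "Test Verification": {
            "Low": "Review/approve procedure; witness test or review results",
            "Moderate": "Review/approve procedure; witness test and review results",
            "High": "Review/approve procedure; witness and review; ensure rigorous testing",
        },
        "Problem Trend Analysis": {
            "Low": "Not performed",
            "Moderate": "Review problem reports occasionally",
            "High": "Formal trend analysis",
        },
        "Process Verification": {"Low": "Informal", "Moderate": "Moderately formal", "High": "Formal audits"},
        "FCA/PCA": {"Low": "Performed", "Moderate": "Performed", "High": "Performed"},
        "Risk Analysis": {"Low": "Informal", "Moderate": "Informal", "High": "Formal"},
        "Traceability Analysis": {
            "Low": "Requirements trace to test, analysis, or inspection",
            "Moderate": "Requirements and design elements trace to test, analysis, or inspection",
            "High": "Requirements and design elements trace; perform backward trace",
        },
    }

    def checklist(classification):
        """List each assurance activity and its depth for one classification."""
        return [f"{activity}: {depth[classification]}" for activity, depth in TAILORING.items()]

    print("\n".join(checklist("Moderate")))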

Test Verification

During testing, the assurance engineer is responsible for reviewing the test procedures (and approving them, in most cases) to ensure that the tests adequately verify the requirements. In addition, the assurance engineer often witnesses the tests, verifying that the success criteria are met for the requirements. Alternatively, the assurance engineer may review the test results to ensure that testing occurred as planned.

Test-related activities for the assurance engineer include:

  • Verify that the testing strategy has been documented in a plan and/or procedure, and that testing occurs according to the plan.
  • Verify that the planned tests will completely verify the requirements in all reasonably expected situations. This includes verifying the functionality and performance in nominal situations and when other parts of the system have errors. How gracefully does the device handle errors it may encounter? How gracefully can it handle any internal faults?
  • Verify that the planned tests will exercise all modules or other divisions in the device. Not every level of testing has to exercise all modules, but each module should be tested at some level (device, circuit board, sub-system, or system). A simple coverage cross-check sketch follows this list.
  • Review the test procedures for feasibility, especially if the complex electronics is integrated with other system elements. Can the tests be performed without risk to the system components? Are test points available to allow access to signals required by the procedure? Is the test operating the device in an operational mode, as close as possible to how it will be operated within the system?
  • Review the test plans and procedures to identify any areas where testing is weak. You are looking for modules that are only minimally tested, requirements that are only verified under some circumstances, and other areas where additional testing may be helpful.
  • Witness tests (as agreed to in the project plans) and document any anomalies and problems.
  • Review the test results to verify that no anomalies went unnoticed. During testing many events occur at once, and an anomaly unrelated to the aspect under test may be missed.
  • Ensure that any embedded software in the complex electronics is verified, and that the interfaces between the software and the complex electronics are correct.
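
Parts of the coverage checks in the list above can be automated against the project's verification matrix. The Python sketch below is illustrative only: the requirement identifiers, module names, and data layout are hypothetical stand-ins for whatever the project's own matrix and test procedures contain.

    # Illustrative cross-check of a verification matrix: flag requirements with no
    # planned verification and modules that are never exercised at any test level.
    # All identifiers below are hypothetical.
    requirements = {
        "CE-REQ-001": ["TP-010"],            # requirement -> verifying tests/analyses
        "CE-REQ-002": ["TP-011", "AN-003"],
        "CE-REQ-003": [],                    # nothing planned -- report to the project
    }
    modules = {
        "uart_rx":  ["TP-010"],              # module -> tests that exercise it
        "watchdog": [],                      # never exercised at any level
    }

    unverified_reqs = [r for r, tests in requirements.items() if not tests]
    unexercised_mods = [m for m, tests in modules.items() if not tests]

    print("Requirements with no test/analysis/inspection:", unverified_reqs)
    print("Modules not exercised by any planned test:", unexercised_mods)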

One area to pay particular attention to is Commercial Off-the-Shelf (COTS) or re-used IP (Intellectual Property) modules or cores. Make sure the tests show that the IP module is accessed correctly and that it performs as specified. Also ensure that testing verifies that the unused functions, if activated within the device, do not cause a safety hazard or other critical problem.
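
A similar cross-check can be applied to COTS or reused IP: every function the design does not use should still have some recorded evidence (a negative test, a tie-off inspection, or similar) that accidental activation is benign. The function names and evidence fields in the sketch below are assumptions, not part of any real core's documentation.

    # Hypothetical check for a reused IP core: flag unused functions that have no
    # evidence showing that accidental activation cannot cause a hazard.
    ip_functions = {
        "spi_master":    {"used": True,  "evidence": "TP-021"},
        "loopback_mode": {"used": False, "evidence": None},       # no evidence -- flag it
        "debug_port":    {"used": False, "evidence": "INSP-007"},
    }

    unverified = [name for name, f in ip_functions.items()
                  if not f["used"] and not f["evidence"]]
    print("Unused IP functions without deactivation evidence:", unverified)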

Problem Trend Analysis

Problem Trend Analysis identifies repetitive problems and assesses how often given problems occur. It also provides a mechanism to track progress of problem resolution. The main objective of this analysis is locating where key problems are occurring and the frequency of occurrence.

Problem Trend Analysis is a system-wide activity rather than one focused solely on complex electronics. As such, it should be performed by the quality assurance or systems engineer to understand where problems are occurring. Regardless of who performs the analysis, a knowledgeable assurance engineer needs to review the problem reports that relate to the complex electronics (and the board or assembly that the chip is part of). Pay particular attention to problems that could indicate design errors in the complex electronics. Also note the number of unexplained anomalies that might relate to the device.
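
At its simplest, the frequency-of-occurrence part of the analysis amounts to counting problem reports by category over time and watching for categories whose counts keep rising. The Python sketch below uses hypothetical report fields and categories; a real project would pull these from its problem reporting system.

    # Minimal trend sketch: count problem reports per category per month and flag
    # categories whose counts are not coming down. The data is hypothetical.
    from collections import defaultdict

    reports = [
        {"id": "PR-101", "month": "2009-08", "category": "CE logic error"},
        {"id": "PR-114", "month": "2009-09", "category": "CE logic error"},
        {"id": "PR-118", "month": "2009-09", "category": "test setup"},
        {"id": "PR-125", "month": "2009-10", "category": "CE logic error"},
    ]

    counts = defaultdict(lambda: defaultdict(int))
    for r in reports:
        counts[r["category"]][r["month"]] += 1

    for category, by_month in counts.items():
        series = [by_month[m] for m in sorted(by_month)]
        trend = "rising or flat" if len(series) > 1 and series[-1] >= series[0] else "declining or single point"
        print(f"{category}: {dict(sorted(by_month.items()))} -> {trend}")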

More detail on Problem Trend Analysis can be found in Section 8.2 of NASA Reference Publication 1358, System Engineering "Toolbox" for Design-Oriented Engineers.

Process Verification

Process assurance activities for this phase include:

  • Verify that the defined processes are in place and are being followed correctly.
  • Verify configuration management is functioning properly to control revisions to the design that may occur during testing activities.
  • Verify that problems and anomalies are being recorded in the project problem reporting/corrective action system, and that the problem resolutions are correct, approved, and properly implemented (a simple log cross-check sketch follows this list).
  • Perform an appropriate level of impact analysis for any changes to the design, considering the testing that has already occurred and the possibility of affecting other parts of the system.
  • Ensure that all safety verifications are performed. Keep the system safety engineer apprised of any design changes that may affect safety.
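
The problem reporting check above can be supported by a simple scan of the corrective action log: every report should have an approved resolution, and any report that changed the design should point at a controlled revision. The field names and entries below are hypothetical.

    # Hypothetical scan of the problem reporting / corrective action log.
    problem_log = [
        {"id": "PR-101", "resolution_approved": True,  "design_change": "ECO-12", "cm_revision": "B"},
        {"id": "PR-114", "resolution_approved": False, "design_change": None,     "cm_revision": None},
        {"id": "PR-125", "resolution_approved": True,  "design_change": "ECO-15", "cm_revision": None},
    ]

    for pr in problem_log:
        if not pr["resolution_approved"]:
            print(f'{pr["id"]}: resolution not yet approved')
        if pr["design_change"] and not pr["cm_revision"]:
            print(f'{pr["id"]}: design change {pr["design_change"]} is not under a controlled revision')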

Functional and Physical Configuration Audits (FCA/PCA)

The FCA is the formal examination of the "as-tested" functional characteristics of a configuration item (CI). The audit verifies that the item meets the requirements specified in its functional baseline documentation. Any discrepancies are identified and recorded. Functional configuration audits also assure that the technical documentation accurately reflects the functional characteristics of the device. To perform an FCA, the test procedures and test results used to perform testing are examined against the specifications.

The PCA is the formal examination of the "as-built" configuration of a configuration item (hardware and software) against its technical documentation. The PCA normally includes a detailed audit of engineering drawings, specifications, and technical data (including COTS documentation). For complex electronics, the design documentation, such as the HDL code, should be audited as well. The PCA for a complex electronics device is not performed until the FCA has been completed.
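
The core of the PCA comparison can be pictured as checking the as-built item list against the documented baseline, item by item. The Python sketch below is only an illustration; the item names, revisions, and checksum are hypothetical, and a real audit covers drawings, specifications, HDL sources, and programming files under configuration control.

    # Illustrative PCA-style comparison of documented baseline vs. as-built items.
    baseline = {"fpga_top.vhd": "rev C", "constraints.sdc": "rev B", "bitstream.bit": "sha256:ab12"}
    as_built = {"fpga_top.vhd": "rev C", "constraints.sdc": "rev C", "bitstream.bit": "sha256:ab12"}

    for item in sorted(set(baseline) | set(as_built)):
        documented, built = baseline.get(item), as_built.get(item)
        if documented != built:
            print(f"{item}: documentation says {documented!r}, as-built is {built!r} -- record the discrepancy")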

Update Analyses

Analyses previously performed should be updated at this time.

Risk Analysis

Evaluate previous risks to identify those that no longer apply or that have changed their priority based on changes in probability or impact. Identify any new risks relevant to this phase of development and determine which require mitigation plans. Check that preventive measures and/or contingency plans exist for all identified risk items and that the risk, with mitigations in place, is acceptable for completing the Testing phase and going operational.
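
One simple way to support this re-evaluation is to re-score each risk as probability times impact and flag any high-scoring risk that still lacks a mitigation or contingency plan. The 1-to-5 scales, the threshold, and the risk entries below are hypothetical; use the project's own risk scoring scheme.

    # Re-score risks and flag those that still need a mitigation or contingency
    # plan before the Testing phase can be closed out. All values are hypothetical.
    risks = [
        {"id": "R-03", "probability": 2, "impact": 5, "mitigation": "fallback image"},
        {"id": "R-07", "probability": 4, "impact": 4, "mitigation": None},
        {"id": "R-09", "probability": 1, "impact": 2, "mitigation": None},   # low enough to accept
    ]
    THRESHOLD = 12   # project-defined acceptability threshold (assumed)

    for r in risks:
        score = r["probability"] * r["impact"]
        if score >= THRESHOLD and not r["mitigation"]:
            print(f'{r["id"]}: score {score} is at or above threshold with no mitigation plan')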

Traceability Analysis

Complete the requirement tracing into the test procedures where the requirements are verified. For high assurance devices, trace backward from the test procedures to ensure that testing is focused on the requirements (what the system has to do) and the design (how it will do those activities). Coverage analysis may be performed to ensure that all elements of the design are exercised in at least one test. Any requirements that are not verified in a test (or by analysis or inspection) should be reported to the project, to ensure that the requirements are included in future tests.
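
Both directions of the trace can be checked mechanically from the traceability matrix: forward to find requirements with no verifying test, analysis, or inspection, and backward to find tests that do not trace to any requirement. The identifiers in the sketch below are hypothetical.

    # Forward and backward trace check against a (hypothetical) traceability matrix.
    req_to_test = {"CE-REQ-001": {"TP-010"}, "CE-REQ-002": {"TP-011"}, "CE-REQ-003": set()}
    test_to_req = {"TP-010": {"CE-REQ-001"}, "TP-011": {"CE-REQ-002"}, "TP-012": set()}

    # Forward: requirements never verified by a test (or analysis/inspection).
    print("Unverified requirements:", [r for r, t in req_to_test.items() if not t])
    # Backward: tests that do not trace to any requirement -- effort that is not
    # focused on what the system has to do.
    print("Tests with no requirement:", [t for t, r in test_to_req.items() if not r])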

Other Analyses

The other analyses (FMEA, FTA, and interface analysis) do not require updates during this phase unless there is a design change.

