ISO 9001:2015 Clause 8.3.4 Design and development controls

The organization shall apply controls to the design and development process to ensure that:
a) the results to be achieved are defined;
b) reviews are conducted to evaluate the ability of the results of design and development to meet requirements;
c) verification activities are conducted to ensure that the design and development outputs meet the input requirements;
d) validation activities are conducted to ensure that the resulting products and services meet the requirements for the specified application or intended use;
e) any necessary actions are taken on problems determined during the reviews, or verification and validation activities;
f) documented information of these activities is retained.
NOTE Design and development reviews, verification and validation have distinct purposes. They can be conducted separately or in any combination, as is suitable for the products and services of the organization.

1) Design and Development Controls

Once all design inputs are finalised, the next step is to apply adequate controls so that the outputs of the design and development process are clearly defined and meet the customer's requirements. Controls can take the form of reviews, verification and validation of design and development activities. While reviews, verification and validation serve distinct purposes, they can be conducted separately or in any combination, as suits the products and services of the organisation. The controls applied should cover:

  1. Defined outcomes, such as specifications, design intent, functional and performance requirements, and customer/end-user expectations;
  2. A design review process with functional representation (e.g. from the customer, engineering, production, quality and project management), design review gates (e.g. preliminary design review, detail design review, critical design review), commercial/technical considerations, and authorised progression to the next stage;
  3. Verification activities such as modelling, simulations, alternative calculations, comparison with other proven designs, experiments, tests, and specialist technical reviews;
  4. Validation activities such as functional testing, performance testing, trials, prototypes, demonstrations, and simulations;
  5. Management of actions arising from design reviews, verification or validation activities, e.g. action registers, ownership, timescales, escalation, changes to risk profile.

Design controls, which may include design validation, design verification, assurance gate reviews, design review, design checking, safety risk management, design risk management, Design Failure Mode Effects Analysis, value engineering and CAD management, are an interrelated set of practices and procedures focused on managing the design of a product or service. They are intended to reduce uncertainty until a detailed, solidified and approved design is reached. As a system of checks and balances, design control activities make a systematic assessment of the design an integral part of product development. As a result, deficiencies in design input requirements, and discrepancies between the proposed designs and those requirements, are made evident and corrected. The essential quality aspects and the regulatory requirements, such as the safety, performance and dependability of a product (whether hardware, software, services, or processed materials), are established during the design and development phase. The controls referred to in the sections below can be incorporated into your design and management process. Although various design controls are described, they are included for illustrative purposes only, as there may be alternative ways that are better suited to a particular manufacturer or design activity. The use of these techniques should be proportional to the nature of the risks of the product or service.

  1. CAD management: Drawings should be prepared using computer-aided design (CAD) software and should be undertaken in accordance with best-practice methods. The production of digital models and drawings must be managed using document control and approval software which has the facility for secure electronic signatures. The process records the individuals who sign off each stage of the workflow and allows the design to proceed to the next stage of the process. The system controls who is allowed to authorise each stage; for example, preparers, checkers and approvers will be restricted to people who, under the designer's competence management system, are competent to carry out that stage of the process. As required, a workflow will be agreed with the Design Team and used to manage and record all stages of the CAD production process. CAD deliverables are monitored to ensure all stages of the process are recorded and auditable.
  2. Value engineering: The primary aim of engineering design is to produce safe, economic and compliant designs that deliver the Lowest Total Cost (LTC). Although engineering design costs are monitored and incentivised to be held to a minimum, value engineering will be focused on maximising the opportunities to reduce the LTC. Value Engineering (VE) will be conducted throughout the life cycle of the design project, but it is recognised that early VE initiatives usually yield the greatest cost benefits.
  3. Design Failure Mode Effects Analysis (DFMEA): DFMEA is an analysis technique which facilitates the identification of potential problems in the design by examining the effects of lower-level failures, while providing an objective evaluation of design requirements and design alternatives. Starting early in the design process, the Engineering Manager is usually responsible for completing the DFMEA before preliminary drawings are produced, and before any tooling requirements are specified, in order to:
    • Analyze hardware, functions, and products before they are released to production;
    • Identify potential failure modes of products (system, subsystem, and component levels) caused by design deficiencies;
    • Provide an initial design for manufacturing and assembly requirements;
    • Increase the probability that potential failure modes and their effects have been considered in the design and development process;
    • Provide additional information to help plan thorough and efficient test programs;
    • Develop a list of potential failure modes ranked according to their effect on the customer;
    • Establish a priority system for design improvements;
    • Provide an open issue format for recommending and tracking risk reducing actions;
    • Provide future reference to aid in analyzing field concerns;
    • Report risk analysis and DFMEA results at Design Reviews.
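By way of illustration, conventional DFMEA practice scores each failure mode for Severity, Occurrence and Detection, typically on 1–10 scales, and ranks failure modes by Risk Priority Number (RPN = Severity × Occurrence × Detection). The sketch below shows that ranking step only; the failure modes and scores are hypothetical, and the rating scales and action thresholds should come from your own FMEA procedure.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    """One DFMEA line item: a potential design failure and its risk scores."""
    item: str        # component or function under analysis
    mode: str        # how it could fail
    severity: int    # effect on the customer, 1 (none) to 10 (hazardous)
    occurrence: int  # likelihood of the cause, 1 (remote) to 10 (very high)
    detection: int   # chance current controls miss it, 1 (certain detection) to 10 (none)

    @property
    def rpn(self) -> int:
        """Risk Priority Number used to rank design improvement actions."""
        return self.severity * self.occurrence * self.detection

# Hypothetical worksheet entries, for illustration only.
worksheet = [
    FailureMode("Housing seal", "Ingress of moisture", severity=7, occurrence=4, detection=5),
    FailureMode("Fastener", "Loosens under vibration", severity=8, occurrence=3, detection=3),
    FailureMode("Connector", "Intermittent contact", severity=6, occurrence=5, detection=6),
]

# Rank by RPN so the highest-risk failure modes drive the action list.
for fm in sorted(worksheet, key=lambda f: f.rpn, reverse=True):
    print(f"RPN {fm.rpn:4d}  {fm.item}: {fm.mode}")
```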
  4. Design risk management: Design risk management begins with the development of the design input requirements. As the design evolves, new risks may become evident. To systematically identify and, when necessary, reduce these risks, the risk management process is integrated into the design process. In this way, unacceptable risks can be identified and managed earlier in the design process, when changes are easier to make and less costly. The Design Manager should be responsible for implementing regular design risk reviews and for capturing their output, in order to ensure that all functional requirements are included and evaluated. The Design Manager should ensure that the design risk analysis reflects the latest configuration of the design solution and that it is continually managed and updated with each design modification. All identified risks should be summarized for risk mitigation, communication and knowledge sharing. Elements of a risk assessment include, but are not limited to, the following:
    • Quality performance (past and current);
    • Required approvals;
    • Assumptions;
    • Requirements;
    • Geographical/political/ethical;
    • Financial;
    • Customer satisfaction;
    • Human resources;
    • Improvement activities;
    • Delivery;
    • Manufacturing capability and capacity;
    • Supplier make/buy decisions and supplier control;
    • Design capability and capacity;
    • Special processes;
    • Design complexity;
    • Manufacturing complexity.
  5. Safety risk management: The legal obligation to produce designs that are safe to manufacture, operate and maintain is embedded into the design processes. The design process should contain appropriate checks and reviews to ensure that the Design Team discharge their responsibilities and produce deliverables that comply with the relevant design standards. Safety in design is provided through controlling the level of individual technical competence; defining the processes that establish the framework for the elimination of hazards and the mitigation of risks within the design and at interfaces; and ensuring that the design satisfies the project requirements. The Design Team is required to eliminate hazards where possible and to reduce construction, operation and maintenance risks in the final design. It is recognised that the risk profile changes as the design proceeds, but the overriding obligation is to reduce the risks to an acceptable minimum. The Design Team are required to:
    • Carry out Designer’s Risk Assessment;
    • Reduce safety risks to a level that is tolerable and ALARP (as low as reasonably practicable) for all parties;
    • Reduce the commercial impact of risk to acceptable levels whilst remaining within the Law; and
    • Know what the risks are at any point in time.
  6. Tools and techniques: Appropriate tools and techniques are used by competent personnel and are applied to meet the needs of the unique product or process being designed. The Engineering Manager is responsible for providing a design which is producible, verifiable, and controllable under the specified production, installation, and operational conditions. Project management tools and methodologies are used to manage the development process in order to deliver timely, profitable solutions. All software that is used in calculations and other design and development activities should be validated, verified and approved. Software developed in-house is validated and approved prior to release. Software documentation includes validation specifications approved by the Engineering Manager and validation records attesting to acceptable performance. Standard and/or commercial CAD and calculation modelling software can be accepted without validation. Software that has previously been used successfully in design and development and has demonstrated acceptable performance for at least one year may also be used without validation testing. All spreadsheets should be validated by manual calculation or alternative analysis methods, and records of the process are provided as part of the design submission. The name of the spreadsheet, its unique identification, its location, and the person responsible for it are documented. The records should also cover initial verification, periodic re-verification, and other issues such as updates or any problems encountered. Verification is completed after installation and recorded. When setting up a new Excel spreadsheet for calculations, the following good practices reduce the risk of accidental modification of the template and erroneous data input:
    • All calculating cells shall be locked (Format Cells > Protection > Locked) in order to protect cells containing calculations against unintended modification, except those used for data input;
    • Data validation rules (Data tab > Data Validation) can be applied to data input cells to prevent the introduction of aberrant values;
    • Input messages and Error alert messages should be used to inform the end user of the expected data type and acceptable range;
    • Cells used for presenting the results of the calculations (output) can be identified by a specific colour. When the results are tested against acceptance criteria, it is recommended to use conditional formatting (Home tab > Conditional Formatting) to highlight out-of-specification results;
    • The name of the operator responsible for data entry, and the date and time of data entry should be recorded in dedicated input cells or the spreadsheet is printed, signed and dated after calculation;
    • Password protection is recommended for all cells containing calculations (Review tab > Protect Sheet), with only the default options checked;
    • The same password should be used for all sheets and can be documented in the validation file;
    • After protecting each sheet, the workbook structure should also be password protected (Review tab > Protect Workbook). The same password can be used as the one for sheet protection.
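As a concrete illustration of the practices above, the sketch below builds a minimally protected calculation template with the openpyxl library. The file name, password, cell layout and validation range are arbitrary examples; the same settings can of course be applied directly in Excel via the menu paths listed above.

```python
from openpyxl import Workbook
from openpyxl.styles import Protection
from openpyxl.worksheet.datavalidation import DataValidation
from openpyxl.workbook.protection import WorkbookProtection

wb = Workbook()
ws = wb.active
ws.title = "Calculation"

ws["A1"] = "Measured value (mm)"
ws["B1"] = 0.0          # data input cell
ws["A2"] = "Result (x 2)"
ws["B2"] = "=B1*2"      # calculating cell, to remain locked

# Unlock only the data input cell; all other cells stay locked by default.
ws["B1"].protection = Protection(locked=False)

# Restrict input to an acceptable numeric range, with user-facing messages.
dv = DataValidation(type="decimal", operator="between", formula1="0", formula2="100")
dv.prompt = "Enter a value between 0 and 100 mm."
dv.error = "Value out of the acceptable range."
dv.add(ws["B1"])
ws.add_data_validation(dv)

# Protect the sheet so locked (calculating) cells cannot be modified.
ws.protection.sheet = True
ws.protection.password = "example-password"  # password documented in the validation file

# Protect the workbook structure using the same password.
wb.security = WorkbookProtection(workbookPassword="example-password", lockStructure=True)

wb.save("validated_template.xlsx")
```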
  7. Design checking: All design and development output documentation must be reviewed and checked by competent and skilled personnel, and approved by the Engineering Manager prior to release. To provide technical assurance, all designs follow the ‘Prepare’, ‘Check’ and ‘Approve’ process, evidenced by the signatures of competent individuals. The Design Manager should arrange for a design category check to be carried out that is proportionate to the level of risk. The design category checks include a review of the design concepts and assessments in order to critically consider whether the base parameters are valid. The Checkers are required to undertake a review of the Conceptual Design Statement (CDS) to confirm that the approach is reasonable. The Checker should also consider the safety, practicability and proper functioning of the proposed design. For Category II (2) and III (3) checks, an independent set of design calculations must be prepared. The check also includes an independent technical assessment to determine and confirm design parameters. The levels of checking are as follows:
    • Category I (1) – Designs may be checked in the same group as that which prepared the design but by a person other than the designer;
    • Category II (2) – Designs may be checked in the Designer’s office by a separate group, which has not been involved in the original design, or by an approved outside organization;
    • Category III (3) – Designs will be checked by an independent engineering organization. A Category III check is applicable for complex or unusual designs.
  8. Design reviews: Design reviews should be carried out after the initial concept stage, again after the detailed design stage and, finally, before the design is released. The design review function is carried out at various stages of the design process in order to check that the design solution is in accordance with the original design inputs and objectives, and includes identification of concerns, issues and potential problems with the design. Design review meetings should be held at pre-defined points during the development process, with additional reviews held on an as-needed basis depending upon the complexity of the design. Participants in design review meetings should be competent to evaluate the design stage and discipline under review, permitting them to examine the design and its implications.
  9. Single-consultant Design Review (SDR): The Single-consultant Design Review (SDR) is a presentation of the design to relevant stakeholders. These reviews are carried out by the Design Manager when the design has progressed to 20%, 60% and 100%. The purpose of the review is to present evidence at each of these stages to confirm that the design is compliant with the standards and requirements defined in the Conceptual Design Statement. The reviewers are responsible for raising any comments, while the Design Manager should be responsible for capturing comments in the Design Review Meeting Minutes, referencing the document upon which they are commenting along with the reviewer’s name. If a reviewer cannot attend a session, it is their responsibility to ensure adequate cover or to issue their comments to the Design Manager for inclusion. The minutes of SDR meetings are recorded and include a detailed listing of all the documents that have provided the basis of the review. Issues raised may be addressed in the following design stage. Any outstanding issues are recorded in the Design Issues Log (or similar), presented at the Assurance Gate Review meeting as issues for the next design stage, and subsequently confirmed as being closed out at the subsequent Gate.
  10. Inter-consultant Design Review (IDR): The Inter-consultant Design Review (IDR) is a presentation of the design of a work package or packages to interfacing Design Teams. These reviews are carried out by the Design Manager when the design has progressed to 20%, 60% and 100%. Their primary purpose is to seek evidence that all interfaces have been agreed and that the design integrates to deliver the requirements. At each IDR, an Inter-consultant Design Review Certificate is produced to evidence that all interfacing Design Teams are satisfied with the design under consideration. It should be signed by accepted representatives of the interfacing Design Teams and contain a list of any actions required to close out any exceptions raised but not deemed a bar to acceptance. The reviewers are responsible for issuing any comments in writing using the Design Review Meeting Minutes, referencing the document upon which they are commenting along with their name. If a reviewer cannot attend a session, it is their responsibility to ensure adequate cover. The minutes of IDR meetings are recorded and include a detailed listing of all the documents that have provided the basis of the review. Issues raised may be addressed in the following design stage. Any outstanding issues are recorded in the Design Issues Log (or similar), presented at the Assurance Gate Review Meeting as issues for the next design stage, and subsequently confirmed as being closed out at the subsequent Gate. Further design reviews may be required when the Engineering Manager has identified a significant design change that requires a review to re-validate the design.
  11. Design verification: Design verification is confirmation, by examination and the provision of objective evidence, that the specified input requirements have been fulfilled. Any approach which establishes conformance with a design input requirement is an acceptable means of verifying the design with respect to that requirement. Complex designs require more, and different types of, verification activities, and the nature of verification activities varies according to the type of design output. Design verification is carried out to check that the outputs from each design phase meet the stated requirements for that phase. Requirements traceability verification is undertaken to ensure that the design fulfils the design concept while expressing the necessary functional and technical requirements; this is verified throughout the Assurance Gate Reviews. In most cases, verification activities are completed prior to each design review, and the verification results are submitted to the reviewers along with the other design deliverables to be reviewed. The results of the design verification, including identification of the design, the method(s), the date, and the individual(s) performing the verification, shall be documented and retained.
  12. Design validation: Design validation is similar to verification, except that this time you should check the designed product under conditions of actual use. If you are designing dune buggies, you might take your creation for a spin on the beach; if you are making beverages, you might conduct a consumer taste test. Verification is a documentary review, while validation is a real-world test. Perform design and development validation by ensuring the product meets the specified requirements, and maintain records of validation activities and approvals. Design validation follows successful verification and ensures, by examination and the provision of objective evidence, that each requirement for a particular use is fulfilled. The performance characteristics that are to be assessed are identified, and validation methods and acceptance criteria are established. At the commencement of the design project, the requirements received from the previous design phase form the initial baseline. During design reviews, the requirements are considered to ensure that the right requirements and any assumptions have been captured, to identify missing requirements, and to ensure that the design intent will meet those requirements. The results of the design validation, including identification of the design, the method(s), the date, and the individual(s) performing the validation, should be documented and retained. The organization shall have records showing that the product designed will meet defined user needs prior to delivery of the product to the customer, as appropriate. Methods of validation could include simulation techniques, prototype build and evaluation, comparison to similar proven designs, beta testing, field evaluations, etc. Irrespective of the methods used, the validation activity should be planned and executed, with records maintained as defined in the planning activity. Retain documented information to demonstrate that any test plans and test procedures have been observed, that their criteria have been met, and that the design meets the specified requirements for all identified operational conditions, e.g. reports, calculations, test results, data, and reviews.
  13. Assurance reviews: The Design Manager should ensure that design reviews are carried out in accordance with the Design Management Plan when the design has progressed to 20%, 60% and 100%. A cross-functional, multidisciplinary team (including at least one individual who does not have direct responsibility for the design stage under review) undertakes a documented, comprehensive, systematic examination of the design to evaluate its adequacy, to determine the capability of the design to meet the requirements, and to identify problems, whilst ensuring that:
  1. The input for the Design Reviews is captured from all stakeholders;
  2. All open actions from previous Design Reviews are tracked through to closure;
  3. All areas of concern are highlighted for further discussion and risk mitigation;
  4. All design reviews are documented and shared with stakeholders in a timely manner.

The following elements are considered during design reviews:

  1. Customer needs and expectations versus technical specifications;
  2. Ability to perform under expected conditions of use and environment;
  3. Safety and potential liability during unintended use and misuse;
  4. Safety and environmental considerations;
  5. Compliance with applicable regulatory requirements, national, and international standards;
  6. Comparison with similar designs for analysis of previous quality problems and possible recurrence;
  7. Reliability, serviceability, and maintainability;
  8. Product acceptance/rejection criteria, aesthetic specifications and associated acceptance criteria;
  9. Ease of assembly, installation, and safety factors;
  10. Packaging, handling, storage, shelf life, and disposability;
  11. Failure modes and effects analysis;
  12. Ability to diagnose and correct problems;
  13. Identification, warnings, labelling, traceability, and user instructions;
  14. Manufacturability, including special processes;
  15. Capability to inspect and test;
  16. Materials and components specifications;
  17. Review and use of standard parts.

The reviewers are responsible for raising any comments, while the Design Manager should be responsible for capturing comments in the Design Review Meeting Minutes. Conclusions drawn during design reviews are considered and implemented as appropriate. Not all identified concerns result in corrective action; the Engineering Manager should decide whether the issue is relevant, or whether it is erroneous or immaterial. In most cases, however, resolution involves a design change, a change in requirements, or a combination of the two. Records of design review meetings are retained and identify those present at the meeting and the decisions reached.

14. Assurance gate reviews: Assurance Gate Reviews 1 to 3 are the primary control mechanism for providing progressive assurance: evidence is reviewed at defined stages to confirm that the designs produced meet the design project’s objectives, requirements and obligations, and that the risks associated with the engineering are identified and fully understood.

  • Gate 1 – Initial concept (20% complete). The details will be in outline only but will define the character, limits and form of manufacture, fabrication or construction.
  • Gate 2 – Functional design (60% complete). At this stage the design has progressed to an intermediate position; this Gate is a checkpoint at about the mid-point between Gate 1 and the final design. At the outset of a design project, the target deliverables at Gate 2 are clearly defined so that it provides an interim waypoint to confirm progress.
  • Gate 3 – Detailed design ready for manufacture, fabrication or construction (100% complete). At this stage the design is complete and ready to be issued for manufacture, fabrication, or construction. Design details are finalised and fully integrated with other interfaces.

The purpose of the Assurance Gate Review process is to provide progressive assurance during the design stage that the objectives of the design intent can be achieved and that the design can progress successfully to the next stage. The next stage of the design process can only proceed when the Assurance Gate Review is successfully passed. If the evidence submitted at the Assurance Gate Review demonstrates that the design meets the objectives, it will be approved. If the Gate Review Panel decides that the submitted deliverables fall short of the requirements, the design will not pass through the Assurance Gate Review and is therefore prevented from proceeding to the next stage. The Gate Review Panel, also known as the ‘Approval Authority’, has the responsibility to make the appropriate decision at each Assurance Gate Review. The Gate Review Panel is a multi-discipline committee formed of members from various departments and stakeholders throughout the organization. The Gate Review Panel members should be selected based on perceived risks, applicable regulatory or legal requirements, technical complexity, financial repercussions and criticality of the product. Department representation should include Quality, Manufacturing, Engineering, Sales, Planning, Purchasing, Business Development, Contracts, Legal, or others as deemed necessary. Formal, documented design and development Assurance Gate Reviews should be held at appropriate stages of the design and development cycle and include representatives from all concerned functions and stakeholders. Each Assurance Gate Review focuses on assessing whether the design deliverables meet all the objectives and appropriate criteria. The minimum approval criteria used for determining whether the design meets the intent are set out below; in addition to these minimum requirements, the Engineering Manager may specify further criteria at the outset of each design stage. The Gate Review Panel is responsible for managing the Gate Review process, thereby ensuring that:

  • The design progress and design status show that the design has reached a stage of development appropriate to the Gate being assessed;
  • Cost and programme issues have been agreed and align with budget constraints;
  • The assurance evidence presented to the panel is sufficient to support the Gate requirements;
  • The risks are either designed out, have appropriate mitigation, or have been clearly identified and agreed as acceptable to carry into the next stage;
  • All the necessary deliverables and other legal obligations have been identified and complied with, and the design is compliant with any undertakings and assurances given;
  • At the conclusion of the review, the Gate Review Panel and the Gate Chairperson shall confer, taking full account of the views of the other Panel Members, and decide whether or not the design submission and presentation meet the Assurance Gate Review objectives and, consequently, whether the design can be given a pass or is prevented from passing the Gate;
  • If the Gate Chairperson decides that missing deliverables or evidence do not impact the ability of the project to proceed, then a conditional pass may be given, subject to the remaining deliverables being completed within a specified time;
  • The conditions and timescales are conveyed to the Design Manager at the Review;
  • Where conditions are raised that carry potentially significant risk, consideration shall be given to the inclusion of those conditions;
  • The Gate Review Panel’s findings and decisions are recorded, together with any supporting data.

The Design Review Meeting Minutes should capture the results of the Gate Review Panel’s review. They serve as a record of the review and summarise its findings. The key aspects of the record are the evidence presented to satisfy the approval criteria and the use of that evidence to support the decision regarding pass or resubmission. It is the Design Manager’s responsibility to assemble and present to the Gate Review Panel sufficient evidence (see table of deliverables below) when the design has progressed to 20%, 60% and 100%, to enable the Gate Review Panel to discharge their duties. Key design deliverables that are associated with the Assurance Gate Review are provided to the Gate Review Panel at least 5 working days prior to the scheduled review date.

2) The organization shall apply controls to the design and development process

Organizations can apply various controls to the design and development process within a Quality Management System (QMS) to ensure that products, services, or processes meet quality standards and customer requirements. These controls are crucial for maintaining consistency, minimizing errors, and achieving the desired outcomes. Here are some key controls that organizations can implement in the design and development process within their QMS:

  1. Design and Development Planning: Establish a comprehensive plan that outlines the objectives, scope, schedule, and resources required for the design and development process. This plan should consider risk management and quality objectives.
  2. Requirements Management: Effectively capture, document, and manage requirements from customers, stakeholders, and relevant regulations or standards. Ensure that changes to requirements are controlled and well-documented.
  3. Risk Management: Identify, assess, and mitigate risks associated with the design and development process. Develop risk mitigation plans to address potential issues and uncertainties.
  4. Change Control: Implement a formal change control process to evaluate, approve, and document changes to project scope, requirements, or design elements. This helps prevent scope creep and ensures changes are properly managed.
  5. Design and Development Reviews: Conduct regular reviews throughout the design and development process to assess progress, verify compliance with requirements, and identify and address issues or deviations.
  6. Design Verification: Ensure that the design meets specified criteria and is in accordance with established standards or regulations through verification activities, such as testing, simulations, or inspections.
  7. Design Validation: Validate the final design to ensure it meets the actual user needs and intended application. This may involve user testing or field trials.
  8. Configuration Management: Implement configuration control to manage and track changes to design documents, specifications, and related information. Ensure that the correct version of design documentation is used.
  9. Document Control: Maintain robust document control processes to manage design and development documents, including version control, approval processes, and access restrictions.
  10. Testing and Quality Assurance: Develop and execute comprehensive testing plans to identify and correct defects, ensuring that the final product or service meets quality standards.
  11. Prototyping and Modeling: Use prototypes and models to validate design concepts and assess feasibility before committing to full-scale development.
  12. Supplier and Vendor Controls: If external suppliers or vendors are involved in the design and development process, establish controls to monitor their performance, quality, and compliance with contractual requirements.
  13. Performance Metrics and Monitoring: Define key performance indicators (KPIs) and implement monitoring systems to track progress, measure performance, and identify areas for improvement.
  14. Training and Competence Development: Ensure that team members involved in the design and development process have the necessary skills, training, and competence to perform their roles effectively.
  15. Design History File (DHF) or Technical File: Maintain a comprehensive record of all design and development activities, including design inputs, outputs, verification, validation, and change control.
  16. Regulatory Compliance: Ensure that the design and development process complies with relevant regulatory requirements and standards specific to the industry or product.
  17. Customer Feedback and Involvement: Seek customer feedback throughout the design and development process and involve customers or end-users in user acceptance testing and validation.
  18. Post-Market Surveillance: Establish procedures for post-market surveillance and monitoring of products or services after they have been released to identify and address any issues or opportunities for improvement.

These controls should be tailored to the organization’s specific needs, industry, and the complexity of the design and development process. Implementing these controls within a QMS helps ensure that the organization consistently delivers high-quality products, services, or processes that meet customer expectations and regulatory requirements.

3) The results to be achieved are defined

Ensuring that the results to be achieved are well-defined is a fundamental aspect of controlling the design and development process within a Quality Management System (QMS). This control is critical for clarity, alignment with objectives, and the ultimate success of the project. Here’s how an organization can apply controls to achieve this:

  1. Clearly Define Objectives and Requirements: Begin by establishing clear, measurable objectives for the design and development project. These objectives should be aligned with the organization’s strategic goals and customer requirements. It’s essential to involve relevant stakeholders in defining these objectives to ensure their buy-in and alignment.
  2. Document Requirements: Thoroughly document all requirements, including customer requirements, regulatory requirements, and internal requirements. Use techniques such as a Requirements Traceability Matrix to ensure that each requirement is linked to specific project objectives.
  3. Scope Definition: Clearly define the scope of the project, outlining what is included and what is not included. This prevents scope creep and helps manage expectations throughout the project.
  4. Project Charter: Create a project charter that outlines the purpose, goals, objectives, stakeholders, and constraints of the project. This document should be shared and agreed upon by all relevant parties.
  5. Risk Assessment: Conduct a risk assessment to identify potential risks and uncertainties that could impact the achievement of project objectives. Develop risk mitigation plans to address these risks.
  6. Design and Development Plan: Develop a comprehensive design and development plan that includes project milestones, timelines, resource allocation, and responsibilities. This plan should clearly outline how the project will achieve its defined results.
  7. Design Inputs and Outputs: Document design inputs and outputs. Inputs are the information and requirements that go into the design process, while outputs are the results of the design process. Ensure that there is traceability between inputs, design activities, and outputs.
  8. Validation and Verification: Implement verification and validation processes to confirm that the design and development results meet the defined objectives and requirements. Verification ensures that the design conforms to specifications, while validation ensures that the design meets the user’s needs and intended use.
  9. Document Control: Establish document control processes to ensure that all documentation related to the design and development process, including design specifications, plans, and changes, are well-documented and properly managed.
  10. Change Control: Implement a change control process to manage changes to project objectives, requirements, or scope. Changes should be assessed for their impact on the defined results and approved or rejected based on this assessment.
  11. Communication and Reporting: Establish clear communication channels and reporting mechanisms to keep all stakeholders informed of project progress, issues, and changes. Regularly review and update project status against the defined results.
  12. Review Meetings: Hold regular design and development review meetings to assess progress, identify deviations from the defined results, and take corrective actions as needed.
  13. Customer and Stakeholder Involvement: Involve customers and relevant stakeholders in the design and development process to ensure their requirements are understood and incorporated into the final results.
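As an illustration of the Requirements Traceability Matrix mentioned in item 2 above, the sketch below shows one minimal way to link each requirement to the design outputs and verification activities that address it, and to flag requirements left untraced. All requirement IDs and document names are hypothetical.

```python
# Minimal requirements traceability matrix: requirement -> design outputs
# and verification activities. IDs and names are hypothetical examples.
rtm = {
    "REQ-001 Operating temperature -20..60 C": {
        "outputs": ["DWG-104 enclosure", "SPEC-021 thermal design"],
        "verification": ["TEST-07 thermal cycling"],
    },
    "REQ-002 IP65 ingress protection": {
        "outputs": ["DWG-104 enclosure"],
        "verification": [],          # gap: no verification planned yet
    },
    "REQ-003 User interface language EN/FR": {
        "outputs": [],               # gap: no design output addresses this
        "verification": [],
    },
}

# Report untraced requirements so gaps are visible before design review.
for req, links in rtm.items():
    missing = [k for k in ("outputs", "verification") if not links[k]]
    if missing:
        print(f"GAP: {req} has no linked {' or '.join(missing)}")
```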

By applying these controls, an organization can ensure that the results to be achieved are well-defined, monitored throughout the design and development process, and aligned with the organization’s goals and customer expectations. This helps minimize ambiguity, reduces the risk of project failure, and promotes successful project outcomes.

4) Reviews are conducted to evaluate the ability of the results of design and development to meet requirements

Conducting reviews of the design and development process is a critical step in ensuring that the results align with the requirements and objectives. These reviews help evaluate the ability of the design and development outcomes to meet the specified requirements and ensure that the final product, service, or process is of high quality and meets customer expectations. Here’s how an organization can conduct these reviews effectively:

  1. Establish Review Milestones: Define specific review milestones throughout the design and development process. These milestones should align with key stages of the project, such as initial concept design, detailed design, prototype development, and finalization.
  2. Assemble Review Teams: Form cross-functional teams with members who have relevant expertise to conduct the reviews. This may include individuals from design, engineering, quality assurance, and other relevant departments.
  3. Review Criteria: Clearly define review criteria and objectives for each review milestone. These criteria should be based on project requirements, quality standards, and customer expectations.
  4. Documented Information: Ensure that all relevant documentation, including design specifications, plans, test results, and change records, is readily available for review.
  5. Review Meetings: Conduct formal review meetings or sessions where the review teams assess the design and development work against the established criteria. These meetings should be well-documented and include participation from key stakeholders.
  6. Identify Deviations: If deviations from requirements or objectives are identified during the review, ensure that they are documented and classified based on their severity and potential impact. Minor deviations may require corrective actions, while major issues may necessitate a reevaluation of the design or development approach.
  7. Corrective Actions: Implement corrective actions to address identified deviations or issues promptly. Document these actions, assign responsibilities, and establish timelines for resolution.
  8. Traceability: Ensure that there is traceability between the identified issues, corrective actions, and the design and development documentation. This traceability helps verify that corrective actions were effective and that the design aligns with requirements.
  9. Verification and Validation: In addition to design conformity, verify and validate that the design and development results meet user needs and intended use. This may involve user acceptance testing and validation activities.
  10. Management Review: Periodically, conduct management reviews to evaluate the overall progress of the design and development process. These reviews should include an assessment of the effectiveness of the review process itself.
  11. Continuous Improvement: Use the insights gained from reviews to drive continuous improvement. Identify recurring issues, root causes, and trends, and take proactive measures to prevent similar issues in future projects.
  12. Communication: Communicate the results of the reviews, including any identified issues and corrective actions, to all relevant stakeholders, both within and outside the project team.
  13. Documentation: Maintain comprehensive documentation of all review activities, findings, decisions, and corrective actions. This documentation serves as a historical record and can be valuable for future reference and auditing.

By conducting regular reviews of the design and development process and evaluating the ability of the results to meet requirements, an organization can ensure that its projects stay on track, adhere to quality standards, and ultimately deliver products, services, or processes that meet customer needs and expectations. These reviews also provide opportunities for continuous improvement and risk mitigation throughout the project life-cycle.

5) Verification activities are conducted to ensure that the design and development outputs meet the input requirements

Verification is a crucial step in the design and development process within a Quality Management System (QMS). It involves checking and confirming that the design and development outputs meet the input requirements and specified criteria. Verification is typically focused on ensuring that the design conforms to the established requirements and standards. Here’s how an organization can conduct verification effectively:

  1. Define Verification Criteria: Start by clearly defining the verification criteria based on the design and development inputs. These criteria should specify what needs to be checked, measured, or tested to confirm that the design outputs meet the requirements.
  2. Document Verification Activities: Document the specific verification activities that will be performed. These activities may include inspections, reviews, tests, simulations, or any other method that can be used to confirm compliance with requirements.
  3. Traceability: Establish traceability between the design and development inputs and the verification activities. This traceability ensures that each requirement or input is addressed during the verification process.
  4. Verification Plan: Develop a verification plan that outlines the scope, objectives, methods, responsibilities, and schedule for verification activities. This plan should be part of the overall design and development plan.
  5. Execute Verification Activities: Conduct the verification activities according to the established plan. This may involve reviewing design documentation, conducting physical inspections, performing laboratory tests, or running simulations, depending on the nature of the project.
  6. Documentation: Maintain comprehensive records of all verification activities, including the results obtained, any non-conformities identified, and the corrective actions taken.
  7. Review and Analysis: Review the verification results to ensure that they meet the defined criteria and that the design outputs align with the input requirements. Analyze any discrepancies or non-conformities to determine their root causes.
  8. Corrective Actions: If non-conformities or discrepancies are identified during verification, take corrective actions to address them. Document these actions, assign responsibilities, and establish timelines for resolution.
  9. Verification of Changes: Whenever changes are made to the design and development, verify that these changes do not negatively impact the design’s ability to meet the input requirements. This includes reviewing and verifying revised design documents.
  10. Verification Reports: Prepare verification reports summarizing the activities, results, and any follow-up actions taken. These reports serve as documented evidence of compliance with the input requirements.
  11. Communication: Communicate the results of verification to relevant stakeholders, including the project team, management, and quality assurance personnel.
  12. Traceability and Compliance: Ensure that there is traceability between the verified design outputs and the design and development inputs, demonstrating compliance with requirements.
  13. Continuous Improvement: Use insights gained from verification activities to identify opportunities for process improvement, risk mitigation, and enhancing the quality of future design and development projects.
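As a sketch of items 6 and 10 above, verification records can be kept as structured data so that reports are consistent and auditable. The record below captures the fields the clause expects to be retained (identification of the design, method, date and the individual performing the verification, plus the result); the field names and entries are illustrative only.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class VerificationRecord:
    """Retained evidence that a design output was checked against its input requirement."""
    design_id: str      # identification of the design under verification
    requirement: str    # the input requirement being verified
    method: str         # e.g. test, inspection, alternative calculation
    performed_by: str   # individual(s) performing the verification
    performed_on: date
    conforms: bool      # did the output meet the input requirement?
    notes: str = ""

# Hypothetical records for illustration.
records = [
    VerificationRecord("DWG-104 rev B", "REQ-002 IP65 ingress protection",
                       "water-jet test per IEC 60529", "A. Checker",
                       date(2024, 3, 14), conforms=True),
    VerificationRecord("SPEC-021 rev A", "REQ-001 operating temperature",
                       "alternative calculation", "B. Verifier",
                       date(2024, 3, 18), conforms=False,
                       notes="Margin at 60 C insufficient; raised NCR-12"),
]

# Non-conformities feed the corrective-action process (clause 8.3.4 e).
for r in (r for r in records if not r.conforms):
    print(f"Non-conformity: {r.design_id} vs {r.requirement} - {r.notes}")
```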

Verification is a systematic and methodical process that helps ensure that the design and development outputs are consistent with the input requirements. It plays a critical role in preventing errors and defects early in the development process, ultimately leading to higher-quality products, services, or processes. Verification also provides documented evidence of compliance, which is important for auditing and regulatory purposes within a QMS.

6) Validation activities are conducted to ensure that the resulting products and services meet the requirements for the specified application or intended use

Validation activities are essential to ensure that the resulting products and services meet the specific requirements for their intended application or use. Validation goes beyond verifying that the design conforms to specifications (which is the role of verification) and focuses on confirming that the product or service is fit for its intended purpose. Here’s how an organization can conduct validation activities effectively:

  1. Clearly Define Intended Use: Start by precisely defining the intended use or application of the product or service. This should include a detailed understanding of how it will be used by customers or end-users.
  2. Document Validation Criteria: Document the criteria and performance indicators that will be used to determine whether the product or service meets its intended use. These criteria should be based on user needs, customer requirements, and any relevant standards or regulations.
  3. Validation Planning: Develop a validation plan that outlines the scope, objectives, methods, responsibilities, and schedule for validation activities. This plan should be integrated into the overall project plan and design and development process.
  4. Validation Testing: Conduct validation testing or assessments that simulate or replicate real-world conditions and scenarios. This testing should closely mimic the actual conditions in which the product or service will be used.
  5. User Involvement: Involve end-users or representatives of the target audience in the validation process. Their feedback and insights are invaluable for confirming that the product or service meets their needs and expectations.
  6. Data Collection and Analysis: Collect data during validation testing and analyze it against the predefined validation criteria. Ensure that the product or service consistently performs within acceptable limits.
  7. Documentation: Maintain comprehensive records of all validation activities, including test protocols, test results, observations, and any deviations from the criteria.
  8. Review and Analysis: Review the validation results to determine whether the product or service meets the requirements for the specified application or intended use. Analyze any discrepancies or issues to identify root causes.
  9. Corrective Actions: If any non-conformities or issues are identified during validation, take corrective actions to address them. Document these actions, assign responsibilities, and establish timelines for resolution.
  10. Validation Reports: Prepare validation reports summarizing the activities, results, and any follow-up actions taken. These reports serve as documented evidence of the product or service’s fitness for its intended use.
  11. Communication: Communicate the results of validation to relevant stakeholders, including the project team, management, and quality assurance personnel. Ensure that all parties understand the implications of the validation findings.
  12. Regulatory Compliance: Ensure that the validation activities align with any regulatory requirements or industry standards that apply to the product or service.
  13. Continuous Improvement: Use insights gained from validation activities to identify opportunities for enhancing product or service quality, refining design elements, and improving the overall development process.
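As a small illustration of item 6 (data collection and analysis), validation measurements can be checked mechanically against the predefined acceptance criteria. The parameters, limits and readings below are invented for the example; real criteria come from the validation plan.

```python
# Hypothetical acceptance criteria established before validation testing:
# parameter -> (minimum, maximum) acceptable values under conditions of use.
acceptance_criteria = {
    "flow_rate_l_min": (4.5, 5.5),
    "noise_db": (0.0, 60.0),
    "battery_life_h": (8.0, float("inf")),
}

# Measurements collected during field trials with representative users.
measurements = {
    "flow_rate_l_min": [4.8, 5.1, 4.9],
    "noise_db": [57.2, 61.4, 58.0],
    "battery_life_h": [9.1, 8.4, 8.7],
}

# Every observation must fall within its criterion for the parameter to pass.
for param, (lo, hi) in acceptance_criteria.items():
    failures = [x for x in measurements[param] if not (lo <= x <= hi)]
    verdict = "PASS" if not failures else f"FAIL (out of spec: {failures})"
    print(f"{param}: {verdict}")
```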

Validation is a critical step in ensuring that products and services are not only designed correctly but also perform effectively and safely in real-world situations. It provides assurance that the organization’s offerings meet the needs and expectations of customers and end-users and are suitable for their intended applications. Additionally, validation is essential for regulatory compliance in many industries, such as healthcare and aerospace.

7) Any necessary actions are taken on problems determined during the reviews, or verification and validation activities

Taking necessary actions on problems or issues identified during reviews, verification, and validation activities is a vital aspect of quality management within an organization. These actions are essential to address and rectify any identified discrepancies, non-conformities, or areas of improvement. Here’s how an organization can effectively respond to problems determined during these activities:

  1. Problem Identification and Documentation: Ensure that problems, discrepancies, or non-conformities are clearly identified and documented during reviews, verification, and validation activities. This documentation should include details about the issue, its location, and its potential impact.
  2. Immediate Mitigation: If the problem presents an immediate risk or safety concern, take immediate actions to mitigate the risk and protect stakeholders or users. This may involve halting certain activities or initiating temporary measures.
  3. Root Cause Analysis: Conduct a thorough root cause analysis to determine the underlying reasons for the identified problems. Identify whether the issues are isolated or indicative of systemic problems in processes or procedures.
  4. Corrective Actions: Develop and implement corrective actions to address the root causes of the identified problems. Corrective actions should aim to prevent the recurrence of the issue and ensure that similar problems do not occur in the future.
  5. Responsibility Assignment: Assign responsibilities for implementing corrective actions to specific individuals or teams within the organization. Ensure clear ownership and accountability for resolving the problem.
  6. Timelines and Deadlines: Establish timelines and deadlines for completing corrective actions. Setting specific timeframes ensures that actions are taken promptly and effectively.
  7. Validation of Corrective Actions: Verify that the corrective actions are effective in resolving the identified problems. This may involve retesting, re-validation, or review of the changes made to address the issues.
  8. Preventive Actions: Consider implementing preventive actions to avoid similar problems in the future. These actions are proactive measures designed to identify and eliminate potential issues before they occur.
  9. Communication: Communicate the problem, the corrective actions taken, and their results to all relevant stakeholders. Transparency in communication is crucial for maintaining trust and ensuring that everyone is aware of the actions being taken.
  10. Documentation: Maintain detailed documentation of the entire problem-solving process, including the identification of issues, root cause analysis, corrective actions, and validation of effectiveness. This documentation is valuable for auditing and compliance purposes.
  11. Continuous Improvement: Use the lessons learned from addressing problems to drive continuous improvement in processes, procedures, and the overall quality management system. Encourage a culture of learning and adaptation.
  12. Management Review: Periodically review the organization’s responses to problems during management review meetings to ensure that corrective and preventive actions are effective and aligned with strategic goals.
  13. Training and Skill Development: If problems are related to employee skills or knowledge gaps, provide training and skill development opportunities to prevent similar issues in the future.
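The timelines in item 6 and the effectiveness check in item 7 are easier to enforce when each action is tracked as a record with an owner, a due date and a verification status. The sketch below is one hypothetical way to surface open or overdue actions; the fields and entries are illustrative.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CorrectiveAction:
    """One action arising from a review, verification or validation finding."""
    finding: str             # the documented problem
    root_cause: str          # result of root cause analysis
    action: str              # what will be done to prevent recurrence
    owner: str               # who is accountable
    due: date                # agreed deadline
    effective: bool = False  # set True once the fix is verified as effective

# Hypothetical action register.
register = [
    CorrectiveAction("Thermal margin shortfall (NCR-12)", "heat-sink undersized",
                     "resize heat-sink, rerun thermal calc", "B. Verifier",
                     date(2024, 4, 30)),
    CorrectiveAction("Missing IP65 test record", "test plan omitted enclosure rev B",
                     "update test plan, repeat water-jet test", "A. Checker",
                     date(2024, 4, 15), effective=True),
]

today = date(2024, 5, 1)  # fixed "today" so the example is reproducible
for a in register:
    if not a.effective:
        status = "OVERDUE" if a.due < today else "open"
        print(f"{status}: {a.finding} -> {a.action} (owner {a.owner}, due {a.due})")
```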

Taking necessary actions on identified problems is not only a requirement for maintaining a robust quality management system but also a crucial element in ensuring product or service quality, customer satisfaction, and the organization’s overall success. It demonstrates the organization’s commitment to continuous improvement and its ability to respond effectively to challenges.

8) Documented information of these activities is retained

The documents and records associated with this requirement are essential for managing the design and development process effectively. Here are some of the key documents and records associated with Clause 8.3.4:

Documents:

  1. Design and Development Plan: This document outlines the overall strategy, objectives, scope, and schedule for the design and development process. It provides a roadmap for the entire project.
  2. Design Inputs: These documents specify the requirements, constraints, and criteria that the design must adhere to. They include customer requirements, regulatory requirements, and internal specifications.
  3. Design Outputs: These documents detail the results of the design process, including drawings, specifications, prototypes, and any other relevant design documentation.
  4. Design Reviews: Records of formal design review meetings, including meeting minutes, action items, and decisions made during the review process.
  5. Design Verification Records: Documentation of activities and results related to design verification, such as test reports, test data, and inspection records.
  6. Design Validation Records: Documentation of activities and results related to design validation, which may include user acceptance test reports, validation protocols, and test data.
  7. Design Changes: Records of any changes made to the design and development, including change requests, change orders, and associated documentation.
  8. Design History File (DHF): A comprehensive file that contains all the documentation and records related to the design and development process. It serves as a complete record of the design history.

Records:

  1. Records of Design and Development Activities: These records demonstrate that the design and development process was carried out in accordance with the plan and that all relevant activities were completed.
  2. Records of Design and Development Reviews: Documentation of the outcomes of design and development reviews, including findings, decisions, and action items.
  3. Records of Design and Development Verification: Evidence that design verification activities were conducted and that the results met the defined criteria.
  4. Records of Design and Development Validation: Evidence that design validation activities were conducted, including user acceptance testing and validation results.
  5. Records of Design Changes: Documentation of any changes to the design, including reasons for the changes, approvals, and the impact on the project.
  6. Records of Training and Competence: Records demonstrating that personnel involved in the design and development process are appropriately trained and competent to perform their roles.
  7. Records of Configuration Management: Documentation of changes made to design documents and the management of different versions and revisions.
  8. Records of Customer and Stakeholder Communication: Records of communication with customers, stakeholders, and relevant external parties regarding design and development activities.
  9. Records of Corrective and Preventive Actions: Documentation of any corrective and preventive actions taken in response to problems, non-conformities, or issues identified during the design and development process.

These documents and records serve as evidence of compliance with ISO 9001:2015 Clause 8.3.4 and demonstrate that the organization has effectively controlled its design and development processes to meet customer requirements and deliver high-quality products, services, or processes. Proper documentation and record-keeping are also essential for audits and assessments of the quality management system.

9) Design and development reviews, verification and validation have distinct purposes.

Design and development reviews, verification, and validation are distinct activities within the design and development process, each serving a specific purpose in ensuring the quality and suitability of the final product, service, or process. Here is an overview of their distinct purposes:

  1. Design and Development Reviews:
    • Purpose: Design and development reviews are conducted to evaluate the progress, completeness, and compliance of the design and development process with established plans, requirements, and objectives.
    • Focus: These reviews assess the overall progress and direction of the project, ensuring that it is on track and aligned with customer requirements and organizational goals.
    • Key Elements: Design and development reviews typically involve the examination of design documentation, project milestones, and compliance with design standards and guidelines.
    • Outcome: The outcome of these reviews is a clear understanding of the project’s status and any potential issues or deviations that need to be addressed. Decisions may be made to adjust the project’s course, allocate additional resources, or make changes based on the findings.
  2. Verification:
    • Purpose: Verification activities aim to confirm that the design outputs (e.g., specifications, plans, prototypes) meet the specified design inputs and requirements. Verification ensures that the product or service is being built correctly.
    • Focus: Verification focuses on examining the design and development artifacts to ensure they are consistent with the established requirements and standards. It verifies that the design has been executed accurately.
    • Key Elements: Verification may involve inspections, reviews, tests, and checks to ensure that the design documentation and intermediate outputs conform to the predefined criteria.
    • Outcome: The outcome of verification is evidence that the design outputs align with the design inputs. It provides assurance that the design is on track and meets the specified criteria, reducing the risk of errors or deviations. (A traceability sketch contrasting verification and validation evidence follows this list.)
  3. Validation:
    • Purpose: Validation activities are performed to ensure that the final product, service, or process meets the actual needs of users and is fit for its intended use or application.
    • Focus: Validation goes beyond verifying the correctness of the design; it assesses whether the product or service fulfills its intended purpose in real-world conditions. It confirms that the design outputs are effective in practice.
    • Key Elements: Validation typically involves testing in realistic scenarios or with actual users to assess performance, usability, functionality, and overall satisfaction.
    • Outcome: The outcome of validation is confirmation that the product or service is suitable for its intended use. It helps ensure that the design has successfully translated into a solution that meets user needs and expectations.
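The distinction can be made tangible with a small traceability check: verification evidence links each design input to proof that the output meets it, while validation evidence links it to proof of fitness for use. The data and structure below are assumptions chosen for illustration, not a prescribed method:

```python
# Hypothetical traceability data: design input id -> linked evidence.
design_inputs = {
    "REQ-001": "Max weight 2.0 kg",
    "REQ-002": "Battery life >= 8 h",
}
verification_evidence = {
    "REQ-001": "TEST-014 weight measurement report",
    "REQ-002": None,  # gap: no verification evidence yet
}
validation_evidence = {
    "REQ-001": "UAT-003 field trial report",
    "REQ-002": "UAT-003 field trial report",
}

def missing_evidence(inputs, evidence):
    """Design inputs with no linked evidence -- gaps that clause item e) requires acting on."""
    return [req_id for req_id in inputs if not evidence.get(req_id)]

print("Unverified inputs:", missing_evidence(design_inputs, verification_evidence))
print("Unvalidated inputs:", missing_evidence(design_inputs, validation_evidence))
# -> Unverified inputs: ['REQ-002']
# -> Unvalidated inputs: []
```

A gap in either column means a different kind of risk: missing verification means the design may not meet its inputs; missing validation means it may not work for its intended use even if it does.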

Here are examples of design and development reviews, verification, and validation activities in various industries and contexts:

Design and Development Reviews:

  1. Software Development:
    • Example: Code Review
    • Description: A team of developers conducts a code review to assess the quality, correctness, and adherence to coding standards of a software module or program.
  2. Automotive Engineering:
    • Example: Design Review for a New Vehicle Model
    • Description: Engineers and stakeholders review the design of a new vehicle model to ensure that it meets safety standards, performance criteria, and market demands.
  3. Product Design:
    • Example: Industrial Design Review
    • Description: A design team reviews the aesthetics, ergonomics, and user-friendliness of a product to ensure it aligns with the intended market and user preferences.

Verification:

  1. Manufacturing:
    • Example: Inspection of Machined Parts
    • Description: Quality inspectors use measuring instruments and visual inspections to verify that machined parts meet specified dimensions and tolerances. (A tolerance-check sketch follows this list.)
  2. Construction:
    • Example: Structural Integrity Verification
    • Description: Structural engineers conduct tests and inspections to verify that a building’s construction meets design specifications and safety standards.
  3. Pharmaceuticals:
    • Example: Laboratory Testing
    • Description: Pharmaceutical companies perform laboratory tests to verify that drug formulations meet potency, purity, and stability requirements.
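As a concrete illustration of the machined-parts example above, a dimensional verification check reduces to comparing each measured value against the specified nominal and tolerance. The figures below are invented for illustration:

```python
def within_tolerance(measured_mm, nominal_mm, tol_mm):
    """Dimensional verification: is the measured value inside nominal +/- tolerance?"""
    return abs(measured_mm - nominal_mm) <= tol_mm

# Invented measurements of a machined shaft, nominal 25.00 mm +/- 0.05 mm.
samples = [25.02, 24.97, 25.06]
print([within_tolerance(m, 25.00, 0.05) for m in samples])
# -> [True, True, False]: the third part fails and must be dispositioned.
```

A failed check at this point is exactly the kind of problem on which clause item e) requires necessary action, whether rework, concession, or a design change.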

Validation:

  1. Medical Device Manufacturing:
    • Example: Usability Testing for a New Medical Device
    • Description: Healthcare professionals or representative users perform usability testing on a medical device, often under simulated conditions of use, to validate that it can be used effectively and safely in clinical settings.
  2. Software Development:
    • Example: User Acceptance Testing (UAT)
    • Description: End-users or client representatives conduct UAT to validate that a software application meets their specific needs and performs as expected in their operational environment. (A minimal UAT sketch follows this list.)
  3. Aerospace Engineering:
    • Example: Flight Testing for an Aircraft
    • Description: Aircraft manufacturers conduct flight tests to validate the performance, safety, and reliability of a new aircraft design under real flight conditions.
  4. Food Industry:
    • Example: Taste Testing for a New Food Product
    • Description: Sensory experts or consumers participate in taste tests to validate that a new food product meets taste, texture, and flavor expectations.
  5. Automotive Manufacturing:
    • Example: Crash Testing
    • Description: Automotive manufacturers perform crash tests to validate the safety and structural integrity of vehicles in collision scenarios, ensuring they meet regulatory standards.
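To illustrate the UAT example above, acceptance criteria agreed with end users can be expressed directly as executable tests. The sketch below assumes pytest as the test runner; `submit_order` is a hypothetical stand-in for the application under test, not a real API:

```python
# test_uat_orders.py -- illustrative UAT criteria; run with pytest.
def submit_order(cart, payment_method):
    """Hypothetical stand-in for the system under test."""
    if not cart:
        return {"status": "rejected", "reason": "empty cart"}
    return {"status": "confirmed", "items": len(cart)}

def test_customer_can_place_an_order():
    # Acceptance criterion agreed with end users: a populated cart and a
    # supported payment method must yield a confirmed order.
    result = submit_order(["SKU-100", "SKU-200"], "card")
    assert result["status"] == "confirmed"
    assert result["items"] == 2

def test_empty_cart_is_rejected():
    # Negative criterion: the system must never confirm an empty order.
    assert submit_order([], "card")["status"] == "rejected"
```

Writing the criteria as tests has the side benefit that the test report itself becomes the retained validation record required by the clause.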

These examples illustrate how design and development reviews, verification, and validation activities are applied across various industries to ensure that products, services, or processes meet requirements, safety standards, and user needs. The specific methods and criteria used may vary depending on the industry and the nature of the project.

In summary, design and development reviews, verification, and validation are distinct but complementary activities within the design and development process. Reviews provide a high-level assessment of project progress and alignment with requirements. Verification confirms that the design outputs align with design inputs and meet specified criteria. Validation ensures that the final product or service is effective, usable, and suitable for its intended purpose. Together, these activities help organizations achieve their design and development goals while delivering high-quality and customer-centric solutions.

10) Design and development reviews, verification and validation can be conducted separately or in any combination, as is suitable for the products and services of the organization.

ISO 9001:2015 acknowledges that design and development reviews, verification, and validation can be conducted separately or in any combination that is suitable for the products and services of the organization. The standard recognizes that the specific approach to design and development controls may vary depending on the nature of the project, the complexity of the product or service, and the organization’s processes and needs. This flexibility allows organizations to tailor their design and development processes to best suit their unique circumstances. Here’s how organizations can choose to apply these activities:

  1. Separately: Some organizations may choose to conduct design and development reviews, verification, and validation as distinct, sequential steps in the project lifecycle. For example, they might conduct design reviews first to ensure that the design aligns with requirements, followed by verification to confirm accuracy, and finally validation to ensure the product meets user needs.
  2. In Combination: Others may choose to integrate these activities or conduct them concurrently. For instance, they might conduct design reviews while simultaneously performing verification and validation activities. This can streamline the development process and help identify issues earlier in the project. (A stage-gate sketch of combined controls follows this list.)
  3. Tailored Approach: Organizations can tailor their approach based on the specific requirements and risks associated with the product or service. For high-risk or complex projects, a more comprehensive and integrated approach might be appropriate, while simpler projects may benefit from a more streamlined process.
  4. Iterative Process: In iterative development methodologies like Agile or Scrum, design and development reviews, verification, and validation activities are conducted continuously throughout the development cycle. This iterative approach allows for frequent feedback and adjustments, which is particularly beneficial for software development.
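One lightweight way to express such a tailored combination is a stage-gate plan recording which controls run at which gate, plus a completeness check. The gate names and data structure below are assumptions chosen for illustration; any layout that keeps the same information would do:

```python
# Hypothetical stage-gate plan: which controls run at which gate.
# Any combination satisfies the clause, provided records are retained.
design_plan = {
    "preliminary design review": ["review"],
    "detail design review": ["review", "verification"],
    "pre-production gate": ["verification", "validation"],
}

def all_controls_covered(plan):
    """Check that review, verification, and validation each occur somewhere."""
    required = {"review", "verification", "validation"}
    applied = {control for controls in plan.values() for control in controls}
    return required <= applied

assert all_controls_covered(design_plan)
print("All three control types are planned at least once.")
```

The check is deliberately simple: the clause does not mandate where each control happens, only that all of them happen and leave evidence.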

The key is to ensure that, regardless of the approach chosen, the organization maintains clear documentation, traceability, and records of these activities to demonstrate compliance with ISO 9001:2015 requirements. Additionally, the organization should have a risk-based approach that aligns with its quality objectives and customer expectations. Ultimately, the organization’s choice of how to conduct design and development controls should be driven by its specific context, the nature of the products or services, and the need to meet quality standards and customer requirements effectively.
