BS EN IEC 62061:2021+A1:2024
$215.11
Safety of machinery. Functional safety of safety-related control systems
Published By | Publication Date | Number of Pages |
---|---|---|
BSI | 2024 | 154 |
PDF Catalog
PDF Pages | PDF Title |
---|---|
12 | CONTENTS |
18 | FOREWORD |
20 | INTRODUCTION |
21 | 1 Scope |
22 | 2 Normative references Figures Figure 1 – Scope of this document |
23 | 3 Terms, definitions and abbreviations 3.1 Alphabetical list of definitions Tables Table 1 – Terms used in IEC 62061 |
25 | 3.2 Terms and definitions |
38 | 3.3 Abbreviations 4 Design process of an SCS and management of functional safety 4.1 Objective Table 2 – Abbreviations used in IEC 62061 |
39 | 4.2 Design process Figure 2 – Integration within the risk reduction process of ISO 12100 (extract) |
40 | Figure 3 – Iterative process for design of the safety-related control system |
41 | 4.3 Management of functional safety using a functional safety plan Figure 4 – Example of a combination of subsystems as one SCS |
43 | 4.4 Configuration management 4.5 Modification |
44 | 5 Specification of a safety function 5.1 Objective 5.2 Safety requirements specification (SRS) 5.2.1 General 5.2.2 Information to be available |
45 | 5.2.3 Functional requirements specification 5.2.4 Estimation of demand mode of operation |
46 | 5.2.5 Safety integrity requirements specification Figure 5 – By activating a low demand safety function at least once per year it can be assumed to be high demand Table 3 – SIL and limits of PFH values |
47 | 6 Design of an SCS 6.1 General 6.2 Subsystem architecture based on top down decomposition 6.3 Basic methodology – Use of subsystem 6.3.1 General |
48 | 6.3.2 SCS decomposition |
49 | 6.3.3 Sub-function allocation 6.3.4 Use of a pre-designed subsystem Figure 6 – Examples of typical decomposition of a safety function into sub-functions and its allocation to subsystems |
50 | 6.4 Determination of safety integrity of the SCS 6.4.1 General 6.4.2 PFH Figure 7 – Example of safety integrity of a safety function based on allocated subsystems as one SCS Table 4 – Required SIL and PFH of pre-designed subsystem |
51 | 6.5 Requirements for systematic safety integrity of the SCS 6.5.1 Requirements for the avoidance of systematic hardware failures |
52 | 6.5.2 Requirements for the control of systematic faults |
53 | 6.6 Electromagnetic immunity 6.7 Software based manual parameterization 6.7.1 General 6.7.2 Influences on safety-related parameters |
54 | 6.7.3 Requirements for software based manual parameterization |
55 | 6.7.4 Verification of the parameterization tool 6.7.5 Performance of software based manual parameterization 6.8 Security aspects |
56 | 6.9 Aspects of periodic testing 7 Design and development of a subsystem 7.1 General |
57 | 7.2 Subsystem architecture design Table 5 – Relevant information for each subsystem |
58 | 7.3 Requirements for the selection and design of subsystem and subsystem elements 7.3.1 General 7.3.2 Systematic integrity |
61 | 7.3.3 Fault consideration and fault exclusion |
62 | 7.3.4 Failure rate of subsystem element |
65 | 7.4 Architectural constraints of a subsystem 7.4.1 General |
66 | 7.4.2 Estimation of safe failure fraction (SFF) Table 6 – Architectural constraints on a subsystem: maximum SIL that can be claimed for an SCS using the subsystem |
68 | 7.4.3 Behaviour (of the SCS) on detection of a fault in a subsystem |
69 | 7.4.4 Realization of diagnostic functions 7.5 Subsystem design architectures 7.5.1 General |
70 | 7.5.2 Basic subsystem architectures Figure 8 – Basic subsystem architecture A logical representation Figure 9 – Basic subsystem architecture B logical representation Figure 10 – Basic subsystem architecture C logical representation |
71 | 7.5.3 Basic requirements Figure 11 – Basic subsystem architecture D logical representation Table 7 – Overview of basic requirements and interrelation to basic subsystem architectures |
72 | 7.6 PFH of subsystems 7.6.1 General 7.6.2 Methods to estimate the PFH of a subsystem 7.6.3 Simplified approach to estimation of contribution of common cause failure (CCF) 8 Software 8.1 General |
73 | 8.2 Definition of software levels Table 8 – Different levels of application software |
74 | 8.3 Software – Level 1 8.3.1 Software safety lifecycle – SW level 1 Figure 12 – V-model for SW level 1 Figure 13 – V-model for software modules customized by the designer for SW level 1 |
75 | 8.3.2 Software design – SW level 1 |
77 | 8.3.3 Module design – SW level 1 8.3.4 Coding – SW level 1 |
78 | 8.3.5 Module test – SW level 1 8.3.6 Software testing – SW level 1 |
79 | 8.3.7 Documentation – SW level 1 8.3.8 Configuration and modification management process – SW level 1 |
80 | 8.4 Software level 2 8.4.1 Software safety lifecycle – SW level 2 Figure 14 – V-model of software safety lifecycle for SW level 2 |
81 | 8.4.2 Software design – SW level 2 |
83 | 8.4.3 Software system design – SW level 2 8.4.4 Module design – SW level 2 |
84 | 8.4.5 Coding – SW level 2 |
85 | 8.4.6 Module test – SW level 2 8.4.7 Software integration testing – SW level 2 8.4.8 Software testing – SW level 2 |
86 | 8.4.9 Documentation – SW level 2 |
87 | 8.4.10 Configuration and modification management process – SW level 2 9 Validation 9.1 Validation principles |
89 | Figure 15 – Overview of the validation process |
90 | 9.1.1 Validation plan 9.1.2 Use of generic fault lists 9.1.3 Specific fault lists |
91 | 9.1.4 Information for validation 9.1.5 Validation record |
92 | 9.2 Analysis as part of validation 9.2.1 General 9.2.2 Analysis techniques 9.2.3 Verification of safety requirements specification (SRS) |
93 | 9.3 Testing as part of validation 9.3.1 General 9.3.2 Measurement accuracy |
94 | 9.3.3 More stringent requirements 9.3.4 Test samples 9.4 Validation of the safety function 9.4.1 General |
95 | 9.4.2 Analysis and testing 9.5 Validation of the safety integrity of the SCS 9.5.1 General 9.5.2 Validation of subsystem(s) |
96 | 9.5.3 Validation of measures against systematic failures 9.5.4 Validation of safety-related software |
97 | 9.5.5 Validation of combination of subsystems 10 Documentation 10.1 General 10.2 Technical documentation |
98 | Table 9 – Documentation of an SCS |
99 | 10.3 Information for use of the SCS 10.3.1 General 10.3.2 Information for use given by the manufacturer of subsystems |
100 | 10.3.3 Information for use given by the SCS integrator |
102 | Annexes Annex A (informative) Determination of required safety integrity A.1 General A.2 Matrix assignment for the required SIL A.2.1 Hazard identification/indication A.2.2 Risk estimation Figure A.1 – Parameters used in risk estimation |
103 | A.2.3 Severity (Se) A.2.4 Probability of occurrence of harm A.2.4.1 General A.2.4.2 Frequency and duration of exposure Table A.1 – Severity (Se) classification |
104 | A.2.4.3 Probability of occurrence of a hazardous event Table A.2 – Frequency and duration of exposure (Fr) classification |
105 | A.2.4.4 Probability of avoiding or limiting harm (Av) Table A.3 – Probability (Pr) classification |
106 | A.2.5 Class of probability of harm (Cl) A.2.6 SIL assignment Table A.4 – Probability of avoiding or limiting harm (Av) classification Table A.5 – Parameters used to determine class of probability of harm (Cl) |
107 | Table A.6 – Matrix assignment for determining the required SIL (or PLr) for a safety function |
108 | A.3 Overlapping hazards Figure A.2 – Example proforma for SIL assignment process |
109 | Annex B (informative) Example of SCS design methodology B.1 General B.2 Safety requirements specification B.3 Decomposition of the safety function Table B.1 – Safety requirements specification – example of overview |
110 | B.4 Design of the SCS by using subsystems B.4.1 General B.4.2 Subsystem 1 design – “guard door monitoring” B.4.2.1 Architectural constraints Figure B.1 – Decomposition of the safety function Figure B.2 – Overview of design of the subsystems of the SCS |
111 | B.4.2.2 Evaluation of SFF |
112 | B.4.2.3 Evaluation of DCI1 and DCI2 B.4.2.4 Evaluation of PFH B.4.3 Subsystem 2 design – “evaluation logic” |
113 | B.4.4 Subsystem 3 design – “motor control” B.4.4.1 Architectural constraints B.4.4.2 Evaluation of PFH B.4.5 Evaluation of the SCS B.4.5.1 Target |
114 | B.4.5.2 Systematic integrity and CCF B.4.5.3 Architectural constraints B.4.6 PFH B.5 Verification B.5.1 General B.5.2 Analysis Table B.2 – Systematic integrity – example of overview |
115 | B.5.3 Tests Table B.3 – Verification by tests |
116 | Annex C (informative) Examples of MTTFD values for single components C.1 General C.2 Good engineering practices method C.3 Hydraulic components |
117 | C.4 MTTFD of pneumatic, mechanical and electromechanical components Table C.1 – Standards references and MTTFD or B10D values for components |
119 | Annex D (informative) Examples for diagnostic coverage (DC) Table D.1 – Estimates for diagnostic coverage (DC) (1 of 2) |
121 | Annex E (informative) Methodology for the estimation of susceptibility to common cause failures (CCF) E.1 General E.2 Methodology E.2.1 Requirements for CCF E.2.2 Estimation of effect of CCF |
122 | Table E.1 – Estimation of CCF factor (β) |
123 | Table E.2 – Criteria for estimation of CCF |
124 | Annex F (informative) Guideline for software level 1 F.1 Software safety requirements Table F.1 – Example of relevant documents related to the simplified V-model |
125 | F.2 Coding guidelines Table F.2 – Examples of coding guidelines |
126 | F.3 Specification of safety functions Figure F.1 – Plant sketch |
127 | F.4 Specification of hardware design Table F.3 – Specified safety functions |
128 | Table F.4 – Relevant list of input and output signals |
129 | F.5 Software system design specification Figure F.2 – Principal module architecture design |
130 | Figure F.3 – Principal design approach of logical evaluation |
131 | F.6 Protocols Figure F.4 – Example of logical representation (program sketch) Table F.5 – Example of simplified cause and effect matrix |
132 | Table F.6 – Verification of software system design specification Table F.7 – Software code review |
133 | Table F.8 – Software validation |
134 | Annex G (informative) Examples of safety functions Table G.1 – Examples of typical safety functions |
135 | Annex H (informative) Simplified approaches to evaluate the PFH value of a subsystem H.1 Table allocation approach |
136 | Table H.1 – Allocation of PFH value of a subsystem |
137 | H.2 Simplified formulas for the estimation of PFH H.2.1 General H.2.2 Basic subsystem architecture A: single channel without a diagnostic function Figure H.1 – Basic subsystem architecture A logical representation Table H.2 – Relationship between B10D, operations and MTTFD |
138 | H.2.3 Basic subsystem architecture B: dual channel without a diagnostic function H.2.4 Basic subsystem architecture C: single channel with a diagnostic function H.2.4.1 General Figure H.2 – Basic subsystem architecture B logical representation Figure H.3 – Basic subsystem architecture C logical representation |
139 | H.2.4.2 External fault handling function Figure H.4 – Correlation of basic subsystem architecture C and the pertinent fault handling function |
140 | H.2.4.3 Fault handling partially or completely done within the subsystem Figure H.5 – Basic subsystem architecture C with external fault handling function |
141 | Figure H.6 – Basic subsystem architecture C with external fault diagnostics Figure H.7 – Basic subsystem architecture C with external fault reaction Figure H.8 – Basic subsystem architecture C with internal fault diagnostics and internal fault reaction |
142 | Table H.3 – Minimum value of 1/λD,FH for the applicability of PFH equation (H.3) |
143 | H.2.5 Basic subsystem architecture D: dual channel with a diagnostic function(s) Figure H.9 – Basic subsystem architecture D logical representation |
144 | H.3 Parts count method |
145 | Annex I (informative) The functional safety plan and design activities I.1 General I.2 Example of a machine design plan including a safety plan I.3 Example of activities, documents and roles Figure I.1 – Example of a machine design plan including a safety plan |
146 | Figure I.2 – Example of activities, documents and roles (1 of 2) |
148 | Annex J (informative) Independence for reviews and testing/verification/validation activities J.1 Software design J.2 Validation Table J.1 – Minimum levels of independence for review, testing and verification activities Table J.2 – Minimum levels of independence for validation activities |
150 | Bibliography |
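
The clauses listed above for safe failure fraction (7.4.2) and the simplified PFH approaches (Annex H, including Table H.2 on B10D, operations and MTTFD) come down to a handful of arithmetic relationships. The sketch below is a minimal illustration of that kind of calculation; it is not text from the standard, the component values are hypothetical, and the 0.1 × n_op conversion from B10D to MTTFD and the SFF definition are the commonly used relationships assumed here for illustration.

```python
# Illustrative sketch only -- not reproduced from BS EN IEC 62061.
# Assumes the widely used relationships MTTFD = B10D / (0.1 * n_op) and
# SFF = (sum of safe + dangerous-detected failure rates) / total failure rate.
# All numeric inputs are hypothetical example values.

def mttfd_from_b10d(b10d_cycles: float, n_op_per_year: float) -> float:
    """Mean time to dangerous failure in years, from B10D and annual operations."""
    return b10d_cycles / (0.1 * n_op_per_year)

def dangerous_failure_rate(mttfd_years: float) -> float:
    """Dangerous failure rate lambda_D in 1/h, taking 8760 h per year."""
    return 1.0 / (mttfd_years * 8760.0)

def safe_failure_fraction(lambda_s: float, lambda_dd: float, lambda_du: float) -> float:
    """SFF = (lambda_S + lambda_DD) / (lambda_S + lambda_DD + lambda_DU)."""
    return (lambda_s + lambda_dd) / (lambda_s + lambda_dd + lambda_du)

if __name__ == "__main__":
    # Hypothetical contactor: B10D = 1.3e6 cycles, 8 operations/h, 16 h/day, 230 days/year
    n_op = 8 * 16 * 230
    mttfd = mttfd_from_b10d(1.3e6, n_op)
    lam_d = dangerous_failure_rate(mttfd)
    print(f"MTTFD    = {mttfd:.1f} years")
    print(f"lambda_D = {lam_d:.2e} per hour")
    # Hypothetical split of failure rates (all in 1/h) for an SFF check
    print(f"SFF      = {safe_failure_fraction(5e-7, 3e-7, 2e-7):.2f}")
```

With these hypothetical inputs the sketch gives an MTTFD of roughly 440 years, a lambda_D on the order of 2.6 × 10⁻⁷ per hour, and an SFF of 0.80; the standard itself defines how such figures feed the architectural constraints (Table 6) and the simplified PFH estimates of Annex H.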