BS 8611:2016, ethical design and application of robots
Posted to News on 4th Oct 2017, 15:54

BSI has published British Standard BS 8611:2016, Robots and robotic devices. Guide to the ethical design and application of robots and robotic systems. The following article looks at the new standard's contents and its implications for robot suppliers and integrators.

As robots and robotic devices are used more widely in industrial and non-industrial environments, there is an increasing recognition of the ways in which they have an impact on the people with whom they are sharing space and tasks. The Foreword acknowledges that the technology is rapidly evolving so, although the standard writers have endeavoured to anticipate future circumstances, there are some areas, such as 'non-embodied autonomous systems,' that have not been addressed.

Furthermore, the Foreword explains that BS 8611 provides guidance and recommendations and should not be quoted as if it were a specification or code of practice; claims of compliance with BS 8611 cannot be made.


BS 8611 helps users identify potential ethical harm and presents guidelines for safe design, protective measures and information for the design and application of robots. Note that safety requirements are already covered by existing standards for industrial, personal care and medical robots; BS 8611 provides guidance to help eliminate or reduce the ethical hazards arising from the use of robots. Readers are reminded that most physical hazards can result in fear and stress - which are psychological hazards - so safety design features are, inherently, part of the ethical design.

Normative references

Three standards are listed, relating to machine risk assessment and risk reduction, risk management, and vocabulary for robots and robotic devices. Perhaps surprisingly, other robot standards are not listed here, though there is one in the bibliography.

Terms and definitions

First, the reader is referred to the terms and definitions in BS ISO 8373 (Robots and robotic devices. Vocabulary), then there is a series of definitions for various terms, including ethical harm, ethical hazard, ethical risk and ethics ('common understanding of principles that constrain and guide human behaviour').

Ethical risk assessment

This clause (4) considers ethical issues under four headings: societal, application, commercial/financial and environmental. A table extending over four pages presents examples of ethical hazards together with their associated ethical risks and mitigation measures. Helpful comments are also included, together with examples of how the mitigation can be verified/validated. For instance, here is one ethical hazard that could be associated with a robot application:


Ethical hazard: Inappropriate 'trust' of a human by a robot
Ethical risk: Malign or inadequate human control
Mitigation: Model of appropriate human control
Comment: Design robot to reach safe state with respect to any other tasks the robot is executing
Verification/validation: Software verification; expert guidance


As well as the table, Clause 4 includes subclauses that provide more detailed guidance on ethical hazard identification, ethical risk assessment and learning robots. In 4.2, Ethical hazard identification, reference is made to groups of humans or animals that are likely to be affected by a new robot or application, even though the definitions do not refer specifically to animals. Other standards for risk assessment are also referred to in subclause 4.2, namely BS EN ISO 12100:2010 for machines, and BS EN ISO 14971 for medical devices. In subclause 4.3, Ethical risk assessments, readers are told they should analyse the data collected in the hazard identification process qualitatively and/or quantitatively when determining the ethical risk associated with the ethical hazard. At the end of the risk assessment, BS 8611 states that 'As a general principle, the ethical risk of a robot should not be higher than the risk of a human operator performing the same action.'
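The qualitative/quantitative analysis described in subclause 4.3 can be illustrated with a minimal sketch. To be clear, BS 8611 prescribes no scoring scheme; the severity-times-likelihood matrix below is borrowed from common machine risk assessment practice, and all hazard names, scales and values are invented for illustration:

```python
# Illustrative only: BS 8611 does not prescribe a scoring scheme.
# A simple severity x likelihood matrix, as commonly used in
# BS EN ISO 12100-style machine risk assessment.
from dataclasses import dataclass

@dataclass
class EthicalHazard:
    name: str
    severity: int    # 1 (negligible) .. 4 (severe) - invented scale
    likelihood: int  # 1 (rare) .. 4 (frequent) - invented scale

def ethical_risk(h: EthicalHazard) -> int:
    """Quantitative risk score for a hazard: severity x likelihood."""
    return h.severity * h.likelihood

def acceptable(robot: EthicalHazard, human_baseline: EthicalHazard) -> bool:
    """BS 8611's general principle: the robot's ethical risk should not
    exceed that of a human operator performing the same action."""
    return ethical_risk(robot) <= ethical_risk(human_baseline)

robot = EthicalHazard("loss of privacy (robot camera)", severity=3, likelihood=2)
human = EthicalHazard("loss of privacy (human carer)", severity=3, likelihood=2)
print(ethical_risk(robot))       # 6
print(acceptable(robot, human))  # True
```

In practice the standard leaves the choice of qualitative or quantitative method to the assessor; the point of the sketch is only that the output of hazard identification feeds a comparison against a human-operator baseline.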

Learning robots (discussed in subclause 4.4) can have their 'learning' functionality categorised in three classes: environmental, performance enhancement and strategy. Because robots that are otherwise identical may learn in different ways, and therefore act differently from each other and pose different risks, BS 8611 states that designers and operators should have the means to ensure that such changes in individual robots are readily identifiable.
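One way to make such changes individually identifiable is an audit trail of learned-state fingerprints per robot. This is purely an illustrative sketch of the idea, not anything the standard specifies; the class names follow subclause 4.4, but every field and mechanism here is invented:

```python
# Sketch: making a learning robot's behavioural changes individually
# identifiable. Each robot records a fingerprint of its learned state
# every time it learns, so two 'identical' robots that have learned
# differently can be told apart. All details are illustrative.
import hashlib
import json

class LearningRobot:
    # The three learning classes named in BS 8611 subclause 4.4
    CLASSES = ("environmental", "performance enhancement", "strategy")

    def __init__(self, serial: str):
        self.serial = serial
        self.learned_state: dict = {}
        self.audit_trail: list[tuple[str, str]] = []

    def learn(self, learning_class: str, update: dict) -> None:
        if learning_class not in self.CLASSES:
            raise ValueError(f"unknown learning class: {learning_class}")
        self.learned_state.update(update)
        # Fingerprint the whole learned state so each change is traceable
        fingerprint = hashlib.sha256(
            json.dumps(self.learned_state, sort_keys=True).encode()
        ).hexdigest()[:12]
        self.audit_trail.append((learning_class, fingerprint))

a = LearningRobot("RBT-001")
b = LearningRobot("RBT-002")  # 'identical' hardware, different experience
a.learn("environmental", {"map": "ward-3"})
b.learn("strategy", {"grip_force": 0.7})
print(a.audit_trail != b.audit_trail)  # True: the divergence is identifiable
```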

Ethical guidelines and measures

This clause (5) draws on the EPSRC (Engineering and Physical Sciences Research Council) document Principles of robotics and other sources. In addition, this clause lists 16 examples of organisations that can help developers of robotics to engage with the public to ensure that ethically acceptable robots are developed and deployed.

Ethics-related system design recommendations

Clause 6 addresses the general principles for ethical design and is, in some ways, similar to safety-related system design. First, a direct reference is made to BS EN ISO 12100:2010 and the point is made that robots should be designed in accordance with the principles of this standard for all hazards, including ethical hazards. The robot should be designed so that 'the ethical risks of identified hazards are as low as reasonably practicable.'

Furthermore, subclause 6.1.2 calls for 'inherently ethical design' and, when the implementation of inherently ethical design measures is not practicable, subclause 6.1.3 says that robot users should be protected by safeguards and protective measures. Subclause 6.3 (Protection against the perception of harm) requires an ethical risk assessment to be made to determine the perception of harm, as perceived harm can trigger fear and stress.

Verification and validation

Clause 7 outlines the importance of verification and validation (and explains the difference between the two). However, it is recognised that an ethical risk assessment for an intended use application requires a precise ethical specification for the robot, and this may be difficult to obtain. In addition, the validation of a robot's ethics could well produce different outcomes with different users. Subclause 7.1 (General) then lists issues that should be taken into account during verification and validation, but it also highlights the difficulties that are likely to be encountered.

Subclause 7.2 (Suggested approaches) says that techniques for verification and validation that have been developed in industries such as aircraft, automotive and machine tool manufacturing can also be applied to robots - for example, redundant systems and independent safety systems that mitigate the effects of a failure. This subclause mentions the safety requirements for personal care robots (from BS EN ISO 13482) and also methods used by software engineers for automatically verifying whether a system precisely meets its specifications (algorithmic verification techniques and stochastic verification techniques).

Subclause 7.3 (Suggested methods) refers back to the guidelines in clause 4 (Ethical risk assessment) and lists six methods: user validation; software verification; expert guidance; economic and social assessment; legal assessment; and compliance tests. However, this subclause cautions that autonomous systems are prone to do the unexpected, and they can make decisions, so it is necessary to consider both what the system does and why it makes a particular decision.

Also discussed is the option to use an 'ethical verification system' that operates independently from the main control system to monitor the actions in terms of their ethical outcomes. A further extension of this concept is the use of multiple ethical verification systems to monitor, separately and in parallel, privacy, environmental issues, etc. Of course, conflicts between the verification systems then have to be managed somehow.

Information for use

Clause 8 relates only to ethical considerations in terms of information for use.

Because of the situations in which robots may operate, BS 8611 subclause 8.1 (General) recognises that not all 'users' will be able to read the instruction book, take note of warning signs (including pictograms, warning lamps, etc) or hear or understand acoustic warnings. In such cases, care must be taken to ensure that the inability to understand does not give rise to additional risks.

Subclause 8.2 (Markings or indications) lists the types of information that should be provided. Additional guidance about written, pictorial, visual and audible information is also presented.

Subclauses 8.3 and 8.4 cover, respectively, the User manual and the Service manual. These subclauses are as would be expected, with the emphasis on ethics.


BS 8611 is the first edition of a new standard but, because this is a fast-evolving technology, in places the standard feels more like a draft rather than a published standard. Nevertheless, it is a groundbreaking document that could be of great value to those designing robots and robotic systems who want to take ethical issues into account. For the time being BS 8611 is a British Standard, but it is possible that we could see it develop into an international standard in the future.

BS 8611 is available from the BSI online shop as a hard copy or PDF, priced at £79 for members of BSI and £158 for non-members.
