Silicon Valley Robotics has launched a Good Robot Design Council with its “5 Laws of Robotics”:


  • Robots should not be designed as weapons.
  • Robots should comply with existing law, including privacy.
  • Robots are products: as such, they should be safe and reliable, and should not misrepresent their capabilities.
  • Robots are manufactured artifacts: the illusion of emotions and agency should not be used to exploit vulnerable users.
  • It should be possible to find out who is responsible for any robot.


  1 Robots are multi-use tools. Robots should not be designed solely or primarily to kill or harm humans, except in the interests of national security. Robots should not be designed as weapons, except for national security reasons. Tools have more than one use. We allow guns to be designed which farmers use to kill pests and vermin, but killing human beings with them (outside warfare) is clearly wrong. Knives can be used to spread butter or to stab people. In most societies, neither guns nor knives are banned, but controls may be imposed if necessary (e.g. gun laws) to secure public safety. Robots also have multiple uses. Although a creative end-user could probably use any robot for violent ends, just as with a blunt instrument, we are saying that robots should never be designed solely, or even principally, to be used as weapons with deadly or other offensive capability. This law, if adopted, limits the commercial capacities of robots, but we view it as an essential principle for their acceptance as safe in civil society.
2 Humans, not robots, are responsible agents. Robots should be designed and operated, as far as is practicable, to comply with existing laws and fundamental rights and freedoms, including privacy. Robots should be designed and operated to comply with existing law, including privacy. We can make sure that robot actions are designed to obey the laws humans have made.

There are two important points here. First, of course, no one is likely to deliberately set out to build a robot which breaks the law. But designers are not lawyers and need to be reminded that building robots which do their tasks as well as possible will sometimes need to be balanced against protective laws and accepted human rights standards. Privacy is a particularly difficult issue, which is why it is mentioned. For example, a robot used in the care of a vulnerable individual may well be usefully designed to collect information about that person 24/7 and transmit it to hospitals for medical purposes. But the benefit of this must be balanced against that person's right to privacy and to control their own life, e.g. by refusing treatment. Data collected should only be kept for a limited time; again, the law puts certain safeguards in place. Robot designers have to think about how laws like these can be respected during the design process (e.g. by providing off-switches).

Secondly, this law is designed to make it clear that robots are just tools, designed to achieve goals and desires that humans specify. Users and owners have responsibilities as well as designers and manufacturers. Sometimes it is up to designers to think ahead, because robots may have the ability to learn and adapt their behaviour. But users may also make robots do things their designers did not foresee. Sometimes it is the owner's job to supervise the user (e.g. if a parent bought a robot to play with a child). But if a robot's actions do turn out to break the law, it will always be the responsibility, legal and moral, of one or more human beings, not of the robot (we consider how to find out who is responsible under law 5, below).
3 Robots are products. They should be designed using processes which assure their safety and security. Robots are products: as with other products, they should be designed to be safe and secure. Robots are simply not people. They are pieces of technology which their owners may certainly want to protect (just as we have alarms for our houses and cars, and security guards for our factories), but we will always value human safety over that of machines. Our principal aim here was to make sure that the safety and security of robots in society would be assured, so that people can trust and have confidence in them.

This is not a new problem in technology. We already have rules and processes that guarantee that, for example, household appliances and children’s toys are safe to buy and use. There are well-worked-out existing consumer safety regimes to assure this: e.g. industry kite-marks, British and international standards, testing methodologies for software to make sure the bugs are out, etc. We are also aware that the public knows that software and computers can be “hacked” by outsiders, and processes also need to be developed to show that robots are secure, as far as possible, from such attacks. We think that such rules, standards and tests should be publicly adopted or developed for the robotics industry as soon as possible, to assure the public that every safeguard has been taken before a robot is ever released to market. Such a process will also clarify for industry exactly what they have to do.

This still leaves a debate open about how far those who own or operate robots should be allowed to protect them from e.g. theft or vandalism, say by built-in taser shocks. The group chose to delete a phrase that had ensured the right of manufacturers or owners to include "self defence" capability into a robot. In other words we do not think a robot should ever be “armed” to protect itself. This actually goes further than existing law, where the general question would be whether the owner of the appliance had committed a criminal act like assault without reasonable excuse.
4 Robots are manufactured artefacts. They should not be designed in a deceptive way to exploit vulnerable users; instead their machine nature should be transparent. Robots are manufactured artefacts: the illusion of emotions and intent should not be used to exploit vulnerable users. One of the great promises of robotics is that robot toys may give pleasure, comfort and even a form of companionship to people who are not able to care for pets, whether due to the rules of their homes, physical capacity, time or money. However, once a user becomes attached to such a toy, it would be possible for manufacturers to claim the robot has needs or desires that could unfairly cost the owners or their families more money. The legal version of this rule was designed to say that although it is permissible, and sometimes even desirable, for a robot to give the impression of real intelligence, anyone who owns or interacts with a robot should be able to find out what it really is, and perhaps what it was really manufactured to do. Robot intelligence is artificial, and we thought that the best way to protect consumers was to remind them of that by guaranteeing a way for them to "lift the curtain" (to use the metaphor from The Wizard of Oz).

This was the most difficult law to express clearly and we spent a great deal of time debating the phrasing used. Achieving it in practice will need still more thought. Should all robots have visible bar-codes or similar? Should the user or owner (e.g. a parent who buys a robot for a child) always be able to look up a database or register where the robot's functionality is specified? See also rule 5 below.
5 The person with legal responsibility for a robot should be attributed. It should be possible to find out who is responsible for any robot. In this rule we try to provide a practical framework for what all the rules above already implicitly depend on: a robot is never legally responsible for anything. It is a tool. If it malfunctions and causes damage, a human will be to blame. Finding out who the responsible person is may not, however, be easy. In the UK, a register of who is responsible for a car (the “registered keeper”) is held by the DVLA; by contrast, no one needs to register as the official owner of a dog or cat. We felt the first model was more appropriate for robots, as there will be an interest not just in stopping a robot whose actions are causing harm; people affected may also wish to seek financial compensation from the person responsible.

Responsibility might be practically addressed in a number of ways. For example, one way forward would be a licence and register (just as there is for cars) that records who is responsible for any robot. This might apply to all robots, or only where ownership is not obvious (e.g. for a robot that might roam outside a house or operate in a public institution such as a school or hospital). Alternatively, every robot could be released with a searchable online licence which records the name of the designer/manufacturer and the responsible human who acquired it (such a licence could also specify the details we talked about in rule 4 above). There is clearly more debate and consultation required.
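To make the searchable-licence idea concrete, here is a purely illustrative sketch in Python. The record fields and function names are invented for this example and are not part of any proposed standard; a real register would need far more (authentication, updates on transfer of ownership, data protection safeguards), but the core is just a lookup from a robot's identifier to its responsible parties:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class LicenceRecord:
    serial: str        # unique identifier stamped on the robot
    manufacturer: str  # who designed and built it
    keeper: str        # the responsible human who acquired it
    purpose: str       # declared functionality (cf. rule 4's transparency)


# A searchable register: serial number -> licence record
register: dict[str, LicenceRecord] = {}


def register_robot(record: LicenceRecord) -> None:
    """Add a robot to the register; a responsible keeper is mandatory."""
    if not record.keeper:
        raise ValueError("every robot needs a responsible keeper")
    register[record.serial] = record


def responsible_person(serial: str) -> str:
    """An aggrieved person's starting point: who is responsible for this robot?"""
    return register[serial].keeper
```

The design choice mirrors the DVLA analogy in the text: the register answers one question cheaply (who is the keeper?), leaving the allocation of shared or transferred liability to existing legal rules.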

Importantly, it should still remain possible for legal liability to be shared or transferred e.g. both designer and user might share fault where a robot malfunctions during use due to a mixture of design problems and user modifications. In such circumstances, legal rules already exist to allocate liability (although we might wish to clarify these, or require insurance). But a register would always allow an aggrieved person a place to start, by finding out who was, on first principles, responsible for the robot in question.


Submitted by Roxanna77 on Fri, 2017-03-24 20:31


Robot will get a job that makes enough $$$$$$$ to support me in the fashion I am accustomed to!


"Propeller Girl"
I tinker, therefore I am.....

Submitted by JeffRo on Sun, 2017-03-26 16:07


That would be great if it were possible. How about we put that one down as a potential for now? I do think that limiting it to just 5 rules is a joke, as there will need to be many more as robots reach a higher level of thought.

Submitted by jinx on Mon, 2017-05-01 05:56

The last ten years have been based on war bots advancing the field.

In law 2, "designed and operated to comply with..." Designed? Really, they are asking us to stop free thinking! Should it not read "built and operated"?


"The person with legal responsibility for a robot" won't be me!

At the level of public interaction they are considering, these machines will be a lease-only option. Take NAO: a 2ft robot at £9,000-20,000 that can't bring in the washing. Even a motorised granny basket becomes unaffordable with personal insurance, then servicing; all the blame is going to fall on them... unless you're stupid enough to throw a bot from a tower block and unfortunate enough to be under it.

It's baby dribble with whiskey :P They go on about safety but offer no advice, such as a robot "kill switch" standardised for public awareness. Sure, it would be abused by some spotty teenager, but you want a big red dome to press when the 5ft biped appears and one drops on a kid while still flapping its limbs around!!! But no, we won't mention that.

Guard bots: nothing wrong with guards carrying Gatling guns; if you're stupid enough to be in front of one without ID, God bless you... No mention of restricting RFID beacons on guards: either they're on site or they're offline, so no harm to the public from those wayward rogue travellers.

"In the UK, a register of who is responsible for a car (the “registered keeper”) is held by DVLA." The DVLA is a farce and expensive to run; who will police it? It's a terrible example; they'll be asking makers to register themselves next!


I should have guessed this group, lol. I almost did, with its dribble about the DVLA.

     "It's easy to overlook the work of people who seem determined to be extremist or irresponsible, but doing this could easily put us in the position that GM scientists are in now, where nothing they say in the press has any consequence. We need to engage with the public and take responsibility for our public image."

Extremist or irresponsible? HAHAHAHA, I think I know a few that could be viewed that way :P But they've done more for public awareness than those in attendance to dribble that. They're confusing bot builders with drone flyers!! When was the last time you saw a DIY mobile bot recklessly driving on a motorway??? Just some unnecessary vilifying of the hobby builder.


Great laws that encompass aspects of life's short journey should be as elegant as the theory of everything. Avoid more laws than necessary to reduce conflicts. Fewer laws, more subsections. :P

                                                                                                                                  jinx 2017








Submitted by DangerousThing on Tue, 2017-06-27 18:57

I'm sorry JeffRo, but I don't think it is *possible* to create a useful robot that will not break some law somewhere. For example, some privacy laws are different state-to-state in the US. Many other laws vary according to the town. Sometimes local laws conflict with federal laws.

And once robots become more visible, I think that certain places will make them illegal. And since laws change according to the whims of politicians and voters, it would be impossible for a robot designer, or even a lawyer, to predict what will be legal by the time the robot is released.

The world has too many laws right now. Maybe we should create a 5-law system for the world.

Last updated: 24 Mar 2017 - 09:50