We Robot Conference: How Should the Law Think About Robots?

April 21st, 2012

A Few Words from Our Sponsor
Laurie Silvers ’77, Member U. Miami Board of Trustees,  President of Hollywood Media Corp.

  • She co-founded the Sci-Fi Channel and talked with Isaac Asimov about starting the network.

Introductory Remarks
A. Michael Froomkin, Program Chair

  • The logo adapts the Creation of Adam, but replaces the hand of God with the hand of a Robot (to some objections on religious grounds)
  • This conference sets the right standards, early.
  • We are in really early days, standards have not yet been set out. Robots/2012 = Internet/1968?


How Should the Law Think About Robots?
Neil Richards & William Smart
Discussant: Annemarie Bridy


  • The history of robots:
    • The word "robot" was coined in the Czech play R.U.R. (Rossum’s Universal Robots), from the Czech "robota," meaning serf labor
    • The source of agency is also the source of anxiety
    • Asimov’s Three Laws of Robotics
    • Is Roboethics the ethics of robots or the ethics of robot scientists?
  • The robots are coming!
  • 4 basic claims (Richards & Smart)
    • Need to think carefully how we define robots
      • Non-biological (apparently) autonomous agent; physical and mental agency, but not alive in biological sense
      • Not always true agency, sometimes only appearance of agency
      • Roboagency continuum: at one end, controlled entirely by humans; at the other end, full autonomy (machine initiative)
    • Need to understand what robots can do–in the world and in the lab
    • Use cyberlaw to understand
      • Uncertainty gives rise to a regulatory challenge: how do we regulate technology we don’t fully understand?
      • What can robolaw learn from cyberlaw?
      • Find where analogies/metaphors work, and don’t work
      • Olmstead/Katz
      • Choice of metaphors matters
      • Mainstream Loudoun v. Board of Trustees of the Loudoun County Library (E.D. Va. 1998): the Internet is less like interlibrary loan than like a set of encyclopedias
      • Stratton Oakmont v. Prodigy (N.Y. Sup. Ct. 1995): is Prodigy’s BBS more like a bookstore or a newspaper? Led to § 230 of the Communications Decency Act
    • Avoid the Android Fallacy: the projection of human will onto robots
      • Robots are just tools
      • Legislate based on function, not form.
      • How will we know when robots stop being “just tools” (move from robot remote control to robot self control)? What should law do then?


  • Olmstead is/is not a cyberlaw case: use an example far enough in the past that we know what the right answer is.
  • Communications Decency Act: a poorly thought-out piece of legislation. Not entirely clear whether § 230 was a good idea; it enabled Web 2.0, for good and bad (the death of public discourse through more discourse).
  • The problem is that lawyers look to other things to understand new things. Robots are entirely new.
  • We don’t know path robots will take
  • The android fallacy is dangerous: the way we visualize robots matters. They are not all like C-3PO.


  • When will we know robots are more than just tools? They are always just tools: sophisticated tools, but still toasters. Be wary of thinking about them as anything other than a really fancy hammer.


  • Robolaw is law about people. There will be liability rules for robots when they malfunction or get hacked. Issues about human values are about people.
  • Ian Kerr: is the framing of the android fallacy itself fallacious? It presumes that "tool" is not itself a metaphor. Consider the relationship between humans and their tools: it is important from the start to remember that we have social relationships with our tools. We already project onto these entities all sorts of values and relationships that make the "just tools" argument difficult. The android fallacy forecloses an entire area of discussion. We will reach a point where we need to start thinking about robots in different ways.
  • Richards: be careful about projecting onto robots (romantic attachments to the iPhone); there is a tendency to anthropomorphize robots. Robots that appear human will happen.
  • Smart: "robots *who* do things" v. "robots *that* do things"
  • Calo: No basis in the paper for leaving out non-embodied autonomous agents; surprised you used the metaphor of metaphors rather than the distinction between open and closed systems (Tim Wu/Jonathan Zittrain)
  • Elizabeth Grossman (Microsoft): Don’t use the term "killer app" in the same context as killer robots. lol.


Liquid Robotics
Suneil Thomas

  • Protect human jobs from robots; synergistic robots; romantic notions of the "sailor": labor issues and union/protectionist issues on the horizon
  • Interesting questions relating to human labor and robots