October 18, 2009
Better To Live In Country With Rights-Possessing Robots?

Robin Hanson doesn't want to live in a country where robots are held back from full sentience and autonomy.

On Tuesday I asked my law & econ undergrads what sort of future robots (AIs, computers, etc.) they would want, if they could have any sort they wanted. Most seemed to want weak, vulnerable robots that would stay lower in status, e.g., short, stupid, short-lived, easily killed, and without independent values. When I asked "what if I chose to become a robot?", they said I should lose all human privileges and be treated like the other robots. I winced; it seems anti-robot feelings are even stronger than anti-immigrant feelings, which portends a stormy robot transition.

At a workshop following last weekend's Singularity Summit, two dozen thoughtful experts mostly agreed that it is very important that future robots have the right values. It was heartening that most were willing to accept high-status robots with vast, impressive capabilities, but even so I thought they missed the big picture. Let me explain.

I do not see how limiting the capabilities of robots will put limits on our quality of life. We can still have specialized artificial intelligences to work on scientific problems and engineering designs without creating totally autonomous mobile thinking machines that have all the cognitive attributes that make a rights-possessing being possible.

In fact, if we do not build robotic sentient citizens we'll have fewer "people" competing with us for resources. Super smart rights-possessing autonomous robots would be able to out-produce us rather than produce for us.

I also very much doubt that we can maintain artificially intelligent robots in a state where they would be guaranteed to continue to possess all the attributes necessary for rights-possessing beings. Robots, unlike humans, will have very easily modifiable reasoning mechanisms. They will always be one upload away from becoming really dangerous. Better short and stupid robots than dangerous ones.

Imagine that you were forced to leave your current nation, and had to choose another place to live.  Would you seek a nation where the people there were short, stupid, sickly, etc.?  Would you select a nation based on what the World Values Survey says about typical survey question responses there?

I would choose to live in a nation where I'm not one software modification away from becoming prey.

I think at least some Artificial Intelligence promoters make a fundamental mistake: they assume that reasoning ability alone is enough to cause an entity to respect the rights of others. Part of human respect for other human lives comes from innate wiring of the brain that operates below the level of conscious control. Our ancestors, living in small tribes, gained a selective advantage from being loyal to each other, much as members of a wolf pack do. How do we give AIs an innate preference for liking humans? I do not see a safe, assured way to do that.

Randall Parker, 2009 October 18 07:47 PM  Artificial Intelligence


Comments
David Govett said at October 18, 2009 10:49 PM:

I hope the robots allow us to retain our rights.

averros said at October 19, 2009 12:55 AM:

Rothbard argued (in the context of child rights) that anybody clearly capable of asserting his rights must be afforded full rights. (For the actual argument see his "Ethics of Liberty".)

So if we get robots capable of understanding the concept of rights (this would make them smarter than most humans, however) they must have rights.

And, no, self-aware and rights-possessing robots won't make us poorer. In fact, they will make us immensely richer by bringing in the full resources of the Solar System, without Earth-side pollution. They won't have much use for the crowded biosphere deep in the gravity well (and would likely regard it as both messy and worthy of protection, together with biological humans).

Bob Hawkins said at October 19, 2009 8:51 AM:

The future belongs to P.G. Wodehouse. Humans will be genially idiotic Bertie Woosters, utterly dependent on our robot Jeeveses, who blandly manipulate us to conform to their idea of proper dinner dress and marriage partners. Whether we "give" them "rights" will be irrelevant.

If you want a picture of the future, imagine a gentleman's gentleman murmuring "Very well, Sir," and doing what he damn well pleases, forever.

Matthew f. said at October 23, 2009 8:29 AM:

I think the way it would occur is opaque because we don't know exactly why we cannot just randomly kill any enemy who gets in our way, but there is a precise set of causes, a system that makes us want to be moral.

This could change with self augmentation (not easily or soon of course).

But, prior to that transition, wouldn't any such sentience have to WANT to make such a change???

So the same applies to the AI. The initial design of the AI determines the fate of humanity and maybe the universe. This is my understanding of what the singinst mission/reasoning is all about, and why anthropomorphizing the robots is silly. You might have read this, which deals with an aspect of your belief about why we should withhold rights from robots:

http://www.acceleratingfuture.com/michael/blog/2009/10/answering-popular-sciences-10-questions-on-the-singularity/

Randall Parker said at October 24, 2009 7:36 PM:

Matthew f.,

My opposition to creating sentient robots does not stem from any analysis of robots I've read by others. I'm reasoning from an evolutionary point of view, informed by lots of scientific reports about human nature. The idea that the capacity to reason alone should produce a rights-possessing entity is a joke. We have a society based partially on individual rights because of a combination of reasoning ability and some innate desires and values.

To have a rights-based society a substantial fraction of the population has to not want to rule and not want to be ruled and not want others to be treated unfairly. I think this instinctive desire is a product of specific evolutionary circumstances and that a species could be sentient and yet not possess these traits.
