20120907

If Apple makes robots, will robots have rights?

By Patrick Thibodeau

Let’s face the reality that robots will gain cognitive skills. This is not self-awareness. But it is an ability to interact in ways that prompt human emotional attachment.

People do get emotionally attached to things. We all know this. But we have little idea how people will ultimately respond to machines that can converse, learn and demonstrate an interest in their lives.

Robotics has to be in Apple’s development thinking. It is a logical extension of the iPhone’s Siri capability, six or a dozen generations from today.

Imagine that Apple develops a walking, smiling and talking version of your iPhone. It has arms and legs. Its eye cameras recognize you. It will drive your car (and engage in Bullitt-like races with Google’s driverless car), do your grocery shopping, fix dinner and discuss the day’s news.

Apple will patent every little nuance the robot is capable of. We know this from its patent lawsuits. If the robot has eyebrows, Apple may file a patent claiming rights to “a robotic device that can raise an eyebrow as a method for expressing skepticism.”

But will Apple, or a proxy group acting on behalf of the robot industry, go further? Much further. Will it argue that these cognitive or social robots deserve rights of their own, not unlike the protections extended to pets?

Should there be, minimally, anti-cruelty laws that protect robots from turning up in YouTube videos being beaten? Imagine if it were your robot.

Kate Darling, a research specialist at the MIT Media Lab, looks at this broad issue in a recent paper, “Extending Legal Rights to Social Robots.” She writes, in part:

The Kantian philosophical argument for preventing cruelty to animals is that our actions towards non-humans reflect our morality — if we treat animals in inhumane ways, we become inhumane persons. This logically extends to the treatment of robotic companions. Granting them protection may encourage us and our children to behave in a way that we generally regard as morally correct, or at least in a way that makes our cohabitation more agreeable or efficient.

Darling’s interesting and thoughtful paper also discusses the risks and controversies likely to emerge from giving legal rights to robots:

Some argue that the development and dissemination of such technology encourages a society that no longer differentiates between real and fake, thereby potentially undermining values we may want to preserve. Another cost could be the danger of commercial or other exploitation of our emotional bonds to social robots.

If Apple or any company can make a robot that leaves the factory with rights, the marketing potential, as Darling notes, may be significant. But then, if corporations are people, why not give rights to their assembly-line babies? This is all weird, fascinating, discomforting and, for now, academic, but it is on its way.
