Sunday, December 13, 2015


232. Will robots have voice?

What will happen when robots take on more and more tasks, with increasing intelligence? What if a robot is opinionated, with views that go against the established order, or against the will of its maker or owner?
 
Presently, in democracies, an intellectual, a scientist, or a worker on a shop floor with contrary views cannot easily be silenced. But a robot may simply be switched off, or reprogrammed to conform.

What will this do to people if, with regard to robots, they no longer need to defend their views, and can bend the views of robots to their own? Would people then prefer to consort with robots rather than with people, for the ease and comfort of it? Would that make them more self-involved, narcissistic even, turning robots into mirrors?

In this blog I have argued that one needs opposition from the other to detect one’s own myopia, and to nourish a flourishing life. This, I argued, is needed to achieve the highest form of freedom, which includes freedom from the bias of the self.

If robots are self-learning, adapting their intelligence to what is successful more rigorously and perfectly than humans do, will this be a source of contrariness, of defiance? People have a variety of sources of experience, in jobs, families, friendships, sports, travel and chance encounters, to feed their cognition and morality. Will robots have access to such a diversity of experience? Will the owner of a robot, having invested in it, be willing to grant it unproductive time, spent on a range of private activities?

Next to his notions of ‘exit’ and ‘voice’, Albert Hirschman recognized the possibility of ‘loyalty’: acceptance of, and surrender to, a faulty relationship.

Robots may undergo forced exit, being switched off, or may be programmed for loyalty. Will they be allowed to raise their voice, or even be programmed for it? Or will they ever be self-generative enough to seize voice for themselves, or even to impose loyalty on people? How moral will they be, and how would they acquire morality? I discussed that in item 179 of this blog.
