
Nov 09, 2021

Social robots can deceive

With the growing popularity of social robots, the ethical risks they pose have attracted increasing attention. One of the most important concerns is the ethical issue of "deception." A social robot, as its name implies, is an autonomous robot that can communicate and interact with human beings, and "anthropomorphism" is one of its defining attributes.

"Social robots are able to communicate and interact with us in a human way, to understand us and even to have relationships with us," says Breazeal of MIT's Media Lab. "It is a robot with humanoid social intelligence. We interact with it as if it were a person, or even a friend." Duffy likewise argues that a robot's ability to engage in meaningful social interaction with people itself requires a degree of anthropomorphism, or humanlike quality, whether in form, in behavior, or in both.



In fact, the social robots currently popular on the market not only have a humanlike appearance but also simulate human emotions through speech, facial expressions, and body language. It is foreseeable that, as artificial intelligence technology develops, the anthropomorphic attributes of social robots will become ever more prominent. Indeed, Duffy believes that the ultimate goal of many roboticists today is to create a fully anthropomorphic synthetic human. Yet while humans pride themselves on their efforts to "fully anthropomorphize" robots, the ethical risks associated with anthropomorphism should not be underestimated.


Zavesca, from University College Dublin's intelligence lab, puts the problem plainly: "at the heart of anthropomorphism is illusion, so one of the key ethical issues with social robots is deception." Amanda Sharkey and Noel Sharkey, of the University of Sheffield, UK, agree, arguing that designing robots to encourage anthropomorphic attributions could be seen as a form of unethical deception.


Why does the anthropomorphic nature of social robots raise ethical questions about deception? The answer begins with an understanding of "deception" itself. The Stanford Encyclopedia of Philosophy defines deception as "making people believe something that is false." For the deceived, then, it is the falseness of the object or its attributes that lies at the root of deception.


Sparrow highlights the "falseness" of robots by comparing them with living things, arguing that "robots have no feelings or experiences." Coeckelbergh, of the University of Vienna, likewise summarizes the "unreal" nature of emotional robots in three respects: first, emotional robots attempt to deceive with their "emotions"; second, the robots' emotions themselves are unreal; third, emotional robots pretend to be entities that they are not. Thus, no matter how anthropomorphic a robot is, it remains fake and unreal compared with a human being, which fits the definition of "something false" in deception. But falseness alone is not enough to elevate "deception" to the level of a moral and ethical issue.


Carson argues that deception requires some kind of intent to cause others to hold false beliefs; the salient moral feature of deception is that it involves deliberately inducing false beliefs in others. On this view, social robots would have to "deliberately" conceal the falsity of their "emotions," "language," "components," and so on for the ethical problem of deception to arise. A social robot built from integrated circuits and other components has no "intention" of its own, but its anthropomorphic attributes do embody an "intention": to induce people to believe that it is more humanlike than it is. Zavesca and others argue that robots are essentially machines that "exploit human anthropomorphic tendencies to create the illusion of life rather than becoming alive." Exploiting the human tendency to anthropomorphize is thus a kind of "intention," and what it points to is the intention of "creating the illusion of life." Given both the definition of deception and this "intention to deceive" embodied in social robots, there is indeed an ethical problem of "deception" with social robots.


About Manly Battery


A leading battery manufacturer that has offered rechargeable lithium batteries for over 12 years, widely used in robotics. If you have a project to evaluate, please feel free to send an email to info@manlybatteries.com.
