Kids in mall: Run, robot, run (for your life)
When is human-likeness a good thing, and when is it too much of a good thing? An interesting thought: if a child delights in pulling the hair off her dolls’ heads, disturbing the cat and jabbing her sister to tears, what will she do to the little space robot whose eyes were carefully designed in some studio to melt hearts or, as advertisers say, “engage”?
The thought has prompted an investigation by researchers, who examined how children behave toward machines designed with human attributes.
To be sure, Japan is one of many places where the question is worth posing, as the country eyes a future in which nonmilitary robots serve as assistive-technology aides and guides in schools, hospitals, museums and shops.
“Why Do Children Abuse Robots?” is the title of a paper authored by seven researchers in Japan, representing Ryukoku University, Tokai University and a robotics lab in Kyoto.
The researchers carried out the study in a shopping mall in Japan, using a human-sized humanoid robot, and observed a gamut of troublesome behavior by the children: persistently obstructing the robot’s locomotion even after it asked them to stop, using abusive language, kicking, punching and beating the robot, folding its arms, and bending the joints of its arms and head. The team afterwards conducted semi-structured interviews with the children, based on a protocol developed in advance.
A total of twenty-eight children were interviewed. Why did they behave as they did with this robot? Some said they were curious; others said they enjoyed it. Overall, the reported reasons broke down as curiosity (22%), enjoyment (35%), and being triggered by others (17%). One child said he explicitly intended to threaten the robot.
The authors, in the discussion section of their paper, raised interesting points for those who wish to pursue studies on children interacting with robots.
The researchers reported: “we found that about half of the children believed the capability of the robot of perceiving their abusive behavior. It suggests that these children lacked empathy for the robot (i.e., they know, but did not empathize).”
They speculated that “although one might consider that human-likeness might help moderating the abuse, human-likeness is probably not that powerful way to moderate robot abuse. Instead, one possibility is to explore the way to elicit children’s empathy for robots.”
More data are needed, they said, to explore children’s abuse of robots. “In the case that children abuse a robot regarding it as an entity closer to a machine than a human, we face a question: whether the increase of human-likeness in a robot simply leads to the increase of children’s empathy for it, or favors its abuse from children with a lack of empathy for it.”