As technology continues to improve, humanlike robots will likely play an ever-increasing role in our lives: They may become tutors for children, caretakers for the elderly, office receptionists or even housemaids. Children will come of age with these androids, which naturally raises the question: What kind of relationships will kids build with personified robots?
Children will view humanoid robots as intelligent social and moral beings, allowing them to develop substantial and meaningful relationships with the machines, new research suggests.
Researchers analyzed the interactions between nearly 100 children and Robovie, a 3-foot-tall (0.9 meters) robot developed by the Advanced Telecommunications Research Institute in Japan. In the study, two technicians controlled Robovie remotely from another room, leading the children to believe that the robot was autonomous. The researchers imparted humanlike behavior to the robot, such as having Robovie claim unfair treatment when he was told to go into the closet at the end of the interaction sessions.
Follow-up interviews with the children showed that the kids believed Robovie had mental states, such as being intelligent and having feelings, and was a social entity capable of being a friend and confidante. Many of the children also believed that Robovie deserved fair treatment and should not be psychologically harmed.
“We typically think [of] robots as rational calculators rather than humanlike and emotional,” said Adam Waytz, a psychologist at Northwestern University in Illinois, who was not involved in the study. “But this research provides a nice example of how endowing a robot with emotions can lead children to treat the robot as a companion and to consider its moral standing.”
A mental, social and moral entity
A major goal in the field of human-robot interaction is to determine how people will behave socially with robots in the near future. Will we treat robots as tools to be used and tossed aside at will, or will we see them as moral entities deserving of fairness and rights?
To find out, Solace Shen, a psychology doctoral student at the University of Washington, and her colleagues recruited 90 children aged 9, 12 or 15 to interact with Robovie. The robot has some autonomous functions and speech recognition, but the researchers instead chose to control Robovie themselves.
“We tried to create a situation where people come in and interact with the robot in what would be a possible future scenario,” Shen told LiveScience.
The 15-minute interaction sessions had several stages designed to impart Robovie with seemingly human characteristics and behavior. For example, Robovie introduces himself to the children, shows them an aquarium and teaches them about the ocean, asks them to move a ball out of his way, plays “I Spy” and argues with a researcher, who is present for the entire session.
In the last leg of the session, a second researcher interrupts the “I Spy” game to tell Robovie that he is no longer needed and has to go into the closet. Robovie objects and says that he is scared of being in the closet, but the researcher puts him in there anyway.
Immediately following the staged interactions, the researchers interviewed each child for 50 minutes. The majority of the children thought that Robovie had mental states; for instance, 79 percent believed he was intelligent and 60 percent believed that he had feelings. On the social side of things, 84 percent of the children said they might like to spend time with Robovie if they were lonely and 77 percent believed that he could be their friend.
Fewer children attributed moral rights to Robovie: 54 percent of the children believed it was wrong to put Robovie in the closet (whereas 98 percent said it would be wrong to put a person in a closet), and 42 percent believed that Robovie should be paid if he teaches people about the ocean all day long.
A fanciful view
Overall, fewer 15-year-olds saw Robovie as a mental, social and moral being than did the 9- and 12-year-olds, who rated the robot's mental capacities about the same as each other. “But even though the 15-year-olds attribute less of these qualities, over half of them scored pretty high for Robovie as a mental, social, moral entity,” Shen said.
The older children may just have a less “fanciful” view of robots and see them as mechanical machines. Alternatively, their views may have something to do with adolescence, which is a “unique age group that comes with its own issues and struggles,” Shen explained. To really figure it all out, the researchers need to follow up with similar studies involving Robovie and adults.
“If we did [that] and we saw that this developmental trend continues, then it would give us more clear evidence that maybe the older you get, the more you lose this fanciful view of robots,” Shen said.
Whatever the case, the researchers think that the results have important implications for the design of future robots. If engineers design robots to simply obey orders, the master-servant relationship that children experience may trickle into their interactions with other humans. Is it then better to design robots with the ability to “push back” as Robovie did when he was instructed to go into the closet?
Shen said there is no easy answer to which design scheme is better.
“I don’t think children will treat robots as nonsocial beings; they will treat them as social actors and interact with them in social ways,” she said. “But we need more data and evidence to see how adults, as well as children, will develop relationships with these robots.”
The study was published in the March issue of the journal Developmental Psychology.