The more robots become embedded in human society as toys and workers, the more people treat them like pets, friends or even extensions of themselves. For soldiers who rely more and more on battlefield robots, researchers wondered: If a soldier attributes human or animal-like characteristics to these machines, could they care too much about the robot to send it onto a dangerous battlefield?

The answer to that question could help inform future robot design and human-robot training.

Julie Carpenter, who recently completed her doctorate in education at the University of Washington, wanted to find out, so she interviewed 23 highly trained soldiers in Explosive Ordnance Disposal units. She wanted to know whether soldiers’ relationships with the small, tank-like robots could affect their decision-making and possibly compromise a mission.

Even though soldiers unanimously used words such as “tool” or “mechanical” to describe the robots in interviews and questionnaires, Carpenter found that many of them anthropomorphized the machines. The robots were assigned male or female pronouns, dubbed Sergeant So-and-So or named after celebrities. Because of the male-dominated environment, robots were often named after a girlfriend or wife. Soldiers posed with robots in photos, and a mock funeral was even performed for one robot that was blown up beyond repair.

“It was pretty consistent, the series of emotions that soldiers would go through when a robot was disabled beyond repair -- anger, frustration and sadness for a sense of loss,” Carpenter told Discovery News.

“The sense of loss was more complicated,” Carpenter said. “They had trouble verbalizing that emotion. And I might add, this is a very verbal group of people.”

Ultimately, Carpenter came away with the sense that a soldier's relationship with a robot did not compromise judgment on the battlefield. Soldiers, she said, were too keenly aware of the robots' capabilities and limitations to be deluded into thinking the machines were human; in her observation, they never quite crossed that line.

"These people are so highly trained that it's very true that is does not affect them," Carpenter said. "My concern is that future robots are going to be developed with different shapes, abilities and take on different roles."

[Photo caption: Humans are increasingly exposed to robots in their daily lives, and new research shows people feel the same empathy for the bots as they would for another person. Credit: Shutterstock]

But as robots become more humanlike, that could change, Carpenter said. “I think this is an issue people need to keep an eye on and monitor.”

Clifford Nass, a Stanford University professor who studies the social-psychological aspects of human interaction with technology, said that when technology fulfills a human role, the brain naturally tends to think of that technology as human.

“In high-intensity contexts, such as military and otherwise, the social responses actually increase because your brain doesn’t have as much ability to say ‘It’s only a robot,’” he said. “The more intense and complex a situation, ironically, the more likely people are to develop emotional and social attachments.”

Nass compared the dilemma to the strong attachments people form with the search-and-rescue dogs they work with. “As a result of that, they often become reluctant to use the dogs in those situations,” he said. “The same thing can happen with robots.”

Down the line, Nass suggested, aggressive policies dealing with attachment issues may need to be adopted for those who work closely with robots. “In the case of search-and-rescue dogs, you have to rotate the dogs you use so that you don’t become attached,” he said. “You could do the same thing with a robot.”

In the long run, as robots become more intelligent and autonomous, Carpenter envisions that some people will become concerned about the ethics of destroying a robot. For now, though, she is focused on more tangible fears.

“You don’t want to have a human hesitate to put a robot in a dangerous situation when you have to make critical, split-second decisions that affect human lives,” she said.