Humans feel empathy for robots, showing similar reactions when watching humans and machines being treated with affection or violence.
Researchers at the University of Duisburg-Essen in Germany used MRI scans to measure people's reactions to robots being treated kindly or abused.
First, 40 participants watched videos of a small dinosaur-shaped robot being treated either with affection or with violence. The team found participants experienced more negative emotions when watching the robot being abused.
The researchers then compared the brain activity evoked by human-robot interaction with that evoked by human-human interaction.
Videos showed a human, a robot and an inanimate object either being abused or treated affectionately.
Findings showed that affectionate interaction towards the human and robot resulted in similar brain activity, suggesting they evoke similar emotional reactions.
The videos showing abusive behaviour garnered a stronger reaction for the human than the robot.

"One goal of current robotics research is to develop robotic companions that establish a long-term relationship with a human user, because robot companions can be useful and beneficial tools," said researcher Astrid Rosenthal-von der Pütten.
"They could assist elderly people in daily tasks and enable them to live longer autonomously in their homes, help disabled people in their environments, or keep patients engaged during the rehabilitation process."
However, while researchers look to further develop robotic technology to aid people, a campaign group is seeking to ban 'killer robots' that can attack targets without human intervention.
The Campaign to Stop Killer Robots has been launched in London to ban 'fully autonomous weapons' that are able to choose and fire at targets without being operated by humans.
It follows similar successful campaigns in the 1990s that led to bans on anti-personnel landmines and blinding lasers.
While the UK government says it has no plans to develop such weapons, campaigners say it is only a matter of time before killer robots are created.
They say they are concerned about the increased use of unmanned drones by the US and other countries, and want pre-emptive international laws and treaties put in place to restrict the use of such weapons.
'I don't want a machine to kill me'
In November, the US Department of Defense issued a directive requiring a human to be involved in decisions regarding the use of lethal force. However, Human Rights Watch said it contained "significant loopholes" that could be waived by high-level department officials.
Campaign leader Jody Williams, who won the Nobel Peace Prize in 1997 for working to ban anti-personnel landmines, told Sky News: "We're worried about machines that can be programmed, you put in a certain set of criteria, but then you set that machine free and it goes off whether it is in the air or on land or in the sea.
"I don't want a programmed machine to kill me."
However, robotics professor Noel Sharkey said the campaign group's use of the term 'killer robots' is far-fetched: "In robotics we came up with the term 'autonomy' to simply mean programmed and operating with sensors. Rather than calling them killer robots, it's a weapons system that can select a target and engage without further human involvement."