
Why Scientists Want Robots to Learn to Feel Pain

By Karen Turner, The Washington Post | Updated: 28 May 2016 18:13 IST
Highlights
  • Researchers programmed their robot to experience a hierarchy of pain.
  • The robot reacted to different stimuli and retracted from danger.
  • A separate study was the first to observe human empathy for robots in pain.
Robots are one step closer to being able to experience an essential human feeling: pain.

Researchers in Germany are currently creating a "nervous system" that would mimic a pain response in robots, allowing them to quickly react and avoid harmful situations.

"Pain is a system that protects us," researcher Johannes Kuehn told a conference of engineers last week. "When we evade from the source of pain, it helps us not get hurt."

The researchers programmed their robot to experience a "hierarchy" of pain through a variety of stimuli, such as blunt force or heat. Depending on the threat, such as a harsh movement or intense heat, the robot is programmed to retract from the danger. The more dangerous it registers the threat to be, the faster the robot will retract and the longer it will avoid the hazardous force.

"A robot needs to be able to detect and classify unforeseen physical states and disturbances, rate the potential damage they may cause to it, and initiate appropriate countermeasures, i.e., reflexes," the research paper states.


Kuehn said a built-in pain response could protect robots operating heavy machinery or other tools in factories from harm, saving companies from the fallout of damages. It also means a safer environment for human workers, who often work side by side with robots on the factory floor.

It's the synthesis of a pain sensation that gives robots a sense of self-preservation. Robots built to automatically detect human collisions have been around for a while: in 2011, researchers from Stanford and Sapienza University of Rome created a reflexive robot arm that detects and avoids collisions with humans. But equipping these robots with a nervous system forces them to prioritize avoiding their own pain, programming them to avoid destroying themselves as well as to avoid colliding with humans, according to Kuehn. This will trigger different reactions in the robot than crash avoidance alone.


The concept of robots that can feel physical sensation is not new. Sensitive "robot skin" was developed by researchers at Georgia Tech in 2014. The skin makes use of flexible touch sensors that communicate with a memory device that can store tactile interactions, mimicking human sensory memory. It allows the robot to adjust the pressure of its touch based on the object it comes into contact with, letting it grip soft objects, such as fruit, without destroying them. This touch-sensitive technology will allow robotic applications to move beyond the hard machinery of the factory floor and into other spheres, such as assisting the disabled with daily household tasks.

But what about the ethics of empowering robots with a sense of touch, and on the other end of the spectrum, a sense of pain? After all, research shows that humans actually do feel bad when robots get hurt.


A study in the journal Scientific Reports became the first research to observe a sense of human empathy for robots experiencing pain. Subjects hooked up to electroencephalography (EEG) devices to measure their electrical brain activity were shown a series of images of violence against both humans and robots. The study found that subjects did register a sense of emotional concern for the robots that were subjected to pain, albeit to a much lesser degree than for their human counterparts. The reasons for this phenomenon are unknown, though some have speculated that it has to do with exposure to human-like robots in popular culture.

In the journey to equip robots with a pain response, researchers still have a long way to go. But the recent paper is an important first step.

© 2016 The Washington Post
