
Robots can now learn how objects feel by picking them up

Researchers at MIT, Amazon Robotics, and the University of British Columbia have proposed a new way for robots to figure out an object’s physical properties by picking it up and moving it. Using only their built-in sensors, these robots can determine an object’s weight, softness, or even its contents, all without cameras or special tools.

Taking cues from human perception

Think about how you might pick up a box and shake it to guess what’s inside. Robots can now do something similar. They use what scientists call “proprioception”: the ability to sense their own movement and position.

“Humans don’t measure the joint angles in our fingers or the precise amount of torque we apply to an object very accurately, but a robot does. We take advantage of these capabilities.”

Unlike other methods that require external cameras or tools, this approach uses sensors already built into a robot’s joints. These sensors track how the joints move and rotate while the robot handles an object.
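As a rough illustration (hypothetical names, not the paper’s code or any real robot SDK), the proprioceptive signal such joints provide is just a time series of positions, velocities, and torques logged during one interaction:

```python
# Hypothetical sketch of a proprioceptive trace: the joint-level signals
# a robot logs while lifting an object. Names are illustrative only.

from dataclasses import dataclass

@dataclass
class JointSample:
    time: float      # seconds since the interaction began
    angle: float     # joint position from the built-in encoder (rad)
    velocity: float  # joint angular velocity (rad/s)
    torque: float    # torque applied by the joint motor (N*m)

def summarize(trace: list[JointSample]) -> dict:
    """Quick summary of one lift; the actual method feeds the raw
    trajectory into a simulation, not summary statistics."""
    return {
        "duration_s": trace[-1].time - trace[0].time,
        "peak_torque_nm": max(abs(s.torque) for s in trace),
    }

# A short synthetic trace from one lift.
trace = [JointSample(0.01 * i, 0.1 * i, 0.1, 1.5) for i in range(50)]
print(summarize(trace))
```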

https://www.youtube.com/watch?v=prexg_n3nsy

How it works: Digital twins make it possible

The key to making this work is a computer simulation that models both the robot and the object it is handling. The system observes what happens during the real interaction, then adjusts the object’s properties in the simulation until the simulated motion matches the real one.

Researchers call this “differentiable simulation”: it lets the computer calculate how small changes in an object’s properties, such as its mass or softness, affect the robot’s motion.
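To make the idea concrete, here is a toy one-dimensional sketch (my own illustration, not the authors’ system): an object of unknown mass is pushed with a known force, the “observed” motion plays the role of the robot’s sensor readings, and gradient descent adjusts the simulated mass until the two trajectories match. A finite difference stands in for the analytic gradients a true differentiable simulator would provide.

```python
# Toy parameter estimation in the spirit of differentiable simulation.
# Not the authors’ code: a point mass is pushed with a known force
# (standing in for measured joint torque), and its unknown mass is
# recovered by matching a simulated trajectory to an observed one.

import numpy as np

DT, STEPS = 0.01, 100   # integration step (s) and trajectory length
FORCE = 2.0             # known applied force (N)

def simulate(mass: float) -> np.ndarray:
    """Integrate x'' = F/m with semi-implicit Euler; return positions."""
    x, v = 0.0, 0.0
    trace = np.empty(STEPS)
    for t in range(STEPS):
        v += (FORCE / mass) * DT
        x += v * DT
        trace[t] = x
    return trace

# "Observed" motion, as if read from the robot's built-in joint sensors.
observed = simulate(mass=0.75)

# Gradient descent on the unknown mass; a real differentiable simulator
# supplies analytic gradients, so a finite difference stands in here.
mass, lr, eps = 2.0, 0.2, 1e-4
for _ in range(300):
    loss = np.mean((simulate(mass) - observed) ** 2)
    loss_eps = np.mean((simulate(mass + eps) - observed) ** 2)
    mass -= lr * (loss_eps - loss) / eps

print(f"recovered mass = {mass:.3f} kg (true value 0.750 kg)")
```

The real system does the same kind of matching with a full digital twin of the robot and object, differentiating through the physics itself rather than using finite differences.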

“Having real-world digital twins is really important for the success of our approach,” said Peter Yichen Chen, lead author of the research paper.

Real-world use beyond the laboratory

This technology works well in places where cameras cannot, such as dark rooms or disaster areas filled with dust and smoke. It is also cheaper, because it relies on sensors already built into most robots.

In their tests, the researchers showed that a robot could figure out:

  • How much different objects weigh (with an error of only about 0.002 kg)
  • How soft various materials are
  • What is inside a closed container

Being able to “feel” objects without seeing them helps robots work better in messy, real-world environments. “The idea is general, and I believe we’re just scratching the surface of what a robot can learn this way,” Chen said. “My dream would be to have robots go out into the world, touch things and move things in their environments, and figure out the properties of everything they interact with on their own.”

What’s next?

The research team now hopes to combine this method with computer vision for even better results. They also plan to test it with more complex robots and more challenging materials, such as sloshing liquids or sand.

“This work is important because it shows that a robot can accurately infer mass and softness using only its internal joint sensors, without relying on external cameras or specialized measurement tools,” said Miles Macklin of NVIDIA, who was not part of the research team.

As robots become more common in our homes, workplaces, and challenging environments, this ability to “feel” what they handle marks an important step toward machines that can interact with the world the way we do.

