Robots learn how to arrange objects by ‘hallucinating’ humans into their environment (w/ video)

(Phys.org) — A team of robotics engineers in the Personal Robotics Lab at Cornell University, led by Ashutosh Saxena, has developed a context-sensitive way for robots to organize a room. Instead of providing the robot with a map of where objects belong, the researchers have it "imagine" how a human being would use the objects in the room and then place them accordingly.
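
To make the idea concrete, below is a minimal sketch of this "hallucination" approach. It assumes a simple 2D room, a fixed comfortable-reach distance, and a distance-based usability score; the actual Cornell system learns richer human-object relationships from data, so every name, number, and function below is an illustrative assumption, not the team's method.

```python
# Hypothetical sketch: rather than memorizing a fixed map, the robot samples
# ("hallucinates") plausible human poses in the room and scores candidate
# object placements by how usable they would be from those poses.
import math
import random

ROOM_W, ROOM_D = 4.0, 3.0     # room footprint in meters (assumed)
COMFORTABLE_REACH = 0.6       # assumed comfortable reach of a human, in meters


def hallucinate_human_poses(n=50, seed=0):
    """Sample plausible (x, y) spots where a human might sit or stand."""
    rng = random.Random(seed)
    return [(rng.uniform(0, ROOM_W), rng.uniform(0, ROOM_D)) for _ in range(n)]


def usability_score(placement, poses):
    """Score a placement by the distance to the nearest hallucinated human:
    the closer that distance is to a comfortable reach, the better."""
    nearest = min(math.dist(placement, p) for p in poses)
    return -abs(nearest - COMFORTABLE_REACH)


def best_placement(candidates, poses):
    """Pick the candidate spot a hallucinated human could use most easily."""
    return max(candidates, key=lambda c: usability_score(c, poses))


if __name__ == "__main__":
    poses = hallucinate_human_poses()
    candidates = [(0.2, 0.2), (2.0, 1.5), (3.8, 2.8)]  # e.g. shelf spots for a mug
    print("Place the object at:", best_placement(candidates, poses))
```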

See the article here:
Robots learn how to arrange objects by ‘hallucinating’ humans into their environment (w/ video)

