Robots Taught To Tidy Homes By Watching Humans In Action

The robot in action. Robots could soon be tidying up our messy bedrooms, according to new research. PHOTO BY DR. DEEPAK PATHAK/YOUTUBE


By Mark Waghorn

Robots could soon be tidying up our messy bedrooms, according to new research.

Scientists have taught them how to carry out household chores – by showing them video footage of humans doing the same.

The breakthrough is a major step toward AI assisting us with cooking and cleaning.

Two machines successfully learned 12 tasks, including opening a drawer, an oven door and a lid.

They also took a pot off the stove, picked up a vegetable and a can of soup, and answered the telephone.

Project leader Dr. Deepak Pathak, of Carnegie Mellon University, Pittsburgh, Pa., said: “The robot can learn where and how humans interact with different objects through watching videos.

“From this knowledge, we can train a model that enables two robots to complete similar tasks in varied environments.”

Robot ‘Kasia’ delivers orders from an Indian restaurant in Rzeszow, Subcarpathian Voivodeship, Poland, on Monday, January 2, 2023. PHOTO BY ARTUR WIDAK/GETTY IMAGES

Current methods of training robots require either the manual demonstration of tasks by humans or extensive training in a simulated environment.

Both are time-consuming and prone to failure. The U.S. team previously demonstrated a novel method, WHIRL, short for In-the-Wild Human Imitating Robot Learning, in which robots learn by observing humans complete tasks.

But WHIRL requires the human to complete the task in the same environment as the robot.

The new model, Vision-Robotics Bridge (VRB), builds on and improves WHIRL, eliminating the need for human demonstrations and for the robot to operate in an identical environment.

As with WHIRL, the robot still requires practice to master a task. Experiments showed it can learn a new task in as little as 25 minutes.

Shikhar Bahl, a Ph.D. student in robotics, said: “We were able to take robots around campus and do all sorts of tasks.

“Robots can use this model to curiously explore the world around them. Instead of just flailing its arms, a robot can be more direct with how it interacts.”

To teach the robot how to interact with an object, the team applied the concept of affordances. Affordances have their roots in psychology and refer to what an environment offers an individual.

The concept has been extended to design and human-computer interaction to refer to potential actions perceived by an individual.

VRB decides how to interact with an object based on observed human behavior. For example, as a robot watches a human open a drawer, it identifies the contact point (the handle) and the direction of the drawer’s movement (straight out from the starting location).

After watching several videos of humans opening drawers, the robot can determine how to open any drawer.
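
That drawer example can be captured in a few lines of code. The sketch below is a hypothetical illustration of the affordance idea, not the team’s implementation: it assumes each human video yields an estimated 3-D contact point and a post-contact motion direction, averages those estimates across videos, and turns the result into a simple ‘grasp here, pull this way’ command.

# Hypothetical sketch of the affordance representation described above (not the authors' code).
# Each human video is assumed to yield a contact point (where the hand touched the object)
# and a post-contact direction (how the object then moved).
from dataclasses import dataclass

import numpy as np


@dataclass
class Affordance:
    contact_point: np.ndarray  # 3-D point on the object, e.g. the drawer handle
    direction: np.ndarray      # vector of post-contact motion, e.g. straight out


def aggregate(observations: list) -> Affordance:
    """Combine per-video estimates into a single affordance the robot can act on."""
    point = np.mean([o.contact_point for o in observations], axis=0)
    direction = np.sum([o.direction for o in observations], axis=0)
    return Affordance(point, direction / np.linalg.norm(direction))


def drawer_command(aff: Affordance, pull_distance: float = 0.3) -> dict:
    """Turn the affordance into a simple 'grasp here, pull this way' command."""
    return {"grasp_at": aff.contact_point,
            "move_to": aff.contact_point + pull_distance * aff.direction}


if __name__ == "__main__":
    # Made-up estimates from three videos of humans opening drawers.
    videos = [
        Affordance(np.array([0.52, 0.10, 0.80]), np.array([0.00, -1.00, 0.00])),
        Affordance(np.array([0.50, 0.11, 0.79]), np.array([0.02, -0.98, 0.01])),
        Affordance(np.array([0.51, 0.09, 0.81]), np.array([0.05, -1.00, 0.00])),
    ]
    cmd = drawer_command(aggregate(videos))
    print("grasp at", cmd["grasp_at"], "then pull to", cmd["move_to"])

In the research itself, the contact points and motion directions come from a model trained on the human videos, and the robot then refines the resulting action through its own practice.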

The team used videos from large datasets such as Ego4D and Epic Kitchens. Ego4D has nearly 4,000 hours of egocentric videos of daily activities from across the world.

Epic Kitchens features similar videos capturing cooking, cleaning and other kitchen tasks. Both datasets are intended to help train computer vision models.

Bahl said: “We are using these datasets in a new and different way. This work could enable robots to learn from the vast amount of internet and YouTube videos available.”

VRB is being unveiled at the Conference on Computer Vision and Pattern Recognition (CVPR) in Vancouver.

Produced in association with SWNS Talker

Edited by Saba Fatima and Newsdesk Manager


