The researchers first evaluated the approach on a text-based benchmark dataset: they input a person's example preferences and asked the LLM to create personalized rules determining where objects belong. The LLM summarized the examples into general rules and then used that summary to decide where to place new objects. The benchmark scenarios were defined in four rooms, with 24 scenarios per room; each scenario contains two to five places to put objects and equal numbers of seen and unseen objects for the model to sort. On this benchmark, the approach achieved 91.2 percent accuracy on unseen objects.

They then applied the approach to a real-world robot, TidyBot, testing it on eight real-world scenarios, each with its own set of ten objects, and running the robot three times in each scenario. TidyBot successfully put away 85 percent of the objects.

The website for the researchers’ paper shows the robot sorting laundry into lights and darks, recycling drink cans, throwing away trash, putting away bags and utensils, returning scattered objects to where they belong, and placing toys into a drawer.

“Unlike classical approaches that require costly data collection and model training, we show that LLMs can be directly used off-the-shelf to achieve generalization in robotics, leveraging the powerful summarization capabilities they have learned from vast amounts of text data,” they added. “LLMs demonstrate astonishing abilities to perform generalization through summarization, drawing upon complex object properties and relationships learned from massive text datasets.”

“The underlying insight is that the summarization capabilities of LLMs are a good match for the generalization requirements of personalized robotics,” the authors wrote.
The researchers wrote in the paper that they first asked a person to provide a few example object placements such as “yellow shirts go in the drawer, dark purple shirts go in the closet, white socks go in the drawer,” and then asked the LLM to summarize the examples to create generalized preferences for the person.
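The two-step process described above, turning a handful of example placements into general rules and then applying those rules to unseen objects, can be sketched as prompt construction. This is a minimal illustration under stated assumptions: the function names and prompt wording below are our own, not the exact prompts from the paper.

```python
# Illustrative sketch of TidyBot-style preference summarization.
# Prompt wording and function names are assumptions for illustration;
# the actual LLM call is left out, since the paper's prompts differ.

def build_summary_prompt(placements):
    """Ask an LLM to compress example placements into general rules.

    placements: list of (object, receptacle) pairs, e.g.
    [("yellow shirts", "drawer"), ("dark purple shirts", "closet")].
    """
    lines = [f"{obj} -> {recep}" for obj, recep in placements]
    return (
        "Summarize the following object placements into general rules:\n"
        + "\n".join(lines)
        + "\nRules:"
    )

def build_placement_prompt(rules, new_object, receptacles):
    """Ask an LLM to place an unseen object using the summarized rules."""
    return (
        f"Rules: {rules}\n"
        f"Receptacles: {', '.join(receptacles)}\n"
        f"Where does '{new_object}' go?"
    )
```

In use, the first prompt's completion (the generalized rules) would be fed into the second prompt for each new object the robot encounters, so the person's few examples generalize without any model training.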