GenSim: Generating Robotic Simulation Tasks via Large Language Models



By Lirui Wang and 8 other authors

Abstract: Collecting large amounts of real-world interaction data to train general robotic policies is often prohibitively expensive, thus motivating the use of simulation data. However, existing methods for data generation have generally focused on scene-level diversity (e.g., object instances and poses) rather than task-level diversity, due to the human effort required to come up with and verify novel tasks. This has made it challenging for policies trained on simulation data to demonstrate significant task-level generalization. In this paper, we propose to automatically generate rich simulation environments and expert demonstrations by exploiting a large language model's (LLM) grounding and coding ability. Our approach, dubbed GenSim, has two modes: goal-directed generation, wherein a target task is given to the LLM and the LLM proposes a task curriculum to solve the target task, and exploratory generation, wherein the LLM bootstraps from previous tasks and iteratively proposes novel tasks that would be helpful in solving more complex tasks. We use GPT4 to expand the existing benchmark by ten times to over 100 tasks, on which we conduct supervised finetuning and evaluate several LLMs, including finetuned GPTs and Code Llama, on code generation for robotic simulation tasks. Furthermore, we observe that LLM-generated simulation programs can enhance task-level generalization significantly when used for multitask policy training. We further find that with minimal sim-to-real adaptation, the multitask policies pretrained on GPT4-generated simulation tasks exhibit stronger transfer to unseen long-horizon tasks in the real world and outperform baselines by 25%. See the project website (this https URL) for code, demos, and videos.
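The exploratory mode described above, in which the LLM bootstraps from an existing task library and iteratively proposes new tasks, can be sketched as a simple propose-verify loop. This is an illustrative outline only, assuming hypothetical `propose_task` and `verify_task` stand-ins for the real LLM prompt and simulation-based verification in GenSim:

```python
def propose_task(task_library):
    """Hypothetical stand-in for an LLM call: in the real system, the
    current task library's code would be placed in the prompt and GPT-4
    asked to return (name, code) for a novel simulation task."""
    n = len(task_library)
    return f"stack-blocks-{n}", f"# generated task code {n}"

def verify_task(name, code):
    """Hypothetical verifier: the real system would execute the generated
    task in simulation and check an expert demonstration can solve it."""
    return True  # accept everything in this sketch

def exploratory_generation(seed_tasks, num_new_tasks):
    """Bootstrap from seed tasks; keep only novel, verified proposals."""
    library = dict(seed_tasks)
    target = len(seed_tasks) + num_new_tasks
    while len(library) < target:
        name, code = propose_task(library)
        if name not in library and verify_task(name, code):
            library[name] = code
    return library

library = exploratory_generation({"place-red-in-green": "# seed task"}, 3)
print(sorted(library))
```

Goal-directed generation differs only in the prompt: instead of asking for any novel task, the LLM is given the target task and asked for a curriculum of intermediate tasks leading to it.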

Submission history

From: Lirui Wang
[v1] Mon, 2 Oct 2023 17:23:48 UTC (22,405 KB)
[v2] Sun, 21 Jan 2024 21:01:12 UTC (29,260 KB)
