::: GENSIM: GENERATING ROBOTIC SIMULATION TASKS VIA LARGE LANGUAGE MODELS
1. problem
existing data generation schemes focus on scene-level diversity (object instances and poses)
task-level diversity is effort-heavy: each new task still has to be designed by hand.
2. method
GenSim is the system. They crafted prompting mechanisms (prompt templates for the LLM) that output a task in the format task-name, task-description, assets-used, plus a prompt template to generate task code from those descriptions, and hence simulation code.
They also use the LLM to reflect on executions of the task code (compiler errors, runtime errors, simulation logs) to decide whether to keep it in the task library, where it is reused.
The system is crafted to automate wherever possible.
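Roughly how I picture the generate-run-reflect loop; the function names, prompt strings, and the run_in_simulator hook below are my own placeholders, not GenSim's actual API:

import traceback

def propose_task(llm, library):
    """Ask the LLM for a new task spec: task-name, task-description, assets-used."""
    examples = "\n".join(t["spec"] for t in library)  # in-context examples drawn from the library
    return llm(f"Here are existing tasks:\n{examples}\n"
               "Propose a new task as: task-name, task-description, assets-used.")

def implement_task(llm, spec):
    """Ask the LLM to turn the spec into simulation task code (Python)."""
    return llm(f"Write the Python task class implementing:\n{spec}")

def reflect_and_filter(llm, spec, code, run_in_simulator):
    """Run the generated code; let the LLM judge the errors/logs; keep or reject."""
    try:
        compile(code, "<generated_task>", "exec")  # catches syntax ("compiler") errors
        log = run_in_simulator(code)               # may raise runtime errors
    except Exception:
        log = traceback.format_exc()
    verdict = llm(f"Task spec:\n{spec}\nExecution log:\n{log}\n"
                  "Should this task be kept in the library? Answer yes or no.")
    return verdict.strip().lower().startswith("yes")

def gensim_step(llm, library, run_in_simulator):
    spec = propose_task(llm, library)
    code = implement_task(llm, spec)
    if reflect_and_filter(llm, spec, code, run_in_simulator):
        library.append({"spec": spec, "code": code})  # accepted tasks get reused later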
2.1. GenSim, the system
2.1.1. modes
2.1.1.1. goal-directed generation mode
target task -> LLM -> task curriculum to solve the target task (decomposition); a prompt-level sketch of both modes follows after 2.1.1.2
2.1.1.2. exploratory generation mode
previous tasks -> LLM -> novel tasks that would be helpful in solving them (the new tasks are then added to the set of previous tasks)
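A rough sketch of how the two modes could differ only in the prompt they build (the prompt wording is my paraphrase, not the paper's templates):

def goal_directed_prompt(target_task, library):
    # Decompose a target task into a curriculum of simpler intermediate tasks.
    known = ", ".join(t["name"] for t in library)
    return (f"Known tasks: {known}\n"
            f"Target task: {target_task}\n"
            "Propose a curriculum of intermediate tasks leading to the target task.")

def exploratory_prompt(library):
    # Ask for a novel task that complements the ones already in the library.
    known = ", ".join(t["name"] for t in library)
    return (f"Known tasks: {known}\n"
            "Propose a novel task, different from the above, that would teach "
            "skills useful for solving these tasks.")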
2.1.2. components
2.1.2.1. prompting mechanism
given a prompt like “how to achieve the task BuildCar”, return a code implementation of the task in the simulator's task language (Python).
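For concreteness, a tiny sketch of parsing the task-name / task-description / assets-used format into fields before building the code-generation prompt (the field names and example values are mine, not GenSim's exact schema):

def parse_task_spec(spec_text: str) -> dict:
    """Parse 'key: value' lines such as task-name / task-description / assets-used."""
    fields = {}
    for line in spec_text.strip().splitlines():
        if ":" in line:
            key, value = line.split(":", 1)
            fields[key.strip().lower()] = value.strip()
    return fields

spec = parse_task_spec(
    "task-name: build-car\n"
    "task-description: construct a simple car shape from blocks\n"
    "assets-used: block, zone"
)
code_prompt = (f"Implement the task '{spec['task-name']}' in Python: "
               f"{spec['task-description']} (assets: {spec['assets-used']})")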
2.1.2.2. task library
2.1.2.3. language-conditioned multi-task policy training procedure
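As a reminder of what "language-conditioned multi-task" means at the code level, a toy PyTorch sketch (not the paper's actual architecture), assuming behavior cloning on demonstrations pooled across the generated tasks:

import torch
import torch.nn as nn

class LanguageConditionedPolicy(nn.Module):
    """Toy policy: fuse an instruction embedding with an observation encoding."""
    def __init__(self, obs_dim=64, lang_dim=32, act_dim=7):
        super().__init__()
        self.obs_enc = nn.Sequential(nn.Linear(obs_dim, 128), nn.ReLU())
        self.lang_enc = nn.Sequential(nn.Linear(lang_dim, 128), nn.ReLU())
        self.head = nn.Linear(256, act_dim)

    def forward(self, obs, lang_emb):
        fused = torch.cat([self.obs_enc(obs), self.lang_enc(lang_emb)], dim=-1)
        return self.head(fused)

# Multi-task training: each batch mixes (observation, instruction, action)
# tuples drawn from demonstrations of many different generated tasks.
policy = LanguageConditionedPolicy()
opt = torch.optim.Adam(policy.parameters(), lr=1e-4)
obs, lang, act = torch.randn(8, 64), torch.randn(8, 32), torch.randn(8, 7)  # dummy batch
loss = nn.functional.mse_loss(policy(obs, lang), act)  # behavior-cloning loss
opt.zero_grad(); loss.backward(); opt.step()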
3. fun stuff
- initialized the task library with 10 good (human-curated) tasks; generated over 100 tasks
4. experiments
The experiments show that pre-training on a large number of generated tasks increases zero-shot generalization to new tasks (the held-out test set is also generated), both in simulation and, after fine-tuning, in real environments.
So it works: training on the more diverse set of tasks did improve policy performance.
5. practical
I have a copy of their code in my playground. May run it sometime
Backlinks
LLM labeling trajectory
(Related keywords)
- [trajectory]
- [prediction]
- [LLM,VLM]
- [perception]
- it is possible that it could be done better with chain-of-thought prompting or some pipeline like in GENSIM: GENERATING ROBOTIC SIMULATION TASKS VIA LARGE LANGUAGE MODELS
- flight Trajectory Reconstruction
- Time-series data prediction
Here’s a script to insert multiple org-roam nodes
(defun hermanhel-strings-to-hash (strings)
  "Convert a list of STRINGS to a hash table with the strings as keys."
  (let ((hash (make-hash-table :test 'equal)))
    (dolist (str strings)
      (puthash str t hash))
    hash))

(defun hermanhel-org-roam-insert-multiple-nodes-as-list ()
  "Select several org-roam node titles and insert them as a plain list of roam links."
  (interactive)
  ;; `let*' is needed because SELECTED-NODES refers to CANDIDATES.
  (let* ((candidates (hermanhel-strings-to-hash (org-roam--get-titles)))
         (selected-nodes (citar--select-multiple "References: " candidates)))
    (dolist (title selected-nodes)
      (insert "+ [[roam:" title "]]\n"))))