
Commit

Mike Roberts committed Mar 31, 2023
2 parents 00cc19f + 4234291 commit 1d2eaff
Showing 5 changed files with 37 additions and 10 deletions.
1 change: 0 additions & 1 deletion examples/generate_image_dataset/README.md
@@ -28,7 +28,6 @@ Running `generate_poses.py` will generate a `poses.csv` file consisting of camera

Running `generate_images.py` will generate images in an `images` directory. This tool accepts several optional command-line arguments that can be used to control its behavior (see the source code for details, and the example invocation after this list), e.g.,
- `--poses_file` can be used to generate images based on the camera poses in a specific CSV file.
- `--rendering_mode` can be set to `baked` to use baked lighting, or `raytracing` for ray-traced lighting if you are running on Windows and you have a GPU that supports DirectX ray-tracing.
- `--num_internal_steps` can be used to control the image quality when running in ray-traced mode.
- `--benchmark` can be used to test the overall speed of the simulation.
- `--wait_for_key_press` can be used to compare the game window output to the image that has been saved to disk.
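
For instance, a typical invocation might combine several of these flags. The sketch below is hypothetical: the flag names and the `baked` value come from the list above, while the argument syntax and file paths are assumptions to adapt to your setup.

```console
# generate camera poses (produces poses.csv), then render images for those poses with baked lighting
python generate_poses.py
python generate_images.py --poses_file=poses.csv --rendering_mode=baked --benchmark
```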
3 changes: 1 addition & 2 deletions examples/imitation_learning_openbot/README.md
@@ -1,4 +1,4 @@
# Imitation learning with OpenBot
# Imitation Learning with OpenBot

In this example application, we demonstrate how to collect training data that can be plugged into the [OpenBot](http://www.openbot.org) framework and used to train a navigation policy.

@@ -36,7 +36,6 @@ Running `generate_episodes.py` will generate navigation episodes and store them

Running `generate_dataset.py` will generate a dataset of goals, observations, and actions for each episode. The structure of the generated dataset precisely mimics the structure expected by the OpenBot framework, and can therefore be plugged into the OpenBot training code directly. This tool accepts several optional command-line arguments that can be used to control its behavior (see the source code for details, and the example invocation after this list), e.g.,
- `--episodes_file` can be used to read episodes from a specific CSV file.
- `--rendering_mode` can be set to `baked` to use baked lighting, or `raytracing` for ray-traced lighting if you are running on Windows and you have a GPU that supports DirectX ray-tracing.
- `--create_videos` can be used to generate videos from OpenBot observations. If you use this optional argument, the `ffmpeg` command-line tool must be visible on your path.
- `--benchmark` can be used to test the overall speed of the simulation.
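
For instance, a hypothetical end-to-end invocation might look like the following sketch; the episodes CSV file name is a placeholder, and `ffmpeg` must be visible on your path because `--create_videos` is used.

```console
# generate navigation episodes, then build an OpenBot-style dataset with videos
python generate_episodes.py
python generate_dataset.py --episodes_file=my_episodes.csv --create_videos
```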

4 changes: 4 additions & 0 deletions examples/imitation_learning_openbot/user_config.yaml.example
@@ -33,6 +33,10 @@ SIMULATION_CONTROLLER:

SIMULATION_STEP_TIME_SECONDS: 0.033
MAX_SUBSTEP_DELTA_TIME: 0.001
MAX_SUBSTEPS: 100
CONTACT_OFFSET_MULTIPLIER: 0.01
MIN_CONTACT_OFFSET: 0.0001
MAX_CONTACT_OFFSET: 1.0

OPENBOT_AGENT:
OPENBOT_ACTOR_NAME: "OpenBotActor"
35 changes: 28 additions & 7 deletions examples/open_loop_control_fetch/README.md
@@ -1,14 +1,35 @@
# Open Loop Control with UrdfBot
# Open-Loop Control with Fetch

In this example application, we demonstrate how to load a Urdf agent and apply simple actions such as base motion, arm motion and gripper grasping.
In this example application, we demonstrate how to control a Fetch agent to pick up an object and move it to another location.

Before running this example, rename `user_config.yaml.example` to `user_config.yaml` and modify the contents appropriately for your system, as described in our top-level [README](http://github.com/isl-org/spear).
Before running this example, rename `user_config.yaml.example` to `user_config.yaml` and modify the contents appropriately for your system, as described in our [Getting Started](../../docs/getting_started.md) tutorial.

### Running the example
### Important configuration options

You can control the behavior of this example by setting the following parameters in your `user_config.yaml` file, e.g.,
- `SPEAR.PAKS_DIR` is the directory containing scene data in the form of PAK files.
- `SIMULATION_CONTROLLER.OPENBOT_AGENT.CAMERA.RENDER_PASSES` can be set to a list of image modalities that you want the agent to return (e.g., setting the value `["depth", "final_color", "segmentation"]` will return depth images, photorealistic RGB images, and segmentation images).

Your `user_config.yaml` file only needs to specify the value of a parameter if it differs from the defaults defined in the `python/config` directory. You can browse this directory for a complete set of all user-configurable parameters.
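
For example, a minimal `user_config.yaml` might look like the sketch below; the `PAKS_DIR` path is a placeholder, and the YAML nesting is inferred from the dotted parameter names above.

```yaml
SPEAR:
  PAKS_DIR: "/path/to/your/paks"  # placeholder; point this at your scene PAK files

SIMULATION_CONTROLLER:
  OPENBOT_AGENT:
    CAMERA:
      RENDER_PASSES: ["depth", "final_color", "segmentation"]
```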

### Running the example

You can run the example as follows.

```console
python run.py --actions_file=actions.csv --observation_file=observations.csv
# generate actions
python generate_actions.py

# execute actions
python run.py
```

- `actions_file`: a CSV file of preconfigured joint actions.
- `observation_file`: the file in which to store the received `link_position` observations.
Running `generate_actions.py` will generate an `actions.csv` file consisting of actions that will be used in the following step. This tool accepts several optional command-line arguments that can be used to control its behavior (see the source code for details), e.g.,

- `--scene_id` can be used to specify which scene to generate actions for.

Running `run.py` will execute the previously generated actions in an open-loop fashion on the Fetch agent. This tool accepts several optional command-line arguments that can be used to control its behavior (see the source code for details, and the sketch after this list), e.g.,

- `--action_file` can be used to specify which actions to load.
- `--scene_id` can be used to specify which scene to load.
- `--benchmark` can be used to test the overall speed of the simulation.
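
Putting these steps together, a typical end-to-end run might look like the following sketch; the scene ID is a placeholder, and the `actions.csv` file name comes from the `generate_actions.py` step above.

```console
# generate actions for a specific scene, then replay them open-loop on the Fetch agent
python generate_actions.py --scene_id=my_scene
python run.py --action_file=actions.csv --scene_id=my_scene
```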
4 changes: 4 additions & 0 deletions examples/open_loop_control_fetch/user_config.yaml.example
@@ -20,6 +20,10 @@ SIMULATION_CONTROLLER:
CUSTOM_UNREAL_CONSOLE_COMMANDS: []

MAX_SUBSTEP_DELTA_TIME: 0.001
MAX_SUBSTEPS: 100
CONTACT_OFFSET_MULTIPLIER: 0.01
MIN_CONTACT_OFFSET: 0.0001
MAX_CONTACT_OFFSET: 1.0

URDFBOT_AGENT:
URDFBOT_ACTOR_NAME: "UrdfBotActor"
