Compare commits

5 commits, `1815127398` ... `d308707fa2`:

- `d308707fa2`
- `78fd1a1e04`
- `90533e6b9f`
- `fa7e40a4a5`
- `15e91b5905`

`README.md`: 16 additions, 16 deletions
````diff
@@ -98,14 +98,18 @@ conda create -y -n lerobot python=3.10
 conda activate lerobot
 ```
 
-Install 🤗 LeRobot:
+When using `miniconda`, if you don't have `ffmpeg` in your environment:
 ```bash
-pip install -e .
+conda install ffmpeg
 ```
 
-> **NOTE:** Depending on your platform, If you encounter any build errors during this step
-you may need to install `cmake` and `build-essential` for building some of our dependencies.
-On linux: `sudo apt-get install cmake build-essential`
+Install 🤗 LeRobot:
+```bash
+pip install --no-binary=av -e .
+```
+
+> **NOTE:** If you encounter build errors, you may need to install additional dependencies (`cmake`, `build-essential`, and `ffmpeg libs`). On Linux, run:
+`sudo apt-get install cmake build-essential python-dev pkg-config libavformat-dev libavcodec-dev libavdevice-dev libavutil-dev libswscale-dev libswresample-dev libavfilter-dev pkg-config`. For other systems, see: [Compiling PyAV](https://pyav.org/docs/develop/overview/installation.html#bring-your-own-ffmpeg)
 
 For simulations, 🤗 LeRobot comes with gymnasium environments that can be installed as extras:
 - [aloha](https://github.com/huggingface/gym-aloha)
````
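The recurring change in this comparison swaps `pip install -e .` for `pip install --no-binary=av -e .`, which tells pip to build the PyAV (`av`) package from source against your system FFmpeg libraries instead of pulling a prebuilt wheel. As a quick sanity check (a standalone sketch, not part of the diff), you can confirm that your pip understands the standard `--no-binary` option:

```python
# Sketch: --no-binary is a standard pip install option; confirm it is
# listed in pip's own help text before relying on it.
import subprocess
import sys

help_text = subprocess.run(
    [sys.executable, "-m", "pip", "install", "--help"],
    capture_output=True,
    text=True,
).stdout

print("no-binary supported" if "--no-binary" in help_text else "not found")
```

Note that `--no-binary=av` only affects the `av` distribution; every other dependency is still free to install from wheels.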
````diff
@@ -114,7 +118,7 @@ For simulations, 🤗 LeRobot comes with gymnasium environments that can be inst
 
 For instance, to install 🤗 LeRobot with aloha and pusht, use:
 ```bash
-pip install -e ".[aloha, pusht]"
+pip install --no-binary=av -e ".[aloha, pusht]"
 ```
 
 To use [Weights and Biases](https://docs.wandb.ai/quickstart) for experiment tracking, log in with
````
````diff
@@ -59,7 +59,7 @@ git clone https://github.com/huggingface/lerobot.git ~/lerobot
 
 #### 5. Install LeRobot with dependencies for the feetech motors:
 ```bash
-cd ~/lerobot && pip install -e ".[feetech]"
+cd ~/lerobot && pip install --no-binary=av -e ".[feetech]"
 ```
 
 Great :hugs:! You are now done installing LeRobot and we can begin assembling the SO100 arms :robot:.
````
````diff
@@ -69,7 +69,7 @@ git clone https://github.com/huggingface/lerobot.git ~/lerobot
 
 #### 5. Install LeRobot with dependencies for the feetech motors:
 ```bash
-cd ~/lerobot && pip install -e ".[feetech]"
+cd ~/lerobot && pip install --no-binary=av -e ".[feetech]"
 ```
 
 ## C. Install LeRobot on laptop
````
````diff
@@ -110,7 +110,7 @@ git clone https://github.com/huggingface/lerobot.git ~/lerobot
 
 #### 5. Install LeRobot with dependencies for the feetech motors:
 ```bash
-cd ~/lerobot && pip install -e ".[feetech]"
+cd ~/lerobot && pip install --no-binary=av -e ".[feetech]"
 ```
 
 Great :hugs:! You are now done installing LeRobot and we can begin assembling the SO100 arms and Mobile base :robot:.
````
````diff
@@ -33,7 +33,7 @@ git clone https://github.com/huggingface/lerobot.git ~/lerobot
 
 5. Install LeRobot with dependencies for the feetech motors:
 ```bash
-cd ~/lerobot && pip install -e ".[feetech]"
+cd ~/lerobot && pip install --no-binary=av -e ".[feetech]"
 ```
 
 ## Configure the motors
````
````diff
@@ -18,7 +18,7 @@ training outputs directory. In the latter case, you might want to run examples/3
 
 It requires the installation of the 'gym_pusht' simulation environment. Install it by running:
 ```bash
-pip install -e ".[pusht]"`
+pip install --no-binary=av -e ".[pusht]"`
 ```
 """
````
````diff
@@ -33,7 +33,7 @@ First, install the additional dependencies required for robots built with dynami
 
 Using `pip`:
 ```bash
-pip install -e ".[dynamixel]"
+pip install --no-binary=av -e ".[dynamixel]"
 ```
 
 Using `poetry`:
````
````diff
@@ -45,7 +45,7 @@ git clone https://github.com/huggingface/lerobot.git ~/lerobot
 
 6. Install LeRobot with stretch dependencies:
 ```bash
-cd ~/lerobot && pip install -e ".[stretch]"
+cd ~/lerobot && pip install --no-binary=av -e ".[stretch]"
 ```
 
 > **Note:** If you get this message, you can ignore it: `ERROR: pip's dependency resolver does not currently take into account all the packages that are installed.`
````
````diff
@@ -32,7 +32,7 @@ git clone https://github.com/huggingface/lerobot.git ~/lerobot
 
 5. Install LeRobot with dependencies for the Aloha motors (dynamixel) and cameras (intelrealsense):
 ```bash
-cd ~/lerobot && pip install -e ".[dynamixel, intelrealsense]"
+cd ~/lerobot && pip install --no-binary=av -e ".[dynamixel, intelrealsense]"
 ```
 
 ## Teleoperate
````
````diff
@@ -24,7 +24,7 @@ Designed by Physical Intelligence. Ported from Jax by Hugging Face.
 
 Install pi0 extra dependencies:
 ```bash
-pip install -e ".[pi0]"
+pip install --no-binary=av -e ".[pi0]"
 ```
 
 Example of finetuning the pi0 pretrained model (`pi0_base` in `openpi`):
````
@@ -0,0 +1,132 @@ (new file)

````python
"""
Edit your dataset in-place.

Example of usage:
```bash
python lerobot/scripts/edit_dataset.py remove \
    --root data \
    --repo-id cadene/koch_bimanual_folding_2 \
    --episodes 0 4 7 10 34 54 69
```
"""

import argparse
import shutil
from pathlib import Path

import torch

from lerobot.common.datasets.compute_stats import compute_stats
from lerobot.common.datasets.lerobot_dataset import CODEBASE_VERSION, LeRobotDataset
from lerobot.common.datasets.utils import calculate_episode_data_index, create_branch
from lerobot.scripts.push_dataset_to_hub import (
    push_dataset_card_to_hub,
    push_meta_data_to_hub,
    push_videos_to_hub,
    save_meta_data,
)


def remove_episodes(dataset, episodes):
    if not dataset.video:
        raise NotImplementedError()

    repo_id = dataset.repo_id
    info = dataset.info
    hf_dataset = dataset.hf_dataset
    # TODO(rcadene): implement tags
    # if None, should use the same tags
    tags = None

    local_dir = dataset.videos_dir.parent
    train_dir = local_dir / "train"
    new_train_dir = local_dir / "new_train"
    meta_data_dir = local_dir / "meta_data"

    # Drop every frame belonging to one of the removed episodes.
    new_hf_dataset = hf_dataset.filter(lambda row: row["episode_index"] not in episodes)

    unique_episode_idxs = torch.stack(new_hf_dataset["episode_index"]).unique().tolist()

    # Re-index the surviving episodes so indices stay contiguous, and rename
    # the matching video files accordingly.
    episode_idx_to_reset_idx_mapping = {}
    for new_ep_idx, ep_idx in enumerate(sorted(unique_episode_idxs)):
        episode_idx_to_reset_idx_mapping[ep_idx] = new_ep_idx

        for key in dataset.video_frame_keys:
            path = dataset.videos_dir / f"{key}_episode_{ep_idx:06d}.mp4"
            new_path = dataset.videos_dir / f"{key}_episode_{new_ep_idx:06d}.mp4"
            path.rename(new_path)

    def modify_ep_idx(row):
        new_ep_idx = episode_idx_to_reset_idx_mapping[row["episode_index"].item()]

        for key in dataset.video_frame_keys:
            fname = f"{key}_episode_{new_ep_idx:06d}.mp4"
            row[key]["path"] = f"videos/{fname}"

        row["episode_index"] = new_ep_idx
        return row

    new_hf_dataset = new_hf_dataset.map(modify_ep_idx)

    episode_data_index = calculate_episode_data_index(new_hf_dataset)

    new_dataset = LeRobotDataset.from_preloaded(
        repo_id=dataset.repo_id,
        hf_dataset=new_hf_dataset,
        episode_data_index=episode_data_index,
        info=info,
        videos_dir=dataset.videos_dir,
    )
    stats = compute_stats(new_dataset)

    new_hf_dataset = new_hf_dataset.with_format(None)  # to remove transforms that can't be saved

    new_hf_dataset.save_to_disk(str(new_train_dir))
    shutil.rmtree(train_dir)
    new_train_dir.rename(train_dir)

    save_meta_data(info, stats, episode_data_index, meta_data_dir)

    new_hf_dataset.push_to_hub(repo_id, revision="main")
    push_meta_data_to_hub(repo_id, meta_data_dir, revision="main")
    push_dataset_card_to_hub(repo_id, revision="main", tags=tags)
    if dataset.video:
        push_videos_to_hub(repo_id, dataset.videos_dir, revision="main")
    create_branch(repo_id, repo_type="dataset", branch=CODEBASE_VERSION)


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    subparsers = parser.add_subparsers(dest="mode", required=True)

    # Set common options for all the subparsers
    base_parser = argparse.ArgumentParser(add_help=False)
    base_parser.add_argument(
        "--root",
        type=Path,
        default="data",
        help="Root directory where the dataset will be stored locally at '{root}/{repo_id}' (e.g. 'data/hf_username/dataset_name').",
    )
    base_parser.add_argument(
        "--repo-id",
        type=str,
        default="lerobot/test",
        help="Dataset identifier. By convention it should match '{hf_username}/{dataset_name}' (e.g. `lerobot/test`).",
    )

    remove_parser = subparsers.add_parser("remove", parents=[base_parser])
    remove_parser.add_argument(
        "--episodes",
        type=int,
        nargs="+",
        help="Episode indices to remove (e.g. `0 1 5 6`).",
    )

    args = parser.parse_args()

    input("It is recommended to make a copy of your dataset before modifying it. Press enter to continue.")

    dataset = LeRobotDataset(args.repo_id, root=args.root)

    if args.mode == "remove":
        remove_episodes(dataset, args.episodes)
````
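The core bookkeeping in `remove_episodes` above is re-indexing the surviving episodes so their indices stay contiguous after a removal. A minimal standalone sketch of that mapping logic (the helper name `reindex_episodes` is hypothetical, not part of the script):

```python
def reindex_episodes(all_episodes, removed):
    """Map each surviving episode index to a new contiguous index, preserving order."""
    surviving = sorted(set(all_episodes) - set(removed))
    return {old: new for new, old in enumerate(surviving)}

# Removing episodes 0, 4 and 7 from a 10-episode dataset:
mapping = reindex_episodes(range(10), {0, 4, 7})
# → {1: 0, 2: 1, 3: 2, 5: 3, 6: 4, 8: 5, 9: 6}
```

The script applies the same idea, then renames the corresponding `*_episode_XXXXXX.mp4` files and rewrites each row's `episode_index` and video path via `Dataset.map`.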