Compare commits

...

5 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Remi | d308707fa2 | Merge fa7e40a4a5 into 78fd1a1e04 | 2025-03-28 13:15:34 +08:00 |
| Steven Palma | 78fd1a1e04 | chore(docs): update docs (#911) | 2025-03-27 09:55:06 +01:00 |
| Steven Palma | 90533e6b9f | fix(docs): hot-fix updating installation instructions after #883 (#907) | 2025-03-26 13:21:40 +01:00 |
| Remi Cadene | fa7e40a4a5 | fix | 2024-08-25 15:55:29 +02:00 |
| Remi Cadene | 15e91b5905 | Add edit_dataset.py remove --episodes | 2024-08-24 18:16:37 +02:00 |
10 changed files with 151 additions and 15 deletions

View File

@@ -98,14 +98,18 @@ conda create -y -n lerobot python=3.10
conda activate lerobot
```
Install 🤗 LeRobot:
When using `miniconda`, if you don't have `ffmpeg` in your environment:
```bash
pip install -e .
conda install ffmpeg
```
> **NOTE:** Depending on your platform, if you encounter any build errors during this step,
you may need to install `cmake` and `build-essential` to build some of our dependencies.
On Linux: `sudo apt-get install cmake build-essential`
Install 🤗 LeRobot:
```bash
pip install --no-binary=av -e .
```
> **NOTE:** If you encounter build errors, you may need to install additional dependencies (`cmake`, `build-essential`, and the FFmpeg development libraries). On Linux, run:
`sudo apt-get install cmake build-essential python-dev pkg-config libavformat-dev libavcodec-dev libavdevice-dev libavutil-dev libswscale-dev libswresample-dev libavfilter-dev`. For other systems, see: [Compiling PyAV](https://pyav.org/docs/develop/overview/installation.html#bring-your-own-ffmpeg)
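Since `--no-binary=av` builds PyAV from source, FFmpeg must be available before running the install. A minimal, hypothetical pre-flight check (standard library only, not part of the official instructions) that the `ffmpeg` binary is on your `PATH` — a rough proxy, since the build actually needs the development libraries listed above:

```python
# Hypothetical pre-flight check before `pip install --no-binary=av -e .`:
# building PyAV from source needs FFmpeg, so verify the binary is reachable.
import shutil

ffmpeg_path = shutil.which("ffmpeg")
if ffmpeg_path is None:
    print("ffmpeg not found on PATH; install it first (e.g. `conda install ffmpeg`)")
else:
    print(f"ffmpeg found at {ffmpeg_path}")
```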
For simulations, 🤗 LeRobot comes with gymnasium environments that can be installed as extras:
- [aloha](https://github.com/huggingface/gym-aloha)
@@ -114,7 +118,7 @@ For simulations, 🤗 LeRobot comes with gymnasium environments that can be inst
For instance, to install 🤗 LeRobot with aloha and pusht, use:
```bash
pip install -e ".[aloha, pusht]"
pip install --no-binary=av -e ".[aloha, pusht]"
```
To use [Weights and Biases](https://docs.wandb.ai/quickstart) for experiment tracking, log in with

View File

@@ -59,7 +59,7 @@ git clone https://github.com/huggingface/lerobot.git ~/lerobot
#### 5. Install LeRobot with dependencies for the feetech motors:
```bash
cd ~/lerobot && pip install -e ".[feetech]"
cd ~/lerobot && pip install --no-binary=av -e ".[feetech]"
```
Great :hugs:! You are now done installing LeRobot and we can begin assembling the SO100 arms :robot:.

View File

@@ -69,7 +69,7 @@ git clone https://github.com/huggingface/lerobot.git ~/lerobot
#### 5. Install LeRobot with dependencies for the feetech motors:
```bash
cd ~/lerobot && pip install -e ".[feetech]"
cd ~/lerobot && pip install --no-binary=av -e ".[feetech]"
```
## C. Install LeRobot on laptop
@@ -110,7 +110,7 @@ git clone https://github.com/huggingface/lerobot.git ~/lerobot
#### 5. Install LeRobot with dependencies for the feetech motors:
```bash
cd ~/lerobot && pip install -e ".[feetech]"
cd ~/lerobot && pip install --no-binary=av -e ".[feetech]"
```
Great :hugs:! You are now done installing LeRobot and we can begin assembling the SO100 arms and Mobile base :robot:.

View File

@@ -33,7 +33,7 @@ git clone https://github.com/huggingface/lerobot.git ~/lerobot
5. Install LeRobot with dependencies for the feetech motors:
```bash
cd ~/lerobot && pip install -e ".[feetech]"
cd ~/lerobot && pip install --no-binary=av -e ".[feetech]"
```
## Configure the motors

View File

@@ -18,7 +18,7 @@ training outputs directory. In the latter case, you might want to run examples/3
It requires the installation of the 'gym_pusht' simulation environment. Install it by running:
```bash
pip install -e ".[pusht]"`
pip install --no-binary=av -e ".[pusht]"
```
"""

View File

@@ -33,7 +33,7 @@ First, install the additional dependencies required for robots built with dynami
Using `pip`:
```bash
pip install -e ".[dynamixel]"
pip install --no-binary=av -e ".[dynamixel]"
```
Using `poetry`:

View File

@@ -45,7 +45,7 @@ git clone https://github.com/huggingface/lerobot.git ~/lerobot
6. Install LeRobot with stretch dependencies:
```bash
cd ~/lerobot && pip install -e ".[stretch]"
cd ~/lerobot && pip install --no-binary=av -e ".[stretch]"
```
> **Note:** If you get this message, you can ignore it: `ERROR: pip's dependency resolver does not currently take into account all the packages that are installed.`

View File

@@ -32,7 +32,7 @@ git clone https://github.com/huggingface/lerobot.git ~/lerobot
5. Install LeRobot with dependencies for the Aloha motors (dynamixel) and cameras (intelrealsense):
```bash
cd ~/lerobot && pip install -e ".[dynamixel, intelrealsense]"
cd ~/lerobot && pip install --no-binary=av -e ".[dynamixel, intelrealsense]"
```
## Teleoperate

View File

@@ -24,7 +24,7 @@ Designed by Physical Intelligence. Ported from Jax by Hugging Face.
Install pi0 extra dependencies:
```bash
pip install -e ".[pi0]"
pip install --no-binary=av -e ".[pi0]"
```
Example of finetuning the pi0 pretrained model (`pi0_base` in `openpi`):

View File

@@ -0,0 +1,132 @@
"""
Edit your dataset in-place.
Example of usage:
```bash
python lerobot/scripts/edit_dataset.py remove \
--root data \
--repo-id cadene/koch_bimanual_folding_2 \
--episodes 0 4 7 10 34 54 69
```
"""
import argparse
import shutil
from pathlib import Path

import torch

from lerobot.common.datasets.compute_stats import compute_stats
from lerobot.common.datasets.lerobot_dataset import CODEBASE_VERSION, LeRobotDataset
from lerobot.common.datasets.utils import calculate_episode_data_index, create_branch
from lerobot.scripts.push_dataset_to_hub import (
    push_dataset_card_to_hub,
    push_meta_data_to_hub,
    push_videos_to_hub,
    save_meta_data,
)


def remove_episodes(dataset, episodes):
    if not dataset.video:
        raise NotImplementedError()

    repo_id = dataset.repo_id
    info = dataset.info
    hf_dataset = dataset.hf_dataset

    # TODO(rcadene): implement tags
    # if None, should use the same tags
    tags = None

    local_dir = dataset.videos_dir.parent
    train_dir = local_dir / "train"
    new_train_dir = local_dir / "new_train"
    meta_data_dir = local_dir / "meta_data"

    new_hf_dataset = hf_dataset.filter(lambda row: row["episode_index"] not in episodes)

    # Renumber the surviving episodes contiguously and rename their video files to match.
    unique_episode_idxs = torch.stack(new_hf_dataset["episode_index"]).unique().tolist()
    episode_idx_to_reset_idx_mapping = {}
    for new_ep_idx, ep_idx in enumerate(sorted(unique_episode_idxs)):
        episode_idx_to_reset_idx_mapping[ep_idx] = new_ep_idx
        for key in dataset.video_frame_keys:
            path = dataset.videos_dir / f"{key}_episode_{ep_idx:06d}.mp4"
            new_path = dataset.videos_dir / f"{key}_episode_{new_ep_idx:06d}.mp4"
            path.rename(new_path)

    def modify_ep_idx(row):
        new_ep_idx = episode_idx_to_reset_idx_mapping[row["episode_index"].item()]
        for key in dataset.video_frame_keys:
            fname = f"{key}_episode_{new_ep_idx:06d}.mp4"
            row[key]["path"] = f"videos/{fname}"
        row["episode_index"] = new_ep_idx
        return row

    new_hf_dataset = new_hf_dataset.map(modify_ep_idx)

    episode_data_index = calculate_episode_data_index(new_hf_dataset)

    new_dataset = LeRobotDataset.from_preloaded(
        repo_id=dataset.repo_id,
        hf_dataset=new_hf_dataset,
        episode_data_index=episode_data_index,
        info=info,
        videos_dir=dataset.videos_dir,
    )
    stats = compute_stats(new_dataset)

    new_hf_dataset = new_hf_dataset.with_format(None)  # to remove transforms that can't be saved
    new_hf_dataset.save_to_disk(str(new_train_dir))
    shutil.rmtree(train_dir)
    new_train_dir.rename(train_dir)
    save_meta_data(info, stats, episode_data_index, meta_data_dir)

    new_hf_dataset.push_to_hub(repo_id, revision="main")
    push_meta_data_to_hub(repo_id, meta_data_dir, revision="main")
    push_dataset_card_to_hub(repo_id, revision="main", tags=tags)
    if dataset.video:
        push_videos_to_hub(repo_id, dataset.videos_dir, revision="main")
    create_branch(repo_id, repo_type="dataset", branch=CODEBASE_VERSION)


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    subparsers = parser.add_subparsers(dest="mode", required=True)

    # Set common options for all the subparsers
    base_parser = argparse.ArgumentParser(add_help=False)
    base_parser.add_argument(
        "--root",
        type=Path,
        default="data",
        help="Root directory where the dataset will be stored locally at '{root}/{repo_id}' (e.g. 'data/hf_username/dataset_name').",
    )
    base_parser.add_argument(
        "--repo-id",
        type=str,
        default="lerobot/test",
        help="Dataset identifier. By convention it should match '{hf_username}/{dataset_name}' (e.g. `lerobot/test`).",
    )

    remove_calib = subparsers.add_parser("remove", parents=[base_parser])
    remove_calib.add_argument(
        "--episodes",
        type=int,
        nargs="+",
        help="Episode indices to remove (e.g. `0 1 5 6`).",
    )

    args = parser.parse_args()

    input("It is recommended to make a copy of your dataset before modifying it. Press enter to continue.")

    dataset = LeRobotDataset(args.repo_id, root=args.root)
    if args.mode == "remove":
        remove_episodes(dataset, args.episodes)
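The core of `remove_episodes` above is the contiguous renumbering of surviving episodes. A standalone sketch of that mapping logic (hypothetical indices, no dataset required):

```python
# Standalone sketch of the episode renumbering performed in remove_episodes():
# surviving episode indices are sorted and mapped onto a contiguous 0..N-1 range.
def build_reindex_mapping(all_episodes, episodes_to_remove):
    removed = set(episodes_to_remove)
    surviving = sorted(ep for ep in all_episodes if ep not in removed)
    return {old_idx: new_idx for new_idx, old_idx in enumerate(surviving)}

# Example: 6 episodes, remove 1 and 3 -> episodes 0, 2, 4, 5 become 0, 1, 2, 3.
mapping = build_reindex_mapping(range(6), [1, 3])
print(mapping)  # {0: 0, 2: 1, 4: 2, 5: 3}
```

This is the same mapping the script applies twice: once when renaming the per-episode video files on disk, and once in `modify_ep_idx` when rewriting the `episode_index` column and video paths of each row.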