Following these steps, you should be able to reproduce behaviors like picking up a Lego block and placing it in a bin with a relatively high success rate.
While this tutorial is general and can easily be extended to other types of robots by changing the configuration, it is based on the [Koch v1.1](https://github.com/jess-moss/koch-v1-1) affordable robot. Koch v1.1 is composed of a leader arm and a follower arm, each with 6 motors. In addition, various cameras can record the scene and serve as visual sensors for the robot.
During data collection, you will control the follower arm by moving the leader arm. This is called "teleoperation". More specifically, the present position of the motors of the leader arm is read at high frequency and sent as a goal position for the motors of the follower arm, which effectively "follow" the movements of the leader arm. While you teleoperate the robot, a few modalities are recorded:
- the present position of the follower arm, called the "state",
- the goal position sent to the follower arm, called the "action",
- the video stream from the cameras.
Finally, you will train a neural network to predict future actions given the state and camera frames as input, and deploy it to autonomously control the robot by communicating goal positions to the follower arm at a high frequency.
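Concretely, you can think of each recorded timestep as pairing these modalities together. Here is a rough sketch of what one sample could contain (the key names and shapes below are purely illustrative, not the exact dataset schema):

```python
import numpy as np

# Illustrative example of the modalities recorded at one timestep
sample = {
    "observation.state": np.zeros(6, dtype=np.float32),  # present position of the 6 follower motors
    "action": np.zeros(6, dtype=np.float32),             # goal position sent to the 6 follower motors
    "observation.images.laptop": np.zeros((480, 640, 3), dtype=np.uint8),  # one frame per camera
}
print({key: value.shape for key, value in sample.items()})
```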
Follow the bill of materials on the [Koch v1.1 github page](https://github.com/jess-moss/koch-v1-1) to order a leader and a follower arm. Some parts and prices vary a bit depending on your geographic location.
Once the parts are received, follow this video to guide you through the assembly:
Connect the leader arm (the smaller one) to the 5V power supply and the follower arm to the 12V power supply. Then connect both arms to your computer via USB.
[`DynamixelMotorsBus`](lerobot/common/robot_devices/motors/dynamixel.py) lets you efficiently read from and write to the motors connected in a chain on the corresponding USB bus. Under the hood, it relies on the Python [Dynamixel SDK](https://emanual.robotis.com/docs/en/software/dynamixel/dynamixel_sdk/sample_code/python_read_write_protocol_2_0/#python-read-write-protocol-20).
Each `DynamixelMotorsBus` requires its corresponding USB port (e.g. `DynamixelMotorsBus(port="/dev/tty.usbmodem575E0031751")`). Run our utility script for each arm to find its port.
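If you want to identify a port manually instead, a simple approach is to list the serial devices before and after plugging in each arm. Here is a minimal sketch using pyserial (this is not the repository's utility script, just an illustration):

```python
from serial.tools import list_ports  # provided by the pyserial package

# Run this once with the arm unplugged and once with it plugged in:
# the entry that appears is the port of that arm.
for port in list_ports.comports():
    print(port.device, port.description)
```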
Then you can instantiate each arm by listing its motors with their name, index, and model. The factory-default index of every motor is `1`. However, unique indices are required for the motors to function in a chain on a common bus, so we assign different indices in ascending order starting from `1` (e.g. "1, 2, 3, 4, 5, 6"). These indices are written to the persistent memory of each motor during the first connection. Here is an example of what the instantiation looks like:
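A sketch of a possible instantiation (the ports are the ones you found in the previous step; the joint names and motor models below follow a typical Koch v1.1 assembly and should be double-checked against your own arms):

```python
from lerobot.common.robot_devices.motors.dynamixel import DynamixelMotorsBus

# Replace the ports with the ones found by the utility script.
leader_arm = DynamixelMotorsBus(
    port="/dev/tty.usbmodem575E0031751",
    motors={
        # name: (index, model)
        "shoulder_pan": (1, "xl330-m077"),
        "shoulder_lift": (2, "xl330-m077"),
        "elbow_flex": (3, "xl330-m077"),
        "wrist_flex": (4, "xl330-m077"),
        "wrist_roll": (5, "xl330-m077"),
        "gripper": (6, "xl330-m077"),
    },
)

follower_arm = DynamixelMotorsBus(
    port="/dev/tty.usbmodem575E0032081",
    motors={
        # name: (index, model)
        "shoulder_pan": (1, "xl430-w250"),
        "shoulder_lift": (2, "xl430-w250"),
        "elbow_flex": (3, "xl330-m288"),
        "wrist_flex": (4, "xl330-m288"),
        "wrist_roll": (5, "xl330-m288"),
        "gripper": (6, "xl330-m288"),
    },
)
```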
During the first connection of the motors, `DynamixelMotorsBus` automatically detects a mismatch between the present motor indices (all `1` from the factory) and the specified motor indices (e.g. "1, 2, 3, 4, 5, 6"). This triggers a configuration procedure that requires unplugging the power cord and the motors, then plugging each motor back in one by one, starting with the one closest to the bus. Because it is quite involved, we provide a youtube video to guide you through it.
Congrats! Both arms are now properly configured and connected. Of course, you won't have to go through the configuration procedure again the next time you connect the arms. For instance, let's try disconnecting and reconnecting them like this:
```python
leader_arm.disconnect()
leader_arm.connect()
follower_arm.disconnect()
follower_arm.connect()
```
**Read and Write**
To get familiar with how `DynamixelMotorsBus` is used to command the motors, let's start by reading from them. You should get something like:
```python
values = leader_arm.read("Present_Position")
print(values)
# an array with one present position value per motor of the leader arm
values = follower_arm.read("Present_Position")
print(values)
# an array with one present position value per motor of the follower arm
```
The `"Present_Position"` string is an entry of the motors' control table; the full mapping from data names to register addresses is defined in `X_SERIES_CONTROL_TABLE` in [`dynamixel.py`](lerobot/common/robot_devices/motors/dynamixel.py), and any of these entries can be read or written in the same way.
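For instance, assuming `"Present_Temperature"` appears in `X_SERIES_CONTROL_TABLE` (it is part of the standard Dynamixel X-series control table), you could read the motor temperatures the same way:

```python
# Read another control-table entry: one temperature value per motor
temperatures = follower_arm.read("Present_Temperature")
print(temperatures)
```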
Now let's try to enable torque in the follower arm:
```python
from lerobot.common.robot_devices.motors.dynamixel import TorqueMode
follower_arm.write("Torque_Enable", TorqueMode.ENABLED.value)  # enable torque on all follower motors
```
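With torque enabled, the follower arm can be commanded. As a preview of the teleoperation described in the introduction, here is a minimal sketch of the loop: the leader's present position is read and sent as the follower's goal position at a high frequency (the loop rate and stopping condition are illustrative):

```python
import time

try:
    while True:
        # Mirror the leader arm on the follower arm
        leader_pos = leader_arm.read("Present_Position")
        follower_arm.write("Goal_Position", leader_pos)
        time.sleep(1 / 200)  # aim for a high-frequency loop (~200 Hz here)
except KeyboardInterrupt:
    pass  # stop teleoperation with Ctrl+C
```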
**OpenCVCamera**
The `OpenCVCamera` class lets you efficiently record images from cameras. It relies on OpenCV (the `cv2` package) to communicate with them. Most cameras are compatible. For more info, see the [Video I/O with OpenCV Overview](https://docs.opencv.org/4.x/d0/da7/videoio_overview.html).
An `OpenCVCamera` instance requires a camera index (e.g. `OpenCVCamera(camera_index=0)`). If you only have one camera, such as a laptop webcam, its index is usually 0, but it can differ, and it may change when you reboot your computer or re-plug the camera. This behavior depends on your operating system.
To find the indices of your cameras, you can run our utility script, which saves a few frames from each detected camera.
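As a rough illustration of what such a probe does (this is not the repository's script; the index range and output paths below are arbitrary), you could scan indices with plain OpenCV:

```python
import cv2

# Try a few indices and save one frame per camera that responds,
# so you can tell which index corresponds to which physical camera.
for camera_index in range(4):  # increase the range if you have more devices
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    if ok:
        cv2.imwrite(f"camera_{camera_index}.png", frame)
        print(f"Camera found at index {camera_index}")
    cap.release()
```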
When an `OpenCVCamera` is instantiated without a specific config, the camera's default fps, width, height, and color_mode are used.
Example usage of the class:
```python
camera = OpenCVCamera(camera_index=0)
camera.connect()
color_image = camera.read()
# when done using the camera, consider disconnecting
camera.disconnect()
```
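You can also override the defaults at instantiation. A sketch, assuming `OpenCVCamera` accepts `fps`, `width`, and `height` arguments and that your camera supports the requested values:

```python
# Request a specific frame rate and resolution instead of the camera defaults
camera = OpenCVCamera(camera_index=0, fps=30, width=640, height=480)
camera.connect()
color_image = camera.read()
print(color_image.shape)  # expected to match the requested resolution
camera.disconnect()
```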
**Add to robot**
Note that when cameras are added to the robot, they are read asynchronously: a background thread continuously grabs frames, and the control loop simply fetches the latest available one. This way, slow camera I/O does not block the high-frequency communication of goal positions to the follower arm.
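For example, assuming the class exposes an `async_read()` method as in the repository's implementation, a non-blocking read looks like this:

```python
camera = OpenCVCamera(camera_index=0)
camera.connect()
# Returns the latest frame captured by the background thread, without blocking
color_image = camera.async_read()
camera.disconnect()
```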