mjlab


mjlab combines Isaac Lab's proven API with best-in-class MuJoCo physics to provide lightweight, modular abstractions for RL robotics research and sim-to-real deployment.

⚠️ BETA PREVIEW: mjlab is in active development. Expect breaking changes and missing features during the beta phase. There is no stable release yet. The PyPI package is only a snapshot; for the latest fixes and improvements, install from source or Git.


Quick Start

mjlab requires an NVIDIA GPU for training (via MuJoCo Warp). macOS is supported only for evaluation, which is significantly slower.
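Unsure whether your machine qualifies? A quick check, assuming PyTorch is available in your environment (mjlab's RL training stack is torch-based):

# check_cuda.py: verify a CUDA-capable GPU is visible before training.
import torch

if torch.cuda.is_available():
    print("CUDA device:", torch.cuda.get_device_name(0))
else:
    print("No CUDA device found; training is unavailable (evaluation may still run).")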

# Install uv if you haven't already
curl -LsSf https://astral.sh/uv/install.sh | sh

Run the demo (no installation needed):

uvx --from mjlab --with "mujoco-warp @ git+https://github.com/google-deepmind/mujoco_warp@486642c3fa262a989b482e0e506716d5793d61a9" demo

This launches an interactive viewer with a pre-trained Unitree G1 agent tracking a reference dance motion in MuJoCo Warp.

❓ Having issues? See the FAQ.


Installation

From source (recommended during beta):

git clone https://github.com/mujocolab/mjlab.git
cd mjlab
uv run demo

From PyPI (beta snapshot):

uv add mjlab "mujoco-warp @ git+https://github.com/google-deepmind/mujoco_warp@486642c3fa262a989b482e0e506716d5793d61a9"

For full setup instructions, see the Installation Guide.
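Either way, you can sanity-check the install with the standard library; nothing here assumes mjlab internals:

# verify_install.py: confirm mjlab is importable and report its version.
import importlib.metadata

import mjlab  # raises ImportError if installation failed

print("mjlab version:", importlib.metadata.version("mjlab"))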


Training Examples

1. Velocity Tracking

Train a Unitree G1 humanoid to follow velocity commands on flat terrain (MUJOCO_GL=egl selects MuJoCo's EGL backend for headless, off-screen rendering):

MUJOCO_GL=egl uv run train Mjlab-Velocity-Flat-Unitree-G1 --env.scene.num-envs 4096

Evaluate a policy while training (fetches latest checkpoint from Weights & Biases):

uv run play --task Mjlab-Velocity-Flat-Unitree-G1-Play --wandb-run-path your-org/mjlab/run-id

2. Motion Imitation

Train a Unitree G1 to mimic reference motions. mjlab uses WandB to manage reference motion datasets:

  1. Create a registry collection named Motions in your WandB workspace.

  2. Set your WandB entity:

     export WANDB_ENTITY=your-organization-name

  3. Process and upload motion files (a sketch of the resampling step follows the note below):

     MUJOCO_GL=egl uv run src/mjlab/scripts/csv_to_npz.py \
       --input-file /path/to/motion.csv \
       --output-name motion_name \
       --input-fps 30 \
       --output-fps 50 \
       --render  # Optional: generates a preview video

Note: For detailed motion preprocessing instructions, see the BeyondMimic documentation.
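For intuition, here is a minimal sketch of the kind of fps resampling csv_to_npz.py performs. The CSV layout (one row per frame, one column per joint) and the npz key names are assumptions for illustration, not mjlab's actual format:

# resample_motion.py: illustrative sketch of resampling a motion clip
# from --input-fps to --output-fps via per-channel linear interpolation.
import numpy as np

def resample(frames: np.ndarray, input_fps: float, output_fps: float) -> np.ndarray:
    """Linearly interpolate (num_frames, num_channels) motion data to a new rate."""
    t_in = np.arange(len(frames)) / input_fps             # source timestamps (s)
    duration = t_in[-1]                                   # clip length in seconds
    t_out = np.arange(0.0, duration, 1.0 / output_fps)    # target timestamps (s)
    # Interpolate each channel independently.
    return np.stack(
        [np.interp(t_out, t_in, frames[:, j]) for j in range(frames.shape[1])],
        axis=1,
    )

if __name__ == "__main__":
    motion = np.loadtxt("motion.csv", delimiter=",")      # hypothetical input layout
    np.savez("motion_name.npz", qpos=resample(motion, 30, 50), fps=50)  # key names assumed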

Train and Play

MUJOCO_GL=egl uv run train Mjlab-Tracking-Flat-Unitree-G1 --registry-name your-org/motions/motion-name --env.scene.num-envs 4096

uv run play --task Mjlab-Tracking-Flat-Unitree-G1-Play --wandb-run-path your-org/mjlab/run-id
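To inspect a registered motion artifact outside the training scripts, the standard wandb API suffices. The artifact path below mirrors the --registry-name flag above; your actual entity, project, and alias may differ:

# fetch_motion.py: download a registered motion artifact for local inspection.
import wandb

api = wandb.Api()
# Path format is "entity/project/artifact-name:alias"; adjust to your registry.
artifact = api.artifact("your-org/motions/motion-name:latest")
local_dir = artifact.download()
print("Motion files downloaded to", local_dir)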


Development

Run tests:

make test

Format code:

uvx pre-commit install
make format

License

mjlab is licensed under the Apache License, Version 2.0.

Third-Party Code

The third_party/ directory contains files from external projects, each with its own license.

When distributing or modifying mjlab, comply with:

  1. The Apache-2.0 license for mjlab’s original code
  2. The respective licenses in third_party/

Acknowledgments

mjlab wouldn't exist without the excellent work of the Isaac Lab team, whose API design and abstractions mjlab builds upon.

Thanks to the MuJoCo Warp team, especially Erik Frey and Taylor Howell, for answering our questions, giving helpful feedback, and implementing countless features based on our requests.
