PhyGile: Physics-Prefix Guided Motion Generation for Agile General Humanoid Motion Tracking

Jiacheng Bao1, 2 * Haoran Yang2, 3 * Yucheng Xin2, 4 * Junhong Liu1 Yuecheng Xu5
Han Liang6 Pengfei Han2 Xiaoguang Ma7 Dong Wang2 Bin Zhao1, 2
1Northwestern Polytechnical University 2Shanghai AI Laboratory
3University of Science and Technology of China 4Tsinghua University
5Fudan University 6ByteDance 7Northeastern University
*Equal contributions.
PhyGile Teaser

Abstract

Humanoid robots are expected to execute agile and expressive whole-body motions in real-world settings. Existing text-to-motion generation models are predominantly trained on captured human motion datasets, whose priors reflect human biomechanics, actuation, mass distribution, and contact strategies. When such motions are directly retargeted to humanoid robots, the resulting trajectories may satisfy geometric constraints (e.g., joint limits, pose continuity) and appear kinematically reasonable, yet they frequently violate the physical feasibility required for real-world execution.
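To make the kinematics-versus-dynamics gap concrete, here is a minimal sketch (ours, not part of the PhyGile pipeline) contrasting a purely kinematic check with a crude dynamic-plausibility proxy. All limits and the 50 Hz frame rate are illustrative assumptions, not the robot's actual specifications.

```python
import numpy as np

# Illustrative limits only; a real robot exposes per-joint limits and torque bounds.
DT = 1.0 / 50.0            # assumed 50 Hz motion clip
Q_MIN, Q_MAX = -2.6, 2.6   # hypothetical joint-position limits [rad]
QD_MAX = 12.0              # hypothetical joint-velocity limit [rad/s]
QDD_MAX = 80.0             # hypothetical joint-acceleration limit [rad/s^2]

def kinematically_valid(q: np.ndarray) -> bool:
    """q: (T, n_joints) joint positions. Checks position limits only."""
    return bool(np.all((q >= Q_MIN) & (q <= Q_MAX)))

def dynamically_plausible(q: np.ndarray) -> bool:
    """Finite-difference proxy: bounded velocities and accelerations.
    A faithful check would run inverse dynamics against actuator torque
    limits and verify contact forces, which this sketch omits."""
    qd = np.diff(q, axis=0) / DT
    qdd = np.diff(qd, axis=0) / DT
    return bool(np.all(np.abs(qd) <= QD_MAX) and np.all(np.abs(qdd) <= QDD_MAX))
```

A retargeted clip can pass the first check while failing the second, which is exactly the failure mode motivating PhyGile.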

To address this gap, we present PhyGile, a unified framework that closes the loop between robot-native motion generation and General Motion Tracking (GMT). At inference time, PhyGile performs physics-prefix-guided motion generation directly in a 262-dimensional robot-native skeletal space, thereby eliminating inference-time retargeting artifacts and reducing generation–execution discrepancies.
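As an intuition for prefix guidance, the following inpainting-style sampling sketch clamps validated prefix frames at every denoising step; `denoiser` and its call signature are hypothetical stand-ins, not the actual PhyGile sampler.

```python
import torch

def sample_with_physics_prefix(denoiser, prefix, total_len, n_steps=50):
    """prefix: (T_pre, 262) physically validated robot-native frames.
    The prefix is re-imposed after every reverse-diffusion step, so the
    model only generates the continuation. `denoiser(x, step)` is assumed
    to perform one denoising step on the (total_len, 262) sequence."""
    x = torch.randn(total_len, 262)   # 262-D robot-native frames
    t_pre = prefix.shape[0]
    for step in reversed(range(n_steps)):
        x = denoiser(x, step)         # one reverse-diffusion step (hypothetical API)
        x[:t_pre] = prefix            # clamp the physics-guided prefix
    return x
```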

Before physics-prefix adaptation, we train the GMT controller with a curriculum-based mixture-of-experts scheme, followed by post-training on unlabeled motion data, to improve robustness across large-scale robot motions. During physics-prefix adaptation, the GMT controller is further fine-tuned on generated objectives under physics-derived prefixes, enabling agile and stable execution of complex motions on real robots.
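For readers unfamiliar with curriculum-constrained mixture-of-experts routing, the sketch below shows one plausible realization that masks the gate to the expert subset unlocked at the current curriculum stage; the exact PhyGile routing rule is not reproduced here.

```python
import torch
import torch.nn.functional as F

def curriculum_routing(gate_logits: torch.Tensor,
                       allowed: torch.Tensor) -> torch.Tensor:
    """gate_logits: (B, n_experts); allowed: bool mask (n_experts,) marking
    the experts unlocked at the current curriculum stage. Masked experts
    receive zero routing weight, encouraging early specialization before a
    global soft post-training stage relaxes the mask."""
    masked = gate_logits.masked_fill(~allowed, float("-inf"))
    return F.softmax(masked, dim=-1)

# Example: 4 experts, only the first 2 unlocked at this stage.
weights = curriculum_routing(torch.randn(8, 4),
                             torch.tensor([True, True, False, False]))
```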

Extensive offline and real-robot experiments demonstrate that PhyGile expands the frontier of text-driven humanoid control, enabling stable tracking of agile, highly difficult whole-body motions well beyond the walking and low-dynamic motions typically achieved by prior methods.

Video

Method

PhyGile Method

Overview of PhyGile. (Left) GMT: A two-stage MoE tracker is first trained with curriculum-constrained routing to induce expert specialization, followed by global soft post-training with dynamic expert expansion to absorb persistently difficult motions. (Right) Generation with a Diffusion Policy: A TP-MoE–conditioned robot-native diffusion model generates 262-dimensional robot motion sequences from text. (Center) Motion Generation Fine-tuning: Executable motion prefixes are concatenated with newly generated 1-second continuations and validated by the pretrained GMT tracker. Closed-loop simulation refinement further enforces dynamic feasibility and improves consistency between generated and trackable motions; the fine-tuned GMT policy is then deployed on real robots.
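The center loop can be summarized in pseudocode; `generate_1s` and `gmt_rollout` are hypothetical handles for the diffusion model and the simulated GMT tracker, and the retry budget and error tolerance are illustrative.

```python
import numpy as np

def closed_loop_refine(generate_1s, gmt_rollout, prefix,
                       horizon_s=5, err_tol=0.05, max_attempts=8):
    """prefix: (T_pre, 262) executable frames. Each iteration appends a
    newly generated 1-second continuation and keeps it only if the
    pretrained GMT tracker can follow the concatenated motion in
    simulation (mean tracking error below err_tol)."""
    motion = prefix
    for _ in range(horizon_s):
        for _ in range(max_attempts):        # resample until trackable
            candidate = np.concatenate([motion, generate_1s(motion)], axis=0)
            if gmt_rollout(candidate) <= err_tol:
                motion = candidate
                break
        else:
            return motion                    # no feasible continuation found
    return motion
```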

Results

Comparison of generated motions, fine-tuned motions, and real robot deployment.

Generated Motion
Fine-tuned Motion

Breakdance

A person performs breakdance moves, with a spin.

Real Robot
Generated Motion
Fine-tuned Motion

Cartwheel

A person does a cartwheel forward and turns to cartwheel backwards.

Real Robot
Generated Motion
Fine-tuned Motion

High Kick

A person performs a high kick.

Real Robot
Generated Motion
Fine-tuned Motion

180° Spin Jump

A person jumps and spins 180 degrees in the air before landing.

Real Robot
Generated Motion
Fine-tuned Motion

360° Spin Jump

A person jumps and spins a full 360 degrees in the air before landing.

Real Robot

More Real Robot Results

Diverse agile motions performed by the humanoid robot.

BibTeX

Coming soon.