From Motion to Behavior: Hierarchical Modeling of Humanoid Generative Behavior Control

Human motion generative modeling, or synthesis, aims to characterize the complicated human motions of daily activities in diverse real-world environments. However, current research predominantly focuses on either low-level, short-period motions or high-level action planning, without accounting for the hierarchical, goal-oriented nature of human activities. In this work, we take a step forward from human motion generation to human behavior modeling, inspired by cognitive science. We present a unified framework, dubbed Generative Behavior Control (GBC), to model diverse human motions driven by various high-level intentions by aligning motions with hierarchical behavior plans generated by large language models (LLMs). Our insight is that human motions can be jointly controlled through task and motion planning, as in robotics, but guided by LLMs to achieve improved motion diversity and physical fidelity. Meanwhile, to overcome a key limitation of existing benchmarks, i.e., the lack of behavioral plans, we propose the GBC-100K dataset, annotated at a hierarchical granularity with semantic and motion plans driven by target goals. Our experiments demonstrate that, when trained on GBC-100K, GBC can generate more diverse and purposeful high-quality human motions with 10× longer horizons compared with existing methods, laying a foundation for future research on behavioral modeling of human motions. Our dataset and source code will be made publicly available.
@article{zhang2025_2506.00043,
  title   = {From Motion to Behavior: Hierarchical Modeling of Humanoid Generative Behavior Control},
  author  = {Jusheng Zhang and Jinzhou Tang and Sidi Liu and Mingyan Li and Sheng Zhang and Jian Wang and Keze Wang},
  journal = {arXiv preprint arXiv:2506.00043},
  year    = {2025}
}