Stabilizing Humanoid Robot Trajectory Generation via Physics-Informed Learning and Control-Informed Steering
(Submission video: `iros_2025_submission_video_github.mp4`)
We recommend using conda/mamba for this installation.
First, create and activate a conda environment with the necessary dependencies.
```sh
mamba create -n pi_trajectory_gen -c conda-forge bipedal-locomotion-framework jaxsim pytorch tensorboard adam-robotics jax2torch urdf-parser-py h5py ergocub-models meshcat-python
mamba activate pi_trajectory_gen
```
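As an optional sanity check (assuming the standard Python import names for these packages), you can verify that the core dependencies resolve inside the environment:

```sh
python -c "import torch, jaxsim; print('environment OK')"
```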
Next, within the conda environment, clone and install this repo:
git clone /~https://github.com/ami-iit/paper_delia_2025_iros_physics-informed_trajectory_generation.git
cd paper_delia_2025_iros_physics-informed_trajectory_generation
pip install .
You will also need to download the datasets necessary for running the code into this repo. This may take some time depending on your internet connection.
```sh
git clone git@hf.co:datasets/evelyd/paper_delia_2025_iros_physics-informed_trajectory_generation_dataset datasets/
cd datasets/
unzip D2.zip
```
Now you're ready to run the code.
To replicate our results, several scripts are available in the `scripts` folder.
```sh
cd ../scripts/
```
The raw dataset contains 5 subsets: forward walking, backward walking, side walking, diagonal walking, and mixed walking. The raw data is recorded on a human and therefore needs to be retargeted onto the robot model. For example, the mirrored version of the forward walking subset is retargeted using:
```sh
python retargeting.py --KFWBGR --filename ../datasets/D2/1_forward_normal_step/data.log
```
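The other subsets can be retargeted in the same way by pointing `--filename` at their `data.log` files. A hypothetical loop over all subsets (this assumes each subset directory under `../datasets/D2/` contains a `data.log` at its top level, which may not match the actual dataset layout):

```sh
for subset in ../datasets/D2/*/; do
    python retargeting.py --KFWBGR --filename "${subset}data.log"
done
```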
With the retargeted data we can extract the features (data and labels) for training, for example:
```sh
python features_extraction.py --dataset D2 --portion 1
```
With the extracted features we can train the model. The weight given to the physics-informed (PI) loss component can be specified as an argument.
```sh
python training.py --pi_weight 1.0
```
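For intuition, `--pi_weight` scales the contribution of the physics-informed term relative to the standard data-fitting term. The following is a minimal, hypothetical sketch of such a weighted loss, not the repository's actual implementation; names like `physics_residual` are illustrative:

```python
import torch
import torch.nn.functional as F

def physics_informed_loss(predictions, targets, physics_residual, pi_weight=1.0):
    """Weighted sum of a data-fitting term and a physics-informed term.

    `physics_residual` is assumed to quantify how much the predicted
    trajectory violates the physics constraints; `pi_weight` plays the
    role of the --pi_weight command-line argument.
    """
    data_loss = F.mse_loss(predictions, targets)   # standard supervised term
    pi_loss = (physics_residual ** 2).mean()       # penalize constraint violation
    return data_loss + pi_weight * pi_loss
```

Since TensorBoard is among the installed dependencies, training progress is presumably logged there; point `tensorboard --logdir` at the directory the training script writes its logs to.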
With a trained model, we can give dummy inputs and generate a trajectory based on the model's predictions.
```sh
python trajectory_generation.py
```
The various parameters used for trajectory generation can be tuned in the files in the `config` folder. For example, in `config_mann.toml`, the trained model can be changed by editing the `onnx_model_path` parameter, and the correction block gains can be updated by changing the `linear_pid_gain` and `rotational_pid_gain` parameters.
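For orientation, the relevant entries in `config_mann.toml` might look roughly like the following. This is a hypothetical excerpt; the actual file structure, section names, and default values may differ:

```toml
# Hypothetical excerpt of config_mann.toml; values are placeholders.
onnx_model_path = "path/to/trained_model.onnx"  # swap in a different trained model here
linear_pid_gain = 1.0                            # gain of the linear correction block
rotational_pid_gain = 0.5                        # gain of the rotational correction block
```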
We also include a script to reproduce some of the plots from our paper.
```sh
python show_selected_plots.py
```
This repository is maintained by @evelyd.