iMotion-LLM

Instruction-Conditioned Trajectory Generation

WACV 2026

Abdulwahab Felemban, Nussair Hroub, Jian Ding, Eslam Abdelrahman, Xiaoqian Shen, Abduallah Mohamed, Mohamed Elhoseiny
KAUST

Abstract

iMotion-LLM is a large language model integrated with trajectory prediction modules for interactive motion generation. It generates feasible, safety-aligned trajectories from textual instructions, enabling context-aware driving behavior and interpretable reasoning about instruction feasibility and safety.

The paper introduces two datasets: InstructWaymo, a direction-conditioned extension of the Waymo Open Motion Dataset, and Open-Vocabulary InstructNuPlan, a safety-oriented benchmark with natural-language instructions and justifications.

Release Status

  • The cleaned public codebase is available now.
  • The supported public entry points are documented under scripts/, configs/release/, and docs/setup/.
  • Most paper experiment code is present, but full public reproducibility still requires released checkpoints, evaluation manifests, and the generated Open-Vocabulary InstructNuPlan prompt bundle.
  • The exact paper-table coverage and missing artifacts are tracked in the linked experiment-status and artifact-manifest documents.

Users Download

  • Waymo raw data
  • nuPlan raw data and maps
  • Base LLM weights such as Llama 2 7B

We Release

  • Research checkpoints
  • Waymo evaluation manifests
  • GF-MTR mapping metadata
  • Generated Open-Vocabulary InstructNuPlan prompt bundle
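To illustrate the split between user-downloaded assets and released artifacts, here is a minimal layout sketch. The directory names (data/, checkpoints/, prompts/) are assumptions for illustration only, not the repository's documented paths; consult docs/setup/ for the actual expected layout.

```shell
# Hypothetical staging layout (directory names are illustrative, not official):
#   data/waymo     <- user-downloaded Waymo raw data
#   data/nuplan    <- user-downloaded nuPlan raw data and maps
#   checkpoints/   <- released research checkpoints
#   prompts/       <- released generated InstructNuPlan prompt bundle
mkdir -p data/waymo data/nuplan checkpoints prompts
ls -d data/waymo data/nuplan checkpoints prompts
```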

BibTeX

@inproceedings{Felemban_2026_WACV,
  author    = {Felemban, Abdulwahab and Hroub, Nussair and Ding, Jian and Abdelrahman, Eslam and Shen, Xiaoqian and Mohamed, Abduallah and Elhoseiny, Mohamed},
  title     = {iMotion-LLM: Instruction-Conditioned Trajectory Generation},
  booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
  month     = {March},
  year      = {2026},
  pages     = {2710-2720}
}

Acknowledgement

This website template is adapted from Nerfies, licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.