OmniNWM: Omniscient Driving Navigation World Models

1 SJTU, 2 EIT (Ningbo), 3 PhiGent, 4 NUS, 5 THU
Teaser Image

We introduce OmniNWM, a comprehensive navigation world model that jointly forecasts panoramic RGB, semantic, metric-depth, and 3D semantic occupancy videos, together with future planning trajectories, for autonomous driving.

Abstract

Autonomous driving world models are expected to work effectively across three core dimensions: state, action, and reward. However, existing methods are typically restricted to fragmented modality modeling, suffer from short-horizon drift and imprecise action control, and lack intrinsic mechanisms for policy evaluation. In this paper, we introduce OmniNWM, an Omniscient panoramic Navigation World Model that addresses all three dimensions within a single probabilistic framework. For State, OmniNWM generates panoramic videos of RGB, semantics, metric depth, and 3D occupancy, ensuring pixel-level alignment across modalities through joint distribution modeling. To mitigate autoregressive exposure bias, we propose a structured panoramic forcing strategy that stabilizes long-horizon generation via stochastic manifold thickening. For Action, we introduce a canonical geometric action encoding based on normalized panoramic Plücker ray-maps. This representation decouples motion dynamics from sensor intrinsics, enabling precise, zero-shot trajectory control across heterogeneous datasets and camera configurations. For Reward, we derive intrinsic occupancy-grounded dense rewards directly from the generated 3D volumes, establishing a reliable closed-loop simulation cycle for evaluating diverse planning agents. Extensive experiments demonstrate that OmniNWM achieves state-of-the-art performance in generation fidelity and control precision, with strong zero-shot robustness to novel scenes on NuPlan and on in-house datasets with distinct camera rigs.
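The Plücker ray-map encoding mentioned above can be illustrated with a minimal sketch. The snippet below shows the generic per-pixel Plücker parameterization of camera rays (unit direction d plus moment m = o × d, stacked as a 6-channel map) for a pinhole camera; it is not the paper's exact normalized panoramic variant, and the function name and the `K` / `c2w` parameters are illustrative assumptions.

```python
import numpy as np

def plucker_ray_map(K, c2w, H, W):
    """Per-pixel Plücker ray-map of shape (H, W, 6).

    Each pixel stores (d, m): the unit ray direction d in world frame
    and the moment m = o x d, where o is the camera center. Because
    (d, m) describes the ray itself, the map is independent of how the
    image plane samples it, which is what decouples motion from intrinsics.
    """
    # Pixel centers on a regular grid
    u, v = np.meshgrid(np.arange(W) + 0.5, np.arange(H) + 0.5)
    pix = np.stack([u, v, np.ones_like(u)], axis=-1)        # (H, W, 3)

    # Back-project through the inverse intrinsics, then rotate to world frame
    dirs_cam = pix @ np.linalg.inv(K).T                      # (H, W, 3)
    R, o = c2w[:3, :3], c2w[:3, 3]
    d = dirs_cam @ R.T
    d /= np.linalg.norm(d, axis=-1, keepdims=True)           # unit directions

    # Moment of each ray about the origin: m = o x d
    m = np.cross(np.broadcast_to(o, d.shape), d)
    return np.concatenate([d, m], axis=-1)                   # (H, W, 6)
```

Conditioning the video model on such ray-maps (one per panoramic view, per frame) expresses the ego trajectory purely geometrically, so the same action representation transfers across camera rigs with different intrinsics and layouts.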


Long-term Navigation

Examples of Long-term Generative Navigation with VLA Future Planning (Beyond GT Length)


Long-term Navigation Scenario 1

Long-term Navigation Scenario 2

Long-term Navigation Scenario 3

Long-term Navigation Scenario 4

Long-term Navigation Scenario 5


Trajectory Control

Examples of Out-of-distribution Trajectory Control with the Same Conditional Frame

Trajectory Control Scenario 1

Trajectory Control Scenario 2

Trajectory Control Scenario 3

Trajectory Control Scenario 4

Trajectory Control Scenario 5

Trajectory Control: Reversing


3D Semantic Occupancy

Examples of Diverse Generated Scenarios with 3D Semantic Occupancy

3D Semantic Occupancy Scenario 1

3D Semantic Occupancy Scenario 2

3D Semantic Occupancy Scenario 3


Zero-shot Generalization

Examples of Zero-shot Generalization across Different Datasets and Camera View Configurations


NuPlan 3 Camera Views

NuPlan 6 Camera Views

In-House Collected Dataset Scenario 1

In-House Collected Dataset Scenario 2


Diverse Generation

Examples of Diverse Generated Samples with Pixel-level Aligned Panoramic RGB, Semantic, and Depth Videos

Diverse Generation Scenario 1

Diverse Generation Scenario 2

Diverse Generation Scenario 3

Diverse Generation Scenario 4

Diverse Generation Scenario 5

Diverse Generation Scenario 6

Diverse Generation Scenario 7

Diverse Generation Scenario 8

Diverse Generation Scenario 9

Diverse Generation Scenario 10

Diverse Generation Scenario 11

Diverse Generation Scenario 12

Diverse Generation Scenario 13

Diverse Generation Scenario 14

Diverse Generation Scenario 15

Diverse Generation Scenario 16

Diverse Generation Scenario 17

Diverse Generation Scenario 18

BibTeX

@article{li2025omninwm,
  title={OmniNWM: Omniscient Driving Navigation World Models},
  author={Li, Bohan and Ma, Zhuang and Du, Dalong and Peng, Baorui and Liang, Zhujin and Liu, Zhenqiang and Ma, Chao and Jin, Yueming and Zhao, Hao and Zeng, Wenjun and others},
  journal={arXiv preprint arXiv:2510.18313},
  year={2025}
}