World4Drive: End-to-End Autonomous Driving via Intention-aware Physical Latent World Model
2025 · Open Access
DOI: https://doi.org/10.48550/arxiv.2507.00603
End-to-end autonomous driving directly generates planning trajectories from raw sensor data, yet it typically relies on costly perception supervision to extract scene information. A critical research challenge arises: constructing an informative driving world model to enable perception annotation-free, end-to-end planning via self-supervised learning. In this paper, we present World4Drive, an end-to-end autonomous driving framework that employs vision foundation models to build latent world models for generating and evaluating multi-modal planning trajectories. Specifically, World4Drive first extracts scene features, including driving intention and world latent representations enriched with spatial-semantic priors provided by vision foundation models. It then generates multi-modal planning trajectories based on current scene features and driving intentions, and predicts multiple intention-driven future states within the latent space. Finally, it introduces a world model selector module to evaluate and select the best trajectory. We achieve perception annotation-free, end-to-end planning through self-supervised alignment between actual future observations and predicted observations reconstructed from the latent space. World4Drive achieves state-of-the-art performance without manual perception annotations on both the open-loop nuScenes and closed-loop NavSim benchmarks, demonstrating an 18.1% relative reduction in L2 error, a 46.7% lower collision rate, and 3.75× faster training convergence. Code will be released at https://github.com/ucaszyp/World4Drive.
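The abstract describes a pipeline: encode the scene into a world latent, propose one candidate trajectory per driving intention, roll each candidate forward with the world model, and select the candidate whose predicted future latent best matches the observed future. A minimal toy sketch of that selection loop follows; this is not the paper's implementation, and every name, weight, and dimension (`LATENT_DIM`, `N_INTENTIONS`, `HORIZON`, the random stand-in projections) is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (illustrative, not taken from the paper):
LATENT_DIM = 64    # world latent dimension
N_INTENTIONS = 3   # e.g. keep-lane / turn-left / turn-right
HORIZON = 6        # planned waypoints, each an (x, y) offset

# Fixed random projections stand in for learned network weights.
W_ENC = rng.standard_normal((256, LATENT_DIM)) / 16.0
W_PLAN = rng.standard_normal((N_INTENTIONS, LATENT_DIM, HORIZON * 2)) * 0.01
W_ROLL = rng.standard_normal((HORIZON * 2, LATENT_DIM)) * 0.01

def encode_scene(feats):
    """Stand-in for the vision-foundation-model encoder: project raw
    sensor features into the world latent space."""
    return feats @ W_ENC

def propose_trajectories(latent):
    """One candidate trajectory per driving intention (multi-modal planning)."""
    return np.einsum("d,idk->ik", latent, W_PLAN).reshape(N_INTENTIONS, HORIZON, 2)

def rollout(latent, traj):
    """Intention-driven world model: predict the future world latent from
    the current latent and a candidate trajectory."""
    return latent + traj.reshape(-1) @ W_ROLL

def select_trajectory(latent, future_latent):
    """World model selector: score each candidate by how closely its
    predicted future latent matches the encoded observed future and pick
    the closest; this same distance is the self-supervised alignment
    signal that replaces perception annotations."""
    candidates = propose_trajectories(latent)
    errors = [np.linalg.norm(rollout(latent, t) - future_latent) for t in candidates]
    best = int(np.argmin(errors))
    return candidates[best], best

# Toy usage with random stand-ins for current and next-frame features.
cur = encode_scene(rng.standard_normal(256))
nxt = encode_scene(rng.standard_normal(256))
traj, idx = select_trajectory(cur, nxt)
print(traj.shape, idx)
```

In the actual method the rollout and selector are learned modules trained against real future observations; here fixed random projections merely make the data flow concrete.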
- Type: preprint
- Language: en
- Landing Page: http://arxiv.org/abs/2507.00603
- PDF: https://arxiv.org/pdf/2507.00603
- OA Status: green
- OpenAlex ID: https://openalex.org/W4416887510
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4416887510 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.48550/arxiv.2507.00603 (Digital Object Identifier)
- Title: World4Drive: End-to-End Autonomous Driving via Intention-aware Physical Latent World Model
- Type: preprint (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2025
- Publication date: 2025-07-01
- Authors: Zebin Xing, Qichao Zhang, Yuhang Zheng, Yinfeng Gao, Zhongpu Xia, Peng Jia (in listed order)
- Landing page: https://arxiv.org/abs/2507.00603 (publisher landing page)
- PDF URL: https://arxiv.org/pdf/2507.00603 (direct link to full-text PDF)
- Open access: yes (a free full text is available)
- OA status: green (per OpenAlex)
- OA URL: https://arxiv.org/pdf/2507.00603
- Cited by: 0 (total citation count in OpenAlex)
Full payload
| id | https://openalex.org/W4416887510 |
|---|---|
| doi | https://doi.org/10.48550/arxiv.2507.00603 |
| ids.doi | https://doi.org/10.48550/arxiv.2507.00603 |
| ids.openalex | https://openalex.org/W4416887510 |
| fwci | |
| type | preprint |
| title | World4Drive: End-to-End Autonomous Driving via Intention-aware Physical Latent World Model |
| biblio.issue | |
| biblio.volume | |
| biblio.last_page | |
| biblio.first_page | |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| language | en |
| locations[0].id | pmh:oai:arXiv.org:2507.00603 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S4306400194 |
| locations[0].source.issn | |
| locations[0].source.type | repository |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | |
| locations[0].source.is_core | False |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | arXiv (Cornell University) |
| locations[0].source.host_organization | https://openalex.org/I205783295 |
| locations[0].source.host_organization_name | Cornell University |
| locations[0].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[0].license | |
| locations[0].pdf_url | https://arxiv.org/pdf/2507.00603 |
| locations[0].version | submittedVersion |
| locations[0].raw_type | text |
| locations[0].license_id | |
| locations[0].is_accepted | False |
| locations[0].is_published | False |
| locations[0].raw_source_name | |
| locations[0].landing_page_url | http://arxiv.org/abs/2507.00603 |
| locations[1].id | doi:10.48550/arxiv.2507.00603 |
| locations[1].is_oa | True |
| locations[1].source.id | https://openalex.org/S4306400194 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | True |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | arXiv (Cornell University) |
| locations[1].source.host_organization | https://openalex.org/I205783295 |
| locations[1].source.host_organization_name | Cornell University |
| locations[1].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[1].license | |
| locations[1].pdf_url | |
| locations[1].version | |
| locations[1].raw_type | article |
| locations[1].license_id | |
| locations[1].is_accepted | False |
| locations[1].is_published | |
| locations[1].raw_source_name | |
| locations[1].landing_page_url | https://doi.org/10.48550/arxiv.2507.00603 |
| indexed_in | arxiv, datacite |
| authorships[0].author.id | https://openalex.org/A5100309450 |
| authorships[0].author.orcid | |
| authorships[0].author.display_name | Zebin Xing |
| authorships[0].author_position | middle |
| authorships[0].raw_author_name | Xing, Zebin |
| authorships[0].is_corresponding | False |
| authorships[1].author.id | https://openalex.org/A5049454999 |
| authorships[1].author.orcid | https://orcid.org/0000-0001-9747-391X |
| authorships[1].author.display_name | Qichao Zhang |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Zhang, Qichao |
| authorships[1].is_corresponding | False |
| authorships[2].author.id | https://openalex.org/A5030295721 |
| authorships[2].author.orcid | https://orcid.org/0000-0001-9628-1940 |
| authorships[2].author.display_name | Yuhang Zheng |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | Zheng, Yuhang |
| authorships[2].is_corresponding | False |
| authorships[3].author.id | https://openalex.org/A5074076701 |
| authorships[3].author.orcid | https://orcid.org/0000-0002-3513-1380 |
| authorships[3].author.display_name | Yinfeng Gao |
| authorships[3].author_position | last |
| authorships[3].raw_author_name | Gao, Yinfeng |
| authorships[3].is_corresponding | False |
| authorships[4].author.id | https://openalex.org/A5065256108 |
| authorships[4].author.orcid | https://orcid.org/0009-0003-4251-6849 |
| authorships[4].author.display_name | Zhongpu Xia |
| authorships[4].author_position | middle |
| authorships[4].raw_author_name | Xia, Zhongpu |
| authorships[4].is_corresponding | False |
| authorships[5].author.id | https://openalex.org/A5100550666 |
| authorships[5].author.orcid | https://orcid.org/0009-0003-7851-0609 |
| authorships[5].author.display_name | Peng Jia |
| authorships[5].author_position | middle |
| authorships[5].raw_author_name | Jia, Peng |
| authorships[5].is_corresponding | False |
| has_content.pdf | True |
| has_content.grobid_xml | True |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://arxiv.org/pdf/2507.00603 |
| open_access.oa_status | green |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | World4Drive: End-to-End Autonomous Driving via Intention-aware Physical Latent World Model |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-12-02T09:50:17.773051 |
| primary_topic | |
| cited_by_count | 0 |
| locations_count | 2 |
| best_oa_location.id | pmh:oai:arXiv.org:2507.00603 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S4306400194 |
| best_oa_location.source.issn | |
| best_oa_location.source.type | repository |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | |
| best_oa_location.source.is_core | False |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | arXiv (Cornell University) |
| best_oa_location.source.host_organization | https://openalex.org/I205783295 |
| best_oa_location.source.host_organization_name | Cornell University |
| best_oa_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| best_oa_location.license | |
| best_oa_location.pdf_url | https://arxiv.org/pdf/2507.00603 |
| best_oa_location.version | submittedVersion |
| best_oa_location.raw_type | text |
| best_oa_location.license_id | |
| best_oa_location.is_accepted | False |
| best_oa_location.is_published | False |
| best_oa_location.raw_source_name | |
| best_oa_location.landing_page_url | http://arxiv.org/abs/2507.00603 |
| primary_location.id | pmh:oai:arXiv.org:2507.00603 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S4306400194 |
| primary_location.source.issn | |
| primary_location.source.type | repository |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | |
| primary_location.source.is_core | False |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | arXiv (Cornell University) |
| primary_location.source.host_organization | https://openalex.org/I205783295 |
| primary_location.source.host_organization_name | Cornell University |
| primary_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| primary_location.license | |
| primary_location.pdf_url | https://arxiv.org/pdf/2507.00603 |
| primary_location.version | submittedVersion |
| primary_location.raw_type | text |
| primary_location.license_id | |
| primary_location.is_accepted | False |
| primary_location.is_published | False |
| primary_location.raw_source_name | |
| primary_location.landing_page_url | http://arxiv.org/abs/2507.00603 |
| publication_date | 2025-07-01 |
| publication_year | 2025 |
| referenced_works_count | 0 |
| abstract_inverted_index | (token → word-position map; content duplicates the abstract above, omitted) |
| cited_by_percentile_year | |
| countries_distinct_count | 0 |
| institutions_distinct_count | 6 |
| citation_normalized_percentile |
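The `abstract_inverted_index` field in the payload stores the abstract as a map from each token to its word positions, which is how OpenAlex serves abstracts. Plain text can be recovered by sorting positions; a small sketch (the helper name `uninvert` and the sample dict are illustrative, but the input shape matches the field above):

```python
def uninvert(inverted):
    """Rebuild plain text from an OpenAlex-style abstract_inverted_index
    (a mapping of token -> list of word positions)."""
    by_pos = {pos: word for word, positions in inverted.items() for pos in positions}
    return " ".join(by_pos[i] for i in sorted(by_pos))

# Tiny sample in the same shape as the payload's field.
sample = {"End-to-end": [0], "autonomous": [1], "driving": [2]}
print(uninvert(sample))  # End-to-end autonomous driving
```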