VISAT: Benchmarking Adversarial and Distribution Shift Robustness in Traffic Sign Recognition with Visual Attributes
- 2025
- Open Access
- DOI: https://doi.org/10.48550/arxiv.2510.26833
We present VISAT, a novel open dataset and benchmarking suite for evaluating model robustness in the task of traffic sign recognition with the presence of visual attributes. Built upon the Mapillary Traffic Sign Dataset (MTSD), our dataset introduces two benchmarks that respectively emphasize robustness against adversarial attacks and distribution shifts. For our adversarial attack benchmark, we employ the state-of-the-art Projected Gradient Descent (PGD) method to generate adversarial inputs and evaluate their impact on popular models. Additionally, we investigate the effect of adversarial attacks on attribute-specific multi-task learning (MTL) networks, revealing spurious correlations among MTL tasks. The MTL networks leverage visual attributes (color, shape, symbol, and text) that we have created for each traffic sign in our dataset. For our distribution shift benchmark, we utilize ImageNet-C's realistic data corruption and natural variation techniques to perform evaluations on the robustness of both base and MTL models. Moreover, we further explore spurious correlations among MTL tasks through synthetic alterations of traffic sign colors using color quantization techniques. Our experiments focus on two major backbones, ResNet-152 and ViT-B/32, and compare the performance between base and MTL models. The VISAT dataset and benchmarking framework contribute to the understanding of model robustness for traffic sign recognition, shedding light on the challenges posed by adversarial attacks and distribution shifts. We believe this work will facilitate advancements in developing more robust models for real-world applications in autonomous driving and cyber-physical systems.
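The abstract's adversarial benchmark relies on Projected Gradient Descent (PGD), which iteratively takes signed-gradient ascent steps on the loss and projects the perturbed input back into an L-infinity ball around the original. As a minimal sketch of that idea (not the paper's actual attack code), the following toy example runs PGD against a hypothetical two-class linear softmax classifier with hand-derived gradients; the model, weights, and step sizes are illustrative assumptions:

```python
import numpy as np

def pgd_attack(W, b, x, y, eps=0.1, alpha=0.02, steps=20):
    """L-infinity PGD sketch against a linear softmax classifier
    (toy stand-in for the paper's networks): repeatedly ascend the
    cross-entropy loss via signed gradients, then project the
    perturbation back into the eps-ball around the clean input x."""
    x_adv = x.copy()
    for _ in range(steps):
        logits = W @ x_adv + b
        p = np.exp(logits - logits.max())
        p /= p.sum()                                   # softmax probabilities
        # gradient of cross-entropy w.r.t. x: W^T (p - onehot(y))
        p[y] -= 1.0
        grad = W.T @ p
        x_adv = x_adv + alpha * np.sign(grad)          # signed ascent step
        x_adv = x + np.clip(x_adv - x, -eps, eps)      # project to eps-ball
    return x_adv

# usage: PGD flips the predicted class of a toy 2-D input
W = np.array([[1.0, -1.0], [-1.0, 1.0]])
b = np.zeros(2)
x = np.array([0.6, 0.4])                # clean input, predicted class 0
x_adv = pgd_attack(W, b, x, y=0, eps=0.3)
```

The projection step is what distinguishes PGD from plain gradient ascent: the perturbation is clipped to the eps-ball after every step, so the attack stays within a bounded distance of the clean input.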
- Type: preprint
- Landing Page: http://arxiv.org/abs/2510.26833
- PDF: https://arxiv.org/pdf/2510.26833
- OA Status: green
- OpenAlex ID: https://openalex.org/W4415910178
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4415910178 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.48550/arxiv.2510.26833 (Digital Object Identifier)
- Title: VISAT: Benchmarking Adversarial and Distribution Shift Robustness in Traffic Sign Recognition with Visual Attributes
- Type: preprint (OpenAlex work type)
- Publication year: 2025
- Publication date: 2025-10-29
- Authors: Simon C.H. Yu, Philip L. H. Yu, Hongbo Zheng, Huajie Shao, Han Zhao, Lui Sha (in order)
- Landing page: https://arxiv.org/abs/2510.26833
- PDF URL: https://arxiv.org/pdf/2510.26833
- Open access: Yes
- OA status: green
- OA URL: https://arxiv.org/pdf/2510.26833
- Cited by: 0 (total citation count in OpenAlex)
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4415910178 |
| doi | https://doi.org/10.48550/arxiv.2510.26833 |
| ids.doi | https://doi.org/10.48550/arxiv.2510.26833 |
| ids.openalex | https://openalex.org/W4415910178 |
| fwci | 0.0 |
| type | preprint |
| title | VISAT: Benchmarking Adversarial and Distribution Shift Robustness in Traffic Sign Recognition with Visual Attributes |
| biblio.issue | |
| biblio.volume | |
| biblio.last_page | |
| biblio.first_page | |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| language | |
| locations[0].id | pmh:oai:arXiv.org:2510.26833 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S4306400194 |
| locations[0].source.issn | |
| locations[0].source.type | repository |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | |
| locations[0].source.is_core | False |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | arXiv (Cornell University) |
| locations[0].source.host_organization | https://openalex.org/I205783295 |
| locations[0].source.host_organization_name | Cornell University |
| locations[0].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[0].license | |
| locations[0].pdf_url | https://arxiv.org/pdf/2510.26833 |
| locations[0].version | submittedVersion |
| locations[0].raw_type | text |
| locations[0].license_id | |
| locations[0].is_accepted | False |
| locations[0].is_published | False |
| locations[0].raw_source_name | |
| locations[0].landing_page_url | http://arxiv.org/abs/2510.26833 |
| locations[1].id | doi:10.48550/arxiv.2510.26833 |
| locations[1].is_oa | True |
| locations[1].source.id | https://openalex.org/S4306400194 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | True |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | arXiv (Cornell University) |
| locations[1].source.host_organization | https://openalex.org/I205783295 |
| locations[1].source.host_organization_name | Cornell University |
| locations[1].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[1].license | |
| locations[1].pdf_url | |
| locations[1].version | |
| locations[1].raw_type | article |
| locations[1].license_id | |
| locations[1].is_accepted | False |
| locations[1].is_published | |
| locations[1].raw_source_name | |
| locations[1].landing_page_url | https://doi.org/10.48550/arxiv.2510.26833 |
| indexed_in | arxiv, datacite |
| authorships[0].author.id | https://openalex.org/A5040190424 |
| authorships[0].author.orcid | https://orcid.org/0000-0002-8715-5026 |
| authorships[0].author.display_name | Simon C.H. Yu |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Yu, Simon |
| authorships[0].is_corresponding | False |
| authorships[1].author.id | https://openalex.org/A5018396769 |
| authorships[1].author.orcid | https://orcid.org/0000-0002-9449-0420 |
| authorships[1].author.display_name | Philip L. H. Yu |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Yu, Peilin |
| authorships[1].is_corresponding | False |
| authorships[2].author.id | https://openalex.org/A5100657715 |
| authorships[2].author.orcid | https://orcid.org/0009-0000-6567-7379 |
| authorships[2].author.display_name | Hongbo Zheng |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | Zheng, Hongbo |
| authorships[2].is_corresponding | False |
| authorships[3].author.id | https://openalex.org/A5041685416 |
| authorships[3].author.orcid | https://orcid.org/0000-0001-7627-5615 |
| authorships[3].author.display_name | Huajie Shao |
| authorships[3].author_position | middle |
| authorships[3].raw_author_name | Shao, Huajie |
| authorships[3].is_corresponding | False |
| authorships[4].author.id | https://openalex.org/A5101868914 |
| authorships[4].author.orcid | https://orcid.org/0009-0005-0077-2451 |
| authorships[4].author.display_name | Han Zhao |
| authorships[4].author_position | middle |
| authorships[4].raw_author_name | Zhao, Han |
| authorships[4].is_corresponding | False |
| authorships[5].author.id | https://openalex.org/A5067032971 |
| authorships[5].author.orcid | https://orcid.org/0000-0002-5578-0791 |
| authorships[5].author.display_name | Lui Sha |
| authorships[5].author_position | last |
| authorships[5].raw_author_name | Sha, Lui |
| authorships[5].is_corresponding | False |
| has_content.pdf | False |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://arxiv.org/pdf/2510.26833 |
| open_access.oa_status | green |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-11-05T00:00:00 |
| display_name | VISAT: Benchmarking Adversarial and Distribution Shift Robustness in Traffic Sign Recognition with Visual Attributes |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-06T06:51:31.235846 |
| primary_topic | |
| cited_by_count | 0 |
| locations_count | 2 |
| best_oa_location.id | pmh:oai:arXiv.org:2510.26833 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S4306400194 |
| best_oa_location.source.issn | |
| best_oa_location.source.type | repository |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | |
| best_oa_location.source.is_core | False |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | arXiv (Cornell University) |
| best_oa_location.source.host_organization | https://openalex.org/I205783295 |
| best_oa_location.source.host_organization_name | Cornell University |
| best_oa_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| best_oa_location.license | |
| best_oa_location.pdf_url | https://arxiv.org/pdf/2510.26833 |
| best_oa_location.version | submittedVersion |
| best_oa_location.raw_type | text |
| best_oa_location.license_id | |
| best_oa_location.is_accepted | False |
| best_oa_location.is_published | False |
| best_oa_location.raw_source_name | |
| best_oa_location.landing_page_url | http://arxiv.org/abs/2510.26833 |
| primary_location.id | pmh:oai:arXiv.org:2510.26833 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S4306400194 |
| primary_location.source.issn | |
| primary_location.source.type | repository |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | |
| primary_location.source.is_core | False |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | arXiv (Cornell University) |
| primary_location.source.host_organization | https://openalex.org/I205783295 |
| primary_location.source.host_organization_name | Cornell University |
| primary_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| primary_location.license | |
| primary_location.pdf_url | https://arxiv.org/pdf/2510.26833 |
| primary_location.version | submittedVersion |
| primary_location.raw_type | text |
| primary_location.license_id | |
| primary_location.is_accepted | False |
| primary_location.is_published | False |
| primary_location.raw_source_name | |
| primary_location.landing_page_url | http://arxiv.org/abs/2510.26833 |
| publication_date | 2025-10-29 |
| publication_year | 2025 |
| referenced_works_count | 0 |
| cited_by_percentile_year | |
| countries_distinct_count | 0 |
| institutions_distinct_count | 6 |
| citation_normalized_percentile | |