Benchmarking Unified Face Attack Detection via Hierarchical Prompt Tuning
2025 · Open Access
DOI: https://doi.org/10.48550/arxiv.2505.13327
Face Presentation Attack Detection (PAD) and Face Forgery Detection (FFD) are proposed to protect face data from physical, media-based presentation attacks and from digital, editing-based DeepFakes, respectively. However, training these two models in isolation leaves them significantly more vulnerable to unknown attacks and burdens deployment environments. The lack of a Unified Face Attack Detection (UAD) model that can simultaneously handle both categories of attack is mainly attributed to two factors. (1) A benchmark sufficient for models to explore is lacking: existing UAD datasets contain only limited attack types and samples, which confines a model's ability to address abundant advanced threats. In light of this, we organize forgery techniques in an explainable hierarchy and propose the most extensive and sophisticated collection available to date, namely UniAttackData+. UniAttackData+ encompasses 2,875 identities and 54 kinds of corresponding falsified samples, for a total of 697,347 videos. (2) A trustworthy classification criterion is absent: current methods search for a single arbitrary criterion within one semantic space, which fails to exist when diverse attacks are encountered. Thus, we present a novel Visual-Language Model-based Hierarchical Prompt Tuning Framework that adaptively explores multiple classification criteria from different semantic spaces. Specifically, we construct a VP-Tree to explore various classification rules hierarchically. Then, by adaptively pruning prompts, the model selects the most suitable ones to guide the encoder in extracting discriminative features at different levels in a coarse-to-fine manner. Finally, to help the model understand the classification criteria in visual space, we propose a DPI module that projects the visual prompts into the text encoder to obtain more accurate semantics.
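The abstract does not give implementation details, but the coarse-to-fine decision it describes can be sketched in miniature: score a sample against coarse criteria (live vs. physical vs. digital attack) first, then refine within the winning branch. Everything below is an illustrative assumption, not the authors' method: the embeddings are random stand-ins for CLIP-style encoder outputs, and the centroid coarse prompts are a toy substitute for learned hierarchical prompts.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 128

def unit(v):
    return v / np.linalg.norm(v)

# Fine level: specific attack types under each coarse attack category.
fine_prompts = {
    "physical attack": {k: unit(rng.normal(size=DIM))
                        for k in ("print", "replay", "3D mask")},
    "digital attack": {k: unit(rng.normal(size=DIM))
                       for k in ("face swap", "attribute edit")},
}
# Coarse level: "live" has its own prompt; each attack category is
# summarized here by the centroid of its fine prompts (a toy choice).
coarse_prompts = {"live": unit(rng.normal(size=DIM))}
coarse_prompts.update({
    cat: unit(sum(p for p in fines.values()))
    for cat, fines in fine_prompts.items()
})

def classify(img_emb):
    """Pick the best coarse criterion, then refine within that branch."""
    img_emb = unit(img_emb)
    coarse = max(coarse_prompts, key=lambda k: img_emb @ coarse_prompts[k])
    if coarse == "live":
        return coarse, None
    fine = max(fine_prompts[coarse],
               key=lambda k: img_emb @ fine_prompts[coarse][k])
    return coarse, fine

# An embedding near the "replay" prompt routes physical -> replay.
sample = fine_prompts["physical attack"]["replay"] + 0.05 * rng.normal(size=DIM)
print(classify(sample))  # ('physical attack', 'replay')
```

The point of the two-stage `max` is the one the abstract makes: the coarse and fine decisions use different sets of criteria, rather than forcing one criterion to separate all attack types at once.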
Related Topics
- Face recognition and analysis
- Anomaly Detection Techniques and Applications
- Advanced Malware Detection Techniques

Details
- Type: preprint
- Language: en
- Landing page: http://arxiv.org/abs/2505.13327
- PDF: https://arxiv.org/pdf/2505.13327
- OA status: green
- OpenAlex ID: https://openalex.org/W4415016602
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4415016602 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.48550/arxiv.2505.13327 (Digital Object Identifier)
- Title: Benchmarking Unified Face Attack Detection via Hierarchical Prompt Tuning
- Type: preprint (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2025
- Publication date: 2025-05-19
- Authors: Ajian Liu, Haocheng Yuan, Xiao Guo, Hui Ma, Wanyi Zhuang, Changtao Miao, Hong Yan, Chuanbiao Song, Jun Lan, Qi Chu, Tao Gong, Yanyan Liang, Weiqiang Wang, Jun Wan, Xiaoming Liu, Zhen Lei (in order)
- Landing page: https://arxiv.org/abs/2505.13327
- PDF URL: https://arxiv.org/pdf/2505.13327
- Open access: yes
- OA status: green (per OpenAlex)
- OA URL: https://arxiv.org/pdf/2505.13327
- Cited by: 0 (total citation count in OpenAlex)
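The record above can be refetched from the public OpenAlex REST API at `https://api.openalex.org/works/{id}`. A minimal URL builder, with the optional `mailto` parameter OpenAlex uses to route requests into its faster "polite pool":

```python
from urllib.parse import urlencode

OPENALEX_API = "https://api.openalex.org/works"

def work_url(work_id: str, mailto: str = "") -> str:
    """Build the OpenAlex API URL for a work ID like 'W4415016602'."""
    url = f"{OPENALEX_API}/{work_id}"
    if mailto:  # identifies the caller for OpenAlex's polite pool
        url += "?" + urlencode({"mailto": mailto})
    return url

print(work_url("W4415016602"))
# https://api.openalex.org/works/W4415016602
```

Fetching that URL (e.g. with `urllib.request` or `requests`) returns the JSON payload flattened in the table below.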
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4415016602 |
| doi | https://doi.org/10.48550/arxiv.2505.13327 |
| ids.doi | https://doi.org/10.48550/arxiv.2505.13327 |
| ids.openalex | https://openalex.org/W4415016602 |
| fwci | |
| type | preprint |
| title | Benchmarking Unified Face Attack Detection via Hierarchical Prompt Tuning |
| biblio.issue | |
| biblio.volume | |
| biblio.last_page | |
| biblio.first_page | |
| topics[0].id | https://openalex.org/T11448 |
| topics[0].field.id | https://openalex.org/fields/17 |
| topics[0].field.display_name | Computer Science |
| topics[0].score | 0.9609000086784363 |
| topics[0].domain.id | https://openalex.org/domains/3 |
| topics[0].domain.display_name | Physical Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/1707 |
| topics[0].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[0].display_name | Face recognition and analysis |
| topics[1].id | https://openalex.org/T11512 |
| topics[1].field.id | https://openalex.org/fields/17 |
| topics[1].field.display_name | Computer Science |
| topics[1].score | 0.9419999718666077 |
| topics[1].domain.id | https://openalex.org/domains/3 |
| topics[1].domain.display_name | Physical Sciences |
| topics[1].subfield.id | https://openalex.org/subfields/1702 |
| topics[1].subfield.display_name | Artificial Intelligence |
| topics[1].display_name | Anomaly Detection Techniques and Applications |
| topics[2].id | https://openalex.org/T11241 |
| topics[2].field.id | https://openalex.org/fields/17 |
| topics[2].field.display_name | Computer Science |
| topics[2].score | 0.9366000294685364 |
| topics[2].domain.id | https://openalex.org/domains/3 |
| topics[2].domain.display_name | Physical Sciences |
| topics[2].subfield.id | https://openalex.org/subfields/1711 |
| topics[2].subfield.display_name | Signal Processing |
| topics[2].display_name | Advanced Malware Detection Techniques |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| language | en |
| locations[0].id | pmh:oai:arXiv.org:2505.13327 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S4306400194 |
| locations[0].source.issn | |
| locations[0].source.type | repository |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | |
| locations[0].source.is_core | False |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | arXiv (Cornell University) |
| locations[0].source.host_organization | https://openalex.org/I205783295 |
| locations[0].source.host_organization_name | Cornell University |
| locations[0].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[0].license | |
| locations[0].pdf_url | https://arxiv.org/pdf/2505.13327 |
| locations[0].version | submittedVersion |
| locations[0].raw_type | text |
| locations[0].license_id | |
| locations[0].is_accepted | False |
| locations[0].is_published | False |
| locations[0].raw_source_name | |
| locations[0].landing_page_url | http://arxiv.org/abs/2505.13327 |
| locations[1].id | doi:10.48550/arxiv.2505.13327 |
| locations[1].is_oa | True |
| locations[1].source.id | https://openalex.org/S4306400194 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | True |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | arXiv (Cornell University) |
| locations[1].source.host_organization | https://openalex.org/I205783295 |
| locations[1].source.host_organization_name | Cornell University |
| locations[1].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[1].license | cc-by |
| locations[1].pdf_url | |
| locations[1].version | |
| locations[1].raw_type | article |
| locations[1].license_id | https://openalex.org/licenses/cc-by |
| locations[1].is_accepted | False |
| locations[1].is_published | |
| locations[1].raw_source_name | |
| locations[1].landing_page_url | https://doi.org/10.48550/arxiv.2505.13327 |
| indexed_in | arxiv, datacite |
| authorships[0].author.id | https://openalex.org/A5101994611 |
| authorships[0].author.orcid | https://orcid.org/0000-0002-7788-9368 |
| authorships[0].author.display_name | Ajian Liu |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Liu, Ajian |
| authorships[0].is_corresponding | False |
| authorships[1].author.id | https://openalex.org/A5109679741 |
| authorships[1].author.orcid | |
| authorships[1].author.display_name | Haocheng Yuan |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Yuan, Haocheng |
| authorships[1].is_corresponding | False |
| authorships[2].author.id | https://openalex.org/A5103903071 |
| authorships[2].author.orcid | |
| authorships[2].author.display_name | Xiao Guo |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | Guo, Xiao |
| authorships[2].is_corresponding | False |
| authorships[3].author.id | https://openalex.org/A5100591774 |
| authorships[3].author.orcid | https://orcid.org/0009-0004-0640-5671 |
| authorships[3].author.display_name | Hui Ma |
| authorships[3].author_position | middle |
| authorships[3].raw_author_name | Ma, Hui |
| authorships[3].is_corresponding | False |
| authorships[4].author.id | https://openalex.org/A5088627439 |
| authorships[4].author.orcid | https://orcid.org/0000-0001-9305-1189 |
| authorships[4].author.display_name | Wanyi Zhuang |
| authorships[4].author_position | middle |
| authorships[4].raw_author_name | Zhuang, Wanyi |
| authorships[4].is_corresponding | False |
| authorships[5].author.id | https://openalex.org/A5077867418 |
| authorships[5].author.orcid | https://orcid.org/0000-0002-7634-9992 |
| authorships[5].author.display_name | Changtao Miao |
| authorships[5].author_position | middle |
| authorships[5].raw_author_name | Miao, Changtao |
| authorships[5].is_corresponding | False |
| authorships[6].author.id | https://openalex.org/A5038745722 |
| authorships[6].author.orcid | https://orcid.org/0000-0003-1834-8956 |
| authorships[6].author.display_name | Hong Yan |
| authorships[6].author_position | middle |
| authorships[6].raw_author_name | Hong, Yan |
| authorships[6].is_corresponding | False |
| authorships[7].author.id | https://openalex.org/A5083155503 |
| authorships[7].author.orcid | |
| authorships[7].author.display_name | Chuanbiao Song |
| authorships[7].author_position | middle |
| authorships[7].raw_author_name | Song, Chuanbiao |
| authorships[7].is_corresponding | False |
| authorships[8].author.id | https://openalex.org/A5100582842 |
| authorships[8].author.orcid | |
| authorships[8].author.display_name | Jun Lan |
| authorships[8].author_position | middle |
| authorships[8].raw_author_name | Lan, Jun |
| authorships[8].is_corresponding | False |
| authorships[9].author.id | https://openalex.org/A5101799726 |
| authorships[9].author.orcid | https://orcid.org/0000-0002-5264-6095 |
| authorships[9].author.display_name | Qi Chu |
| authorships[9].author_position | middle |
| authorships[9].raw_author_name | Chu, Qi |
| authorships[9].is_corresponding | False |
| authorships[10].author.id | https://openalex.org/A5088814076 |
| authorships[10].author.orcid | https://orcid.org/0000-0003-0248-9404 |
| authorships[10].author.display_name | Tao Gong |
| authorships[10].author_position | middle |
| authorships[10].raw_author_name | Gong, Tao |
| authorships[10].is_corresponding | False |
| authorships[11].author.id | https://openalex.org/A5062981171 |
| authorships[11].author.orcid | https://orcid.org/0000-0002-5780-8540 |
| authorships[11].author.display_name | Yanyan Liang |
| authorships[11].author_position | middle |
| authorships[11].raw_author_name | Liang, Yanyan |
| authorships[11].is_corresponding | False |
| authorships[12].author.id | https://openalex.org/A5100721595 |
| authorships[12].author.orcid | https://orcid.org/0000-0002-0771-2129 |
| authorships[12].author.display_name | Weiqiang Wang |
| authorships[12].author_position | middle |
| authorships[12].raw_author_name | Wang, Weiqiang |
| authorships[12].is_corresponding | False |
| authorships[13].author.id | https://openalex.org/A5101655825 |
| authorships[13].author.orcid | https://orcid.org/0000-0002-9961-7902 |
| authorships[13].author.display_name | Jun Wan |
| authorships[13].author_position | middle |
| authorships[13].raw_author_name | Wan, Jun |
| authorships[13].is_corresponding | False |
| authorships[14].author.id | https://openalex.org/A5100409053 |
| authorships[14].author.orcid | https://orcid.org/0000-0003-3467-5607 |
| authorships[14].author.display_name | Xiaoming Liu |
| authorships[14].author_position | middle |
| authorships[14].raw_author_name | Liu, Xiaoming |
| authorships[14].is_corresponding | False |
| authorships[15].author.id | https://openalex.org/A5109299788 |
| authorships[15].author.orcid | https://orcid.org/0000-0002-0791-189X |
| authorships[15].author.display_name | Zhen Lei |
| authorships[15].author_position | last |
| authorships[15].raw_author_name | Lei, Zhen |
| authorships[15].is_corresponding | False |
| has_content.pdf | False |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://arxiv.org/pdf/2505.13327 |
| open_access.oa_status | green |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | Benchmarking Unified Face Attack Detection via Hierarchical Prompt Tuning |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-06T06:51:31.235846 |
| primary_topic.id | https://openalex.org/T11448 |
| primary_topic.field.id | https://openalex.org/fields/17 |
| primary_topic.field.display_name | Computer Science |
| primary_topic.score | 0.9609000086784363 |
| primary_topic.domain.id | https://openalex.org/domains/3 |
| primary_topic.domain.display_name | Physical Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/1707 |
| primary_topic.subfield.display_name | Computer Vision and Pattern Recognition |
| primary_topic.display_name | Face recognition and analysis |
| cited_by_count | 0 |
| locations_count | 2 |
| best_oa_location.id | pmh:oai:arXiv.org:2505.13327 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S4306400194 |
| best_oa_location.source.issn | |
| best_oa_location.source.type | repository |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | |
| best_oa_location.source.is_core | False |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | arXiv (Cornell University) |
| best_oa_location.source.host_organization | https://openalex.org/I205783295 |
| best_oa_location.source.host_organization_name | Cornell University |
| best_oa_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| best_oa_location.license | |
| best_oa_location.pdf_url | https://arxiv.org/pdf/2505.13327 |
| best_oa_location.version | submittedVersion |
| best_oa_location.raw_type | text |
| best_oa_location.license_id | |
| best_oa_location.is_accepted | False |
| best_oa_location.is_published | False |
| best_oa_location.raw_source_name | |
| best_oa_location.landing_page_url | http://arxiv.org/abs/2505.13327 |
| primary_location.id | pmh:oai:arXiv.org:2505.13327 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S4306400194 |
| primary_location.source.issn | |
| primary_location.source.type | repository |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | |
| primary_location.source.is_core | False |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | arXiv (Cornell University) |
| primary_location.source.host_organization | https://openalex.org/I205783295 |
| primary_location.source.host_organization_name | Cornell University |
| primary_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| primary_location.license | |
| primary_location.pdf_url | https://arxiv.org/pdf/2505.13327 |
| primary_location.version | submittedVersion |
| primary_location.raw_type | text |
| primary_location.license_id | |
| primary_location.is_accepted | False |
| primary_location.is_published | False |
| primary_location.raw_source_name | |
| primary_location.landing_page_url | http://arxiv.org/abs/2505.13327 |
| publication_date | 2025-05-19 |
| publication_year | 2025 |
| referenced_works_count | 0 |
| abstract_inverted_index | (inverted word-to-position index of the abstract; omitted here — the abstract appears in full above) |
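OpenAlex stores abstracts as an inverted index (word → list of token positions) in the `abstract_inverted_index` field, rather than as plain text. Reconstructing the text is a matter of sorting words by position; a minimal sketch:

```python
def reconstruct_abstract(inverted_index: dict) -> str:
    """Rebuild abstract text from an OpenAlex abstract_inverted_index."""
    positions = {}
    for word, idxs in inverted_index.items():
        for i in idxs:
            positions[i] = word
    return " ".join(positions[i] for i in sorted(positions))

# Tiny example using the same encoding scheme:
idx = {"PAD": [0], "and": [1], "FFD": [2], "are": [3], "proposed": [4]}
print(reconstruct_abstract(idx))  # PAD and FFD are proposed
```

Applied to the full `abstract_inverted_index` of this record, this yields the abstract shown at the top of the page.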
| cited_by_percentile_year | |
| countries_distinct_count | 0 |
| institutions_distinct_count | 16 |
| citation_normalized_percentile | |