Semantic Segmentation of Foggy Scenes Based on Progressive Domain Gap Decoupling
2023 · Open Access · DOI: https://doi.org/10.36227/techrxiv.22682161
Robust Semantic Foggy Scene Segmentation (SFSS) is crucial for the safety of autonomous driving. However, the blurring caused by fog makes recognition harder and makes annotating foggy-scene images more expensive, resulting in poor performance when recognizing entities in fog. Many current methods use domain adaptation to transfer segmentation knowledge from clear scenes to foggy ones, but they are often ineffective due to the large domain gap between different cities' styles and the image-quality degradation caused by fog. The latest research has attempted to introduce an intermediate domain to decouple the domain gap and gradually complete the semantic segmentation of foggy scenes, but the intermediate-domain information is often underexplored. To solve these problems, we first analyze self-training in domain adaptation and propose the concept of "label reference value". We prove that the higher the total label reference value, the more easily self-training performance improves. With this precondition, we can reasonably split the original problem into a two-stage domain adaptation, in which the label reference value can be controlled and maximized at each stage. Specifically, the first stage handles only the style gap between the source domain and the intermediate domain, and the second stage handles the fog gap, which comprises (1) the real fog gap between the intermediate domain and the target domain and (2) the synthetic fog gap between the clear source domain and the synthetic foggy source domain. This allows the model to make full use of the label reference value and gradually develop strong semantic segmentation ability in foggy scenes. Our approach significantly outperforms the baseline algorithm on all mainstream SFSS benchmarks, with good generalization demonstrated in other adverse scenes such as rain and snow. We also compare our method with the latest large segmentation models, showing that ours is more robust in foggy scenes.
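The two-stage structure described in the abstract can be summarized in a short sketch. The Python below is purely illustrative: the domain names, the `self_train` helper, and the staging logic are hypothetical stand-ins inferred from the abstract, not the paper's actual implementation.

```python
"""Illustrative sketch of progressive domain-gap decoupling (assumed staging)."""
from dataclasses import dataclass


@dataclass
class Domain:
    name: str
    has_labels: bool


def self_train(model, source: Domain, target: Domain):
    # Placeholder for one self-training round: train on the labeled source,
    # generate pseudo-labels on the target, retrain. Only a single gap
    # separates source and target at each step, which is what keeps the
    # "label reference value" of the pseudo-labels high.
    print(f"  self-train: {source.name} -> {target.name}")
    return model  # a real implementation would update model weights here


def progressive_adaptation(model):
    # Stage 1: bridge only the style gap (clear source city -> clear
    # images of the intermediate domain), leaving fog out entirely.
    clear_source = Domain("source city, clear", has_labels=True)
    clear_intermediate = Domain("target city, clear", has_labels=False)
    model = self_train(model, clear_source, clear_intermediate)

    # Stage 2: bridge only the fog gap, which has two parts per the abstract:
    #   (1) the real fog gap: clear intermediate domain -> real foggy target;
    #   (2) the synthetic fog gap: clear source -> source with synthetic fog.
    synthetic_foggy_source = Domain("source city + synthetic fog", has_labels=True)
    foggy_target = Domain("target city, real fog", has_labels=False)
    model = self_train(model, synthetic_foggy_source, foggy_target)
    model = self_train(model, clear_intermediate, foggy_target)
    return model


if __name__ == "__main__":
    progressive_adaptation(model=None)
```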
- Type: preprint
- Language: en
- Landing Page: https://doi.org/10.36227/techrxiv.22682161
- OA Status: gold
- Cited By: 2
- References: 44
- Related Works: 10
- OpenAlex ID: https://openalex.org/W4367672173
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4367672173 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.36227/techrxiv.22682161 (Digital Object Identifier)
- Title: Semantic Segmentation of Foggy Scenes Based on Progressive Domain Gap Decoupling (work title)
- Type: preprint (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2023 (year of publication)
- Publication date: 2023-05-02 (full publication date if available)
- Authors: Ziquan Wang, Yongsheng Zhang, XianZheng Ma, Ying Yu, Zhenchao Zhang, Zhipeng Jiang, Binbin Cheng (list of authors in order)
- Landing page: https://doi.org/10.36227/techrxiv.22682161 (publisher landing page)
- Open access: Yes (whether a free full text is available)
- OA status: gold (open access status per OpenAlex)
- OA URL: https://doi.org/10.36227/techrxiv.22682161 (direct OA link when available)
- Concepts: Computer science, Segmentation, Domain (mathematical analysis), Artificial intelligence, Semantic gap, Computer vision, Decoupling (probability), Process (computing), Image (mathematics), Image retrieval, Mathematics, Control engineering, Engineering, Mathematical analysis, Operating system (top concepts/fields attached by OpenAlex)
- Cited by: 2 (total citation count in OpenAlex)
- Citations by year (recent): 2024: 2 (per-year citation counts, last 5 years)
- References (count): 44 (number of works referenced by this work)
- Related works (count): 10 (other works algorithmically related by OpenAlex)
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4367672173 |
| doi | https://doi.org/10.36227/techrxiv.22682161 |
| ids.doi | https://doi.org/10.36227/techrxiv.22682161 |
| ids.openalex | https://openalex.org/W4367672173 |
| fwci | 0.36393668 |
| type | preprint |
| title | Semantic Segmentation of Foggy Scenes Based on Progressive Domain Gap Decoupling |
| biblio.issue | |
| biblio.volume | |
| biblio.last_page | |
| biblio.first_page | |
| topics[0].id | https://openalex.org/T10036 |
| topics[0].field.id | https://openalex.org/fields/17 |
| topics[0].field.display_name | Computer Science |
| topics[0].score | 0.9948999881744385 |
| topics[0].domain.id | https://openalex.org/domains/3 |
| topics[0].domain.display_name | Physical Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/1707 |
| topics[0].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[0].display_name | Advanced Neural Network Applications |
| topics[1].id | https://openalex.org/T11019 |
| topics[1].field.id | https://openalex.org/fields/17 |
| topics[1].field.display_name | Computer Science |
| topics[1].score | 0.987500011920929 |
| topics[1].domain.id | https://openalex.org/domains/3 |
| topics[1].domain.display_name | Physical Sciences |
| topics[1].subfield.id | https://openalex.org/subfields/1707 |
| topics[1].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[1].display_name | Image Enhancement Techniques |
| topics[2].id | https://openalex.org/T12597 |
| topics[2].field.id | https://openalex.org/fields/22 |
| topics[2].field.display_name | Engineering |
| topics[2].score | 0.9861999750137329 |
| topics[2].domain.id | https://openalex.org/domains/3 |
| topics[2].domain.display_name | Physical Sciences |
| topics[2].subfield.id | https://openalex.org/subfields/2213 |
| topics[2].subfield.display_name | Safety, Risk, Reliability and Quality |
| topics[2].display_name | Fire Detection and Safety Systems |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| concepts[0].id | https://openalex.org/C41008148 |
| concepts[0].level | 0 |
| concepts[0].score | 0.7968682646751404 |
| concepts[0].wikidata | https://www.wikidata.org/wiki/Q21198 |
| concepts[0].display_name | Computer science |
| concepts[1].id | https://openalex.org/C89600930 |
| concepts[1].level | 2 |
| concepts[1].score | 0.7459340691566467 |
| concepts[1].wikidata | https://www.wikidata.org/wiki/Q1423946 |
| concepts[1].display_name | Segmentation |
| concepts[2].id | https://openalex.org/C36503486 |
| concepts[2].level | 2 |
| concepts[2].score | 0.5640236735343933 |
| concepts[2].wikidata | https://www.wikidata.org/wiki/Q11235244 |
| concepts[2].display_name | Domain (mathematical analysis) |
| concepts[3].id | https://openalex.org/C154945302 |
| concepts[3].level | 1 |
| concepts[3].score | 0.5448264479637146 |
| concepts[3].wikidata | https://www.wikidata.org/wiki/Q11660 |
| concepts[3].display_name | Artificial intelligence |
| concepts[4].id | https://openalex.org/C86034646 |
| concepts[4].level | 4 |
| concepts[4].score | 0.5175294280052185 |
| concepts[4].wikidata | https://www.wikidata.org/wiki/Q474311 |
| concepts[4].display_name | Semantic gap |
| concepts[5].id | https://openalex.org/C31972630 |
| concepts[5].level | 1 |
| concepts[5].score | 0.5137530565261841 |
| concepts[5].wikidata | https://www.wikidata.org/wiki/Q844240 |
| concepts[5].display_name | Computer vision |
| concepts[6].id | https://openalex.org/C205606062 |
| concepts[6].level | 2 |
| concepts[6].score | 0.4717945456504822 |
| concepts[6].wikidata | https://www.wikidata.org/wiki/Q5249645 |
| concepts[6].display_name | Decoupling (probability) |
| concepts[7].id | https://openalex.org/C98045186 |
| concepts[7].level | 2 |
| concepts[7].score | 0.41358494758605957 |
| concepts[7].wikidata | https://www.wikidata.org/wiki/Q205663 |
| concepts[7].display_name | Process (computing) |
| concepts[8].id | https://openalex.org/C115961682 |
| concepts[8].level | 2 |
| concepts[8].score | 0.32585692405700684 |
| concepts[8].wikidata | https://www.wikidata.org/wiki/Q860623 |
| concepts[8].display_name | Image (mathematics) |
| concepts[9].id | https://openalex.org/C1667742 |
| concepts[9].level | 3 |
| concepts[9].score | 0.14981451630592346 |
| concepts[9].wikidata | https://www.wikidata.org/wiki/Q10927554 |
| concepts[9].display_name | Image retrieval |
| concepts[10].id | https://openalex.org/C33923547 |
| concepts[10].level | 0 |
| concepts[10].score | 0.10014915466308594 |
| concepts[10].wikidata | https://www.wikidata.org/wiki/Q395 |
| concepts[10].display_name | Mathematics |
| concepts[11].id | https://openalex.org/C133731056 |
| concepts[11].level | 1 |
| concepts[11].score | 0.0 |
| concepts[11].wikidata | https://www.wikidata.org/wiki/Q4917288 |
| concepts[11].display_name | Control engineering |
| concepts[12].id | https://openalex.org/C127413603 |
| concepts[12].level | 0 |
| concepts[12].score | 0.0 |
| concepts[12].wikidata | https://www.wikidata.org/wiki/Q11023 |
| concepts[12].display_name | Engineering |
| concepts[13].id | https://openalex.org/C134306372 |
| concepts[13].level | 1 |
| concepts[13].score | 0.0 |
| concepts[13].wikidata | https://www.wikidata.org/wiki/Q7754 |
| concepts[13].display_name | Mathematical analysis |
| concepts[14].id | https://openalex.org/C111919701 |
| concepts[14].level | 1 |
| concepts[14].score | 0.0 |
| concepts[14].wikidata | https://www.wikidata.org/wiki/Q9135 |
| concepts[14].display_name | Operating system |
| keywords[0].id | https://openalex.org/keywords/computer-science |
| keywords[0].score | 0.7968682646751404 |
| keywords[0].display_name | Computer science |
| keywords[1].id | https://openalex.org/keywords/segmentation |
| keywords[1].score | 0.7459340691566467 |
| keywords[1].display_name | Segmentation |
| keywords[2].id | https://openalex.org/keywords/domain |
| keywords[2].score | 0.5640236735343933 |
| keywords[2].display_name | Domain (mathematical analysis) |
| keywords[3].id | https://openalex.org/keywords/artificial-intelligence |
| keywords[3].score | 0.5448264479637146 |
| keywords[3].display_name | Artificial intelligence |
| keywords[4].id | https://openalex.org/keywords/semantic-gap |
| keywords[4].score | 0.5175294280052185 |
| keywords[4].display_name | Semantic gap |
| keywords[5].id | https://openalex.org/keywords/computer-vision |
| keywords[5].score | 0.5137530565261841 |
| keywords[5].display_name | Computer vision |
| keywords[6].id | https://openalex.org/keywords/decoupling |
| keywords[6].score | 0.4717945456504822 |
| keywords[6].display_name | Decoupling (probability) |
| keywords[7].id | https://openalex.org/keywords/process |
| keywords[7].score | 0.41358494758605957 |
| keywords[7].display_name | Process (computing) |
| keywords[8].id | https://openalex.org/keywords/image |
| keywords[8].score | 0.32585692405700684 |
| keywords[8].display_name | Image (mathematics) |
| keywords[9].id | https://openalex.org/keywords/image-retrieval |
| keywords[9].score | 0.14981451630592346 |
| keywords[9].display_name | Image retrieval |
| keywords[10].id | https://openalex.org/keywords/mathematics |
| keywords[10].score | 0.10014915466308594 |
| keywords[10].display_name | Mathematics |
| language | en |
| locations[0].id | doi:10.36227/techrxiv.22682161 |
| locations[0].is_oa | True |
| locations[0].source | |
| locations[0].license | cc-by |
| locations[0].pdf_url | |
| locations[0].version | acceptedVersion |
| locations[0].raw_type | posted-content |
| locations[0].license_id | https://openalex.org/licenses/cc-by |
| locations[0].is_accepted | True |
| locations[0].is_published | False |
| locations[0].raw_source_name | |
| locations[0].landing_page_url | https://doi.org/10.36227/techrxiv.22682161 |
| indexed_in | crossref |
| authorships[0].author.id | https://openalex.org/A5101949989 |
| authorships[0].author.orcid | https://orcid.org/0000-0002-3153-5417 |
| authorships[0].author.display_name | Ziquan Wang |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Ziquan Wang |
| authorships[0].is_corresponding | False |
| authorships[1].author.id | https://openalex.org/A5100374669 |
| authorships[1].author.orcid | |
| authorships[1].author.display_name | Yongsheng Zhang |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Yongsheng Zhang |
| authorships[1].is_corresponding | False |
| authorships[2].author.id | https://openalex.org/A5083071138 |
| authorships[2].author.orcid | |
| authorships[2].author.display_name | XianZheng Ma |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | XianZheng Ma |
| authorships[2].is_corresponding | False |
| authorships[3].author.id | https://openalex.org/A5100451632 |
| authorships[3].author.orcid | https://orcid.org/0000-0001-7840-9891 |
| authorships[3].author.display_name | Ying Yu |
| authorships[3].author_position | middle |
| authorships[3].raw_author_name | Ying Yu |
| authorships[3].is_corresponding | False |
| authorships[4].author.id | https://openalex.org/A5059339041 |
| authorships[4].author.orcid | https://orcid.org/0000-0002-2405-2038 |
| authorships[4].author.display_name | Zhenchao Zhang |
| authorships[4].author_position | middle |
| authorships[4].raw_author_name | ZhenChao Zhang |
| authorships[4].is_corresponding | False |
| authorships[5].author.id | https://openalex.org/A5051471223 |
| authorships[5].author.orcid | https://orcid.org/0000-0003-3953-1231 |
| authorships[5].author.display_name | Zhipeng Jiang |
| authorships[5].author_position | middle |
| authorships[5].raw_author_name | Zhipeng Jiang |
| authorships[5].is_corresponding | False |
| authorships[6].author.id | https://openalex.org/A5054324342 |
| authorships[6].author.orcid | https://orcid.org/0000-0002-8321-130X |
| authorships[6].author.display_name | Binbin Cheng |
| authorships[6].author_position | last |
| authorships[6].raw_author_name | Binbin Cheng |
| authorships[6].is_corresponding | False |
| has_content.pdf | False |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://doi.org/10.36227/techrxiv.22682161 |
| open_access.oa_status | gold |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | Semantic Segmentation of Foggy Scenes Based on Progressive Domain Gap Decoupling |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-06T03:46:38.306776 |
| primary_topic.id | https://openalex.org/T10036 |
| primary_topic.field.id | https://openalex.org/fields/17 |
| primary_topic.field.display_name | Computer Science |
| primary_topic.score | 0.9948999881744385 |
| primary_topic.domain.id | https://openalex.org/domains/3 |
| primary_topic.domain.display_name | Physical Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/1707 |
| primary_topic.subfield.display_name | Computer Vision and Pattern Recognition |
| primary_topic.display_name | Advanced Neural Network Applications |
| related_works | https://openalex.org/W2152950565, https://openalex.org/W1617565119, https://openalex.org/W160381218, https://openalex.org/W2512958550, https://openalex.org/W2329266651, https://openalex.org/W3103825105, https://openalex.org/W2004102934, https://openalex.org/W3093339210, https://openalex.org/W2990774877, https://openalex.org/W3015037427 |
| cited_by_count | 2 |
| counts_by_year[0].year | 2024 |
| counts_by_year[0].cited_by_count | 2 |
| locations_count | 1 |
| best_oa_location.id | doi:10.36227/techrxiv.22682161 |
| best_oa_location.is_oa | True |
| best_oa_location.source | |
| best_oa_location.license | cc-by |
| best_oa_location.pdf_url | |
| best_oa_location.version | acceptedVersion |
| best_oa_location.raw_type | posted-content |
| best_oa_location.license_id | https://openalex.org/licenses/cc-by |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | False |
| best_oa_location.raw_source_name | |
| best_oa_location.landing_page_url | https://doi.org/10.36227/techrxiv.22682161 |
| primary_location.id | doi:10.36227/techrxiv.22682161 |
| primary_location.is_oa | True |
| primary_location.source | |
| primary_location.license | cc-by |
| primary_location.pdf_url | |
| primary_location.version | acceptedVersion |
| primary_location.raw_type | posted-content |
| primary_location.license_id | https://openalex.org/licenses/cc-by |
| primary_location.is_accepted | True |
| primary_location.is_published | False |
| primary_location.raw_source_name | |
| primary_location.landing_page_url | https://doi.org/10.36227/techrxiv.22682161 |
| publication_date | 2023-05-02 |
| publication_year | 2023 |
| referenced_works | https://openalex.org/W3195858154, https://openalex.org/W4362707078, https://openalex.org/W2187089797, https://openalex.org/W2964309882, https://openalex.org/W2748021867, https://openalex.org/W3119635706, https://openalex.org/W2963943912, https://openalex.org/W3120804725, https://openalex.org/W3207649350, https://openalex.org/W3120562181, https://openalex.org/W2932414082, https://openalex.org/W3211490618, https://openalex.org/W2563705555, https://openalex.org/W4239072543, https://openalex.org/W2963306157, https://openalex.org/W2985406498, https://openalex.org/W2340897893, https://openalex.org/W4362679702, https://openalex.org/W2951970475, https://openalex.org/W2467473805, https://openalex.org/W2963107255, https://openalex.org/W4225343834, https://openalex.org/W2972285644, https://openalex.org/W2412782625, https://openalex.org/W4292779060, https://openalex.org/W3217147624, https://openalex.org/W2593414223, https://openalex.org/W3215632849, https://openalex.org/W3193917007, https://openalex.org/W4305037850, https://openalex.org/W2519481857, https://openalex.org/W1990592195, https://openalex.org/W4307981045, https://openalex.org/W3107502112, https://openalex.org/W3175294391, https://openalex.org/W2962754725, https://openalex.org/W4312368691, https://openalex.org/W2962808524, https://openalex.org/W2963073217, https://openalex.org/W2887964057, https://openalex.org/W3021472582, https://openalex.org/W4319301015, https://openalex.org/W3035294798, https://openalex.org/W4313192952 |
| referenced_works_count | 44 |
| abstract_inverted_index | (token → word-position map encoding the abstract; the plain text is shown above, and a reconstruction sketch follows this table) |
| cited_by_percentile_year.max | 96 |
| cited_by_percentile_year.min | 94 |
| countries_distinct_count | 0 |
| institutions_distinct_count | 7 |
| sustainable_development_goals[0].id | https://metadata.un.org/sdg/11 |
| sustainable_development_goals[0].score | 0.5799999833106995 |
| sustainable_development_goals[0].display_name | Sustainable cities and communities |
| citation_normalized_percentile.value | 0.53422013 |
| citation_normalized_percentile.is_in_top_1_percent | False |
| citation_normalized_percentile.is_in_top_10_percent | False |
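For reference, OpenAlex stores abstracts not as plain text but as an inverted index mapping each token to its word positions, which is why the raw payload's `abstract_inverted_index` field duplicates the abstract shown above. A minimal sketch of reconstructing the text, assuming the standard OpenAlex JSON shape and that the field is present:

```python
import json
import urllib.request


def reconstruct_abstract(inverted_index: dict) -> str:
    """Rebuild the plain-text abstract from OpenAlex's token -> positions map."""
    words_by_position = {}
    for token, positions in inverted_index.items():
        for pos in positions:
            words_by_position[pos] = token
    return " ".join(words_by_position[pos] for pos in sorted(words_by_position))


# Example: fetch this work's record and rebuild its abstract. Note that the
# tokens may include original HTML markup (e.g. "<p>" and "</p>").
url = "https://api.openalex.org/works/W4367672173"
with urllib.request.urlopen(url) as resp:
    work = json.load(resp)
print(reconstruct_abstract(work["abstract_inverted_index"]))
```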