HybridATNet: Multi-Scale Attention and Hybrid Feature Refinement Network for Remote Sensing Image Super-Resolution
2025 · Open Access · DOI: https://doi.org/10.1109/access.2025.3608009
Remote sensing image super-resolution faces significant challenges, including detail loss, sensor noise, and atmospheric interference, that compromise image quality for critical applications. Current deep learning approaches struggle with three key limitations: insufficient multi-scale feature extraction, inadequate attention to crucial image regions, and poor preservation of refined spatial details during reconstruction. We propose HybridATNet, a novel CNN-Transformer architecture that addresses these limitations through three primary innovations. First, our shallow feature extraction integrates Efficient Channel Attention (ECA), Multi-scale Spatial Attention (MSA), and Residual Atrous Spatial Pyramid Pooling (RASPP) to capture features across multiple scales in both channel and spatial domains. Second, we employ an Activated Sparsely Sub-Pixel Transformer (ASSPT) for deep feature extraction that efficiently captures long-range dependencies while maintaining computational efficiency through a novel sparse attention mechanism that reduces computational complexity from O(n²) to approximately O(n log n). Third, we introduce a dual-domain approach that combines multi-scale spatial refinement with frequency-domain processing to preserve structural details while simultaneously reducing high-frequency noise. Extensive experiments on the UCMerced and AID datasets demonstrate that our model consistently outperforms state-of-the-art methods across multiple scale factors (×2, ×3, ×4), achieving PSNR improvements of up to 0.26 dB on the UCMerced dataset and 0.13 dB on the larger AID dataset. Visual comparisons confirm superior preservation of fine details and improved noise reduction, particularly in complex structural regions.
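The record does not describe the ASSPT's sparse attention beyond its complexity claim. As a hedged illustration only, the following NumPy sketch shows the general idea behind such mechanisms: if each query attends to just k ≈ log n keys instead of all n, the softmax and weighted-sum work drops from O(n²) toward O(n log n). The function name and the top-k selection rule are hypothetical stand-ins, not the paper's actual design.

```python
import numpy as np

def sparse_attention(q, k, v, top_k):
    """Toy top-k sparse attention: each query attends only to its
    top_k highest-scoring keys, so softmax + aggregation cost is
    O(n * top_k) rather than O(n^2). (Illustrative sketch; a real
    implementation would also avoid forming the full score matrix.)"""
    scores = q @ k.T / np.sqrt(q.shape[-1])                      # (n, n) similarities
    idx = np.argpartition(scores, -top_k, axis=-1)[:, -top_k:]   # top_k key indices per query
    kept = np.take_along_axis(scores, idx, axis=-1)              # (n, top_k) kept scores
    w = np.exp(kept - kept.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)                           # softmax over kept keys only
    return np.einsum('nk,nkd->nd', w, v[idx])                    # weighted sum of selected values

rng = np.random.default_rng(0)
n, d = 64, 8
q, k, v = rng.normal(size=(3, n, d))
out = sparse_attention(q, k, v, top_k=int(np.log2(n)))           # k ~ log n
print(out.shape)  # (64, 8)
```

With uniform values the output reduces to the softmax-weight sum, which is a cheap sanity check that the restricted softmax still normalizes correctly.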
HybridATNet delivers high-quality reconstructions while maintaining computational efficiency, requiring approximately 15% fewer parameters and 18% fewer floating-point operations (FLOPs) than other transformer-based models, making it well-suited for practical remote sensing applications.
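The abstract's dual-domain refinement pairs spatial processing with frequency-domain noise suppression, but the record gives no filter details. As a generic sketch only, a circular low-pass mask in the 2-D Fourier domain shows why attenuating high-frequency bands reduces broadband noise while preserving smooth structure; the cutoff rule and `keep_frac` parameter here are assumptions, not the paper's design.

```python
import numpy as np

def lowpass_filter(img, keep_frac=0.25):
    """Generic frequency-domain low-pass: zero spectral components
    outside keep_frac of the half-extent around DC. A stand-in for
    'frequency-domain processing to reduce high-frequency noise';
    the paper's actual filter is not described in this record."""
    f = np.fft.fftshift(np.fft.fft2(img))        # center DC component
    h, w = img.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2)         # radius from spectrum center
    mask = r <= keep_frac * min(h, w) / 2        # keep only low frequencies
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

rng = np.random.default_rng(1)
grid = np.sin(np.linspace(0, np.pi, 64))
clean = np.outer(grid, grid)                     # smooth low-frequency "image"
noisy = clean + 0.3 * rng.normal(size=clean.shape)
denoised = lowpass_filter(noisy)
# Filtering should shrink the error against the smooth reference.
print(np.abs(noisy - clean).mean(), np.abs(denoised - clean).mean())
```

Because the smooth test pattern lives almost entirely inside the retained band while the noise energy is spread across the full spectrum, the filtered image lands much closer to the reference than the noisy input does.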
Record Metadata
- Type: article
- Language: en
- Landing Page: https://doi.org/10.1109/access.2025.3608009
- OA Status: gold
- References: 59
- OpenAlex ID: https://openalex.org/W4414166042
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4414166042 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.1109/access.2025.3608009 (Digital Object Identifier)
- Title: HybridATNet: Multi-Scale Attention and Hybrid Feature Refinement Network for Remote Sensing Image Super-Resolution
- Type: article (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2025
- Publication date: 2025-01-01 (full publication date if available)
- Authors: Naveed Sultan, Watchara Ruangsang, Supavadee Aramvith (in order)
- Landing page: https://doi.org/10.1109/access.2025.3608009 (publisher landing page)
- Open access: Yes (whether a free full text is available)
- OA status: gold (open access status per OpenAlex)
- OA URL: https://doi.org/10.1109/access.2025.3608009 (direct OA link when available)
- Cited by: 0 (total citation count in OpenAlex)
- References (count): 59 (number of works referenced by this work)
Full payload
| id | https://openalex.org/W4414166042 |
|---|---|
| doi | https://doi.org/10.1109/access.2025.3608009 |
| ids.doi | https://doi.org/10.1109/access.2025.3608009 |
| ids.openalex | https://openalex.org/W4414166042 |
| fwci | 0.0 |
| type | article |
| title | HybridATNet: Multi-Scale Attention and Hybrid Feature Refinement Network for Remote Sensing Image Super-Resolution |
| biblio.issue | |
| biblio.volume | 13 |
| biblio.last_page | 159997 |
| biblio.first_page | 159979 |
| topics[0].id | https://openalex.org/T11659 |
| topics[0].field.id | https://openalex.org/fields/22 |
| topics[0].field.display_name | Engineering |
| topics[0].score | 0.9718000292778015 |
| topics[0].domain.id | https://openalex.org/domains/3 |
| topics[0].domain.display_name | Physical Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/2214 |
| topics[0].subfield.display_name | Media Technology |
| topics[0].display_name | Advanced Image Fusion Techniques |
| topics[1].id | https://openalex.org/T11105 |
| topics[1].field.id | https://openalex.org/fields/17 |
| topics[1].field.display_name | Computer Science |
| topics[1].score | 0.9661999940872192 |
| topics[1].domain.id | https://openalex.org/domains/3 |
| topics[1].domain.display_name | Physical Sciences |
| topics[1].subfield.id | https://openalex.org/subfields/1707 |
| topics[1].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[1].display_name | Advanced Image Processing Techniques |
| topics[2].id | https://openalex.org/T11588 |
| topics[2].field.id | https://openalex.org/fields/23 |
| topics[2].field.display_name | Environmental Science |
| topics[2].score | 0.9107000231742859 |
| topics[2].domain.id | https://openalex.org/domains/3 |
| topics[2].domain.display_name | Physical Sciences |
| topics[2].subfield.id | https://openalex.org/subfields/2306 |
| topics[2].subfield.display_name | Global and Planetary Change |
| topics[2].display_name | Atmospheric and Environmental Gas Dynamics |
| is_xpac | False |
| apc_list.value | 1850 |
| apc_list.currency | USD |
| apc_list.value_usd | 1850 |
| apc_paid.value | 1850 |
| apc_paid.currency | USD |
| apc_paid.value_usd | 1850 |
| language | en |
| locations[0].id | doi:10.1109/access.2025.3608009 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S2485537415 |
| locations[0].source.issn | 2169-3536 |
| locations[0].source.type | journal |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | 2169-3536 |
| locations[0].source.is_core | True |
| locations[0].source.is_in_doaj | True |
| locations[0].source.display_name | IEEE Access |
| locations[0].source.host_organization | https://openalex.org/P4310319808 |
| locations[0].source.host_organization_name | Institute of Electrical and Electronics Engineers |
| locations[0].source.host_organization_lineage | https://openalex.org/P4310319808 |
| locations[0].source.host_organization_lineage_names | Institute of Electrical and Electronics Engineers |
| locations[0].license | cc-by |
| locations[0].pdf_url | |
| locations[0].version | publishedVersion |
| locations[0].raw_type | journal-article |
| locations[0].license_id | https://openalex.org/licenses/cc-by |
| locations[0].is_accepted | True |
| locations[0].is_published | True |
| locations[0].raw_source_name | IEEE Access |
| locations[0].landing_page_url | https://doi.org/10.1109/access.2025.3608009 |
| locations[1].id | pmh:oai:doaj.org/article:106a98b5ebab45f09bb00bde916b6de6 |
| locations[1].is_oa | False |
| locations[1].source.id | https://openalex.org/S4306401280 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | False |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | DOAJ (DOAJ: Directory of Open Access Journals) |
| locations[1].source.host_organization | |
| locations[1].source.host_organization_name | |
| locations[1].license | |
| locations[1].pdf_url | |
| locations[1].version | submittedVersion |
| locations[1].raw_type | article |
| locations[1].license_id | |
| locations[1].is_accepted | False |
| locations[1].is_published | False |
| locations[1].raw_source_name | IEEE Access, Vol 13, Pp 159979-159997 (2025) |
| locations[1].landing_page_url | https://doaj.org/article/106a98b5ebab45f09bb00bde916b6de6 |
| indexed_in | crossref, doaj |
| authorships[0].author.id | https://openalex.org/A5113204437 |
| authorships[0].author.orcid | https://orcid.org/0009-0007-6020-1903 |
| authorships[0].author.display_name | Naveed Sultan |
| authorships[0].countries | TH |
| authorships[0].affiliations[0].institution_ids | https://openalex.org/I158708052 |
| authorships[0].affiliations[0].raw_affiliation_string | Department of Electrical Engineering, Faculty of Engineering, Multimedia Data Analytics and Processing Research Unit, Chulalongkorn University, Bangkok, Thailand |
| authorships[0].institutions[0].id | https://openalex.org/I158708052 |
| authorships[0].institutions[0].ror | https://ror.org/028wp3y58 |
| authorships[0].institutions[0].type | education |
| authorships[0].institutions[0].lineage | https://openalex.org/I158708052 |
| authorships[0].institutions[0].country_code | TH |
| authorships[0].institutions[0].display_name | Chulalongkorn University |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Naveed Sultan |
| authorships[0].is_corresponding | False |
| authorships[0].raw_affiliation_strings | Department of Electrical Engineering, Faculty of Engineering, Multimedia Data Analytics and Processing Research Unit, Chulalongkorn University, Bangkok, Thailand |
| authorships[1].author.id | https://openalex.org/A5066340947 |
| authorships[1].author.orcid | |
| authorships[1].author.display_name | Watchara Ruangsang |
| authorships[1].countries | TH |
| authorships[1].affiliations[0].institution_ids | https://openalex.org/I158708052 |
| authorships[1].affiliations[0].raw_affiliation_string | Department of Electrical Engineering, Faculty of Engineering, Multimedia Data Analytics and Processing Research Unit, Chulalongkorn University, Bangkok, Thailand |
| authorships[1].institutions[0].id | https://openalex.org/I158708052 |
| authorships[1].institutions[0].ror | https://ror.org/028wp3y58 |
| authorships[1].institutions[0].type | education |
| authorships[1].institutions[0].lineage | https://openalex.org/I158708052 |
| authorships[1].institutions[0].country_code | TH |
| authorships[1].institutions[0].display_name | Chulalongkorn University |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Watchara Ruangsang |
| authorships[1].is_corresponding | False |
| authorships[1].raw_affiliation_strings | Department of Electrical Engineering, Faculty of Engineering, Multimedia Data Analytics and Processing Research Unit, Chulalongkorn University, Bangkok, Thailand |
| authorships[2].author.id | https://openalex.org/A5069698375 |
| authorships[2].author.orcid | https://orcid.org/0000-0001-9840-3171 |
| authorships[2].author.display_name | Supavadee Aramvith |
| authorships[2].countries | TH |
| authorships[2].affiliations[0].institution_ids | https://openalex.org/I158708052 |
| authorships[2].affiliations[0].raw_affiliation_string | Department of Electrical Engineering, Faculty of Engineering, Multimedia Data Analytics and Processing Research Unit, Chulalongkorn University, Bangkok, Thailand |
| authorships[2].institutions[0].id | https://openalex.org/I158708052 |
| authorships[2].institutions[0].ror | https://ror.org/028wp3y58 |
| authorships[2].institutions[0].type | education |
| authorships[2].institutions[0].lineage | https://openalex.org/I158708052 |
| authorships[2].institutions[0].country_code | TH |
| authorships[2].institutions[0].display_name | Chulalongkorn University |
| authorships[2].author_position | last |
| authorships[2].raw_author_name | Supavadee Aramvith |
| authorships[2].is_corresponding | False |
| authorships[2].raw_affiliation_strings | Department of Electrical Engineering, Faculty of Engineering, Multimedia Data Analytics and Processing Research Unit, Chulalongkorn University, Bangkok, Thailand |
| has_content.pdf | False |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://doi.org/10.1109/access.2025.3608009 |
| open_access.oa_status | gold |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | HybridATNet: Multi-Scale Attention and Hybrid Feature Refinement Network for Remote Sensing Image Super-Resolution |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-06T03:46:38.306776 |
| primary_topic.id | https://openalex.org/T11659 |
| primary_topic.field.id | https://openalex.org/fields/22 |
| primary_topic.field.display_name | Engineering |
| primary_topic.score | 0.9718000292778015 |
| primary_topic.domain.id | https://openalex.org/domains/3 |
| primary_topic.domain.display_name | Physical Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/2214 |
| primary_topic.subfield.display_name | Media Technology |
| primary_topic.display_name | Advanced Image Fusion Techniques |
| cited_by_count | 0 |
| locations_count | 2 |
| best_oa_location.id | doi:10.1109/access.2025.3608009 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S2485537415 |
| best_oa_location.source.issn | 2169-3536 |
| best_oa_location.source.type | journal |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | 2169-3536 |
| best_oa_location.source.is_core | True |
| best_oa_location.source.is_in_doaj | True |
| best_oa_location.source.display_name | IEEE Access |
| best_oa_location.source.host_organization | https://openalex.org/P4310319808 |
| best_oa_location.source.host_organization_name | Institute of Electrical and Electronics Engineers |
| best_oa_location.source.host_organization_lineage | https://openalex.org/P4310319808 |
| best_oa_location.source.host_organization_lineage_names | Institute of Electrical and Electronics Engineers |
| best_oa_location.license | cc-by |
| best_oa_location.pdf_url | |
| best_oa_location.version | publishedVersion |
| best_oa_location.raw_type | journal-article |
| best_oa_location.license_id | https://openalex.org/licenses/cc-by |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | True |
| best_oa_location.raw_source_name | IEEE Access |
| best_oa_location.landing_page_url | https://doi.org/10.1109/access.2025.3608009 |
| primary_location.id | doi:10.1109/access.2025.3608009 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S2485537415 |
| primary_location.source.issn | 2169-3536 |
| primary_location.source.type | journal |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | 2169-3536 |
| primary_location.source.is_core | True |
| primary_location.source.is_in_doaj | True |
| primary_location.source.display_name | IEEE Access |
| primary_location.source.host_organization | https://openalex.org/P4310319808 |
| primary_location.source.host_organization_name | Institute of Electrical and Electronics Engineers |
| primary_location.source.host_organization_lineage | https://openalex.org/P4310319808 |
| primary_location.source.host_organization_lineage_names | Institute of Electrical and Electronics Engineers |
| primary_location.license | cc-by |
| primary_location.pdf_url | |
| primary_location.version | publishedVersion |
| primary_location.raw_type | journal-article |
| primary_location.license_id | https://openalex.org/licenses/cc-by |
| primary_location.is_accepted | True |
| primary_location.is_published | True |
| primary_location.raw_source_name | IEEE Access |
| primary_location.landing_page_url | https://doi.org/10.1109/access.2025.3608009 |
| publication_date | 2025-01-01 |
| publication_year | 2025 |
| referenced_works | https://openalex.org/W4309534737, https://openalex.org/W4285301580, https://openalex.org/W4381886284, https://openalex.org/W4383220087, https://openalex.org/W4391547538, https://openalex.org/W4389778595, https://openalex.org/W3121000959, https://openalex.org/W1586298956, https://openalex.org/W2604334942, https://openalex.org/W2120577161, https://openalex.org/W2138598313, https://openalex.org/W2117709736, https://openalex.org/W2038036044, https://openalex.org/W2324396614, https://openalex.org/W1885185971, https://openalex.org/W2964101377, https://openalex.org/W4294982794, https://openalex.org/W4388634307, https://openalex.org/W4385245566, https://openalex.org/W3035022492, https://openalex.org/W4287020683, https://openalex.org/W4390272253, https://openalex.org/W4398781999, https://openalex.org/W2242218935, https://openalex.org/W2866634454, https://openalex.org/W2963729050, https://openalex.org/W2963610452, https://openalex.org/W2964164961, https://openalex.org/W2976372274, https://openalex.org/W2414132572, https://openalex.org/W2621121458, https://openalex.org/W3010041600, https://openalex.org/W2097117768, https://openalex.org/W753847829, https://openalex.org/W2898832590, https://openalex.org/W2907551576, https://openalex.org/W2964951804, https://openalex.org/W3129538137, https://openalex.org/W4295308250, https://openalex.org/W4293182926, https://openalex.org/W3207918547, https://openalex.org/W4383818974, https://openalex.org/W4206433182, https://openalex.org/W4291653231, https://openalex.org/W1980038761, https://openalex.org/W2515866431, https://openalex.org/W2133665775, https://openalex.org/W2121058967, https://openalex.org/W2503339013, https://openalex.org/W4206244656, https://openalex.org/W2920074116, https://openalex.org/W3046108465, https://openalex.org/W3153239544, https://openalex.org/W4312869699, https://openalex.org/W4226219071, https://openalex.org/W4386075509, https://openalex.org/W4385805019, https://openalex.org/W4390285150, https://openalex.org/W4225466252 |
| referenced_works_count | 59 |
| abstract_inverted_index | (omitted: token-position encoding that duplicates the abstract above) |
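OpenAlex ships abstracts in this inverted form: each `abstract_inverted_index` key is a token and its value lists the word positions where that token occurs. Inverting the mapping recovers the plain-text abstract; the function name below is arbitrary, but the data shape matches the OpenAlex field.

```python
def rebuild_abstract(inverted_index):
    """Invert an OpenAlex abstract_inverted_index (token -> list of
    word positions) back into the plain abstract text."""
    positions = {}
    for token, locs in inverted_index.items():
        for pos in locs:
            positions[pos] = token          # position -> token
    return " ".join(positions[i] for i in sorted(positions))

sample = {"Remote": [0], "sensing": [1], "image": [2], "super-resolution": [3]}
print(rebuild_abstract(sample))  # Remote sensing image super-resolution
```

Applying the same inversion to the full index in this payload reproduces the abstract shown at the top of the page.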
| cited_by_percentile_year | |
| countries_distinct_count | 1 |
| institutions_distinct_count | 3 |
| citation_normalized_percentile.value | 0.56855662 |
| citation_normalized_percentile.is_in_top_1_percent | False |
| citation_normalized_percentile.is_in_top_10_percent | False |