Image Generation and Recognition Technology Based on Attention Residual GAN
Huazhe Wang, Li Ma · 2023 · Open Access · DOI: https://doi.org/10.1109/access.2023.3287854
Built on the idea of adversarial games, the Generative Adversarial Network (GAN) is a popular model in current image generation technology. However, GANs suffer from unstable training and difficult convergence, which seriously limit the effectiveness of input feature extraction and image recognition. This study introduces a residual network structure and a self-attention mechanism to compute the weight parameters of features, and then uses image label information to guide image generation. The classifier of the improved GAN (iGAN) is applied to image recognition. The experimental results show that the Fréchet Inception Distance (FID) values of the iGAN on facial expressions and behavioral actions are 77.68 and 176.84, respectively, meaning its outputs are closer to the distribution of real image data. In behavioral image recognition, the model reaches 96.8% accuracy in 30 seconds; in facial expression recognition, it reaches 90.1% accuracy in 24 seconds. These results indicate that the model generates high-quality images, has stronger feature extraction capability, and achieves higher recognition efficiency. It offers a new technical reference for further improving image processing technology and has clear application potential and value.
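As a rough illustration of the components named in the abstract (a residual structure plus a self-attention layer inside the GAN), the sketch below shows how such a block is commonly assembled in PyTorch, in the SAGAN style. This is a minimal, assumed illustration of the general technique, not the authors' exact architecture; the class names and layer sizes are hypothetical.

```python
# Minimal sketch (assumed, not the paper's exact model): a SAGAN-style
# self-attention layer combined with a residual convolutional block,
# the two ingredients the abstract attributes to the improved GAN.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfAttention(nn.Module):
    """Computes attention weights over all spatial positions of a feature map."""

    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned scale on the attention branch

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (B, HW, C/8)
        k = self.key(x).flatten(2)                     # (B, C/8, HW)
        v = self.value(x).flatten(2)                   # (B, C, HW)
        attn = F.softmax(torch.bmm(q, k), dim=-1)      # (B, HW, HW) attention weights
        out = torch.bmm(v, attn.transpose(1, 2)).view(b, c, h, w)
        return x + self.gamma * out                    # residual connection


class ResidualAttentionBlock(nn.Module):
    """Residual conv block followed by self-attention, as one generator building block."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
        )
        self.attn = SelfAttention(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.attn(F.relu(x + self.conv(x)))
```

For reference, the FID metric reported in the abstract compares the statistics of generated and real images in an Inception feature space; the standard definition (lower is better) is

$$\mathrm{FID} = \lVert \mu_r - \mu_g \rVert_2^2 + \operatorname{Tr}\!\left(\Sigma_r + \Sigma_g - 2\,(\Sigma_r \Sigma_g)^{1/2}\right),$$

where $(\mu_r, \Sigma_r)$ and $(\mu_g, \Sigma_g)$ are the feature means and covariances of real and generated images.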
Related Topics
- Generative Adversarial Networks and Image Synthesis
- Artificial Intelligence Applications
- Advanced Image Processing Techniques

Record Details
- Type: article
- Language: en
- Landing Page: https://doi.org/10.1109/access.2023.3287854
- PDF: https://ieeexplore.ieee.org/ielx7/6287639/10005208/10156812.pdf
- OA Status: gold
- Cited By: 10
- References: 36
- Related Works: 10
- OpenAlex ID: https://openalex.org/W4381327795
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4381327795 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.1109/access.2023.3287854 (Digital Object Identifier)
- Title: Image Generation and Recognition Technology Based on Attention Residual GAN (work title)
- Type: article (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2023 (year of publication)
- Publication date: 2023-01-01 (full publication date if available)
- Authors: Huazhe Wang, Li Ma (list of authors in order)
- Landing page: https://doi.org/10.1109/access.2023.3287854 (publisher landing page)
- PDF URL: https://ieeexplore.ieee.org/ielx7/6287639/10005208/10156812.pdf (direct link to full text PDF)
- Open access: Yes (whether a free full text is available)
- OA status: gold (open access status per OpenAlex)
- OA URL: https://ieeexplore.ieee.org/ielx7/6287639/10005208/10156812.pdf (direct OA link when available)
- Concepts: Computer science, Artificial intelligence, Residual, Feature extraction, Pattern recognition (psychology), Classifier (UML), Facial recognition system, Image (mathematics), Computer vision, Algorithm (top concepts attached by OpenAlex)
- Cited by: 10 (total citation count in OpenAlex)
- Citations by year (recent): 2025: 3, 2024: 7 (per-year citation counts, last 5 years)
- References (count): 36 (number of works referenced by this work)
- Related works (count): 10 (other works algorithmically related by OpenAlex)
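For readers who want the live record rather than this snapshot, the same payload can be retrieved from the public OpenAlex REST API. The snippet below is a minimal sketch, assuming the endpoint https://api.openalex.org/works/{id} and the third-party requests package; the field names mirror the table in the next section.

```python
# Minimal sketch: fetching this work's record from the OpenAlex REST API.
import requests

WORK_ID = "W4381327795"  # OpenAlex ID of this article

resp = requests.get(f"https://api.openalex.org/works/{WORK_ID}", timeout=30)
resp.raise_for_status()
work = resp.json()

print(work["title"])                     # work title
print(work["cited_by_count"])            # total citation count
print(work["open_access"]["oa_status"])  # e.g. "gold"
```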
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4381327795 |
| doi | https://doi.org/10.1109/access.2023.3287854 |
| ids.doi | https://doi.org/10.1109/access.2023.3287854 |
| ids.openalex | https://openalex.org/W4381327795 |
| fwci | 1.8196834 |
| type | article |
| title | Image Generation and Recognition Technology Based on Attention Residual GAN |
| biblio.issue | |
| biblio.volume | 11 |
| biblio.last_page | 61865 |
| biblio.first_page | 61855 |
| topics[0].id | https://openalex.org/T10775 |
| topics[0].field.id | https://openalex.org/fields/17 |
| topics[0].field.display_name | Computer Science |
| topics[0].score | 0.9927999973297119 |
| topics[0].domain.id | https://openalex.org/domains/3 |
| topics[0].domain.display_name | Physical Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/1707 |
| topics[0].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[0].display_name | Generative Adversarial Networks and Image Synthesis |
| topics[1].id | https://openalex.org/T13904 |
| topics[1].field.id | https://openalex.org/fields/17 |
| topics[1].field.display_name | Computer Science |
| topics[1].score | 0.9717000126838684 |
| topics[1].domain.id | https://openalex.org/domains/3 |
| topics[1].domain.display_name | Physical Sciences |
| topics[1].subfield.id | https://openalex.org/subfields/1702 |
| topics[1].subfield.display_name | Artificial Intelligence |
| topics[1].display_name | Artificial Intelligence Applications |
| topics[2].id | https://openalex.org/T11105 |
| topics[2].field.id | https://openalex.org/fields/17 |
| topics[2].field.display_name | Computer Science |
| topics[2].score | 0.9524999856948853 |
| topics[2].domain.id | https://openalex.org/domains/3 |
| topics[2].domain.display_name | Physical Sciences |
| topics[2].subfield.id | https://openalex.org/subfields/1707 |
| topics[2].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[2].display_name | Advanced Image Processing Techniques |
| is_xpac | False |
| apc_list.value | 1850 |
| apc_list.currency | USD |
| apc_list.value_usd | 1850 |
| apc_paid.value | 1850 |
| apc_paid.currency | USD |
| apc_paid.value_usd | 1850 |
| concepts[0].id | https://openalex.org/C41008148 |
| concepts[0].level | 0 |
| concepts[0].score | 0.7467710971832275 |
| concepts[0].wikidata | https://www.wikidata.org/wiki/Q21198 |
| concepts[0].display_name | Computer science |
| concepts[1].id | https://openalex.org/C154945302 |
| concepts[1].level | 1 |
| concepts[1].score | 0.6736008524894714 |
| concepts[1].wikidata | https://www.wikidata.org/wiki/Q11660 |
| concepts[1].display_name | Artificial intelligence |
| concepts[2].id | https://openalex.org/C155512373 |
| concepts[2].level | 2 |
| concepts[2].score | 0.6130399703979492 |
| concepts[2].wikidata | https://www.wikidata.org/wiki/Q287450 |
| concepts[2].display_name | Residual |
| concepts[3].id | https://openalex.org/C52622490 |
| concepts[3].level | 2 |
| concepts[3].score | 0.5820748805999756 |
| concepts[3].wikidata | https://www.wikidata.org/wiki/Q1026626 |
| concepts[3].display_name | Feature extraction |
| concepts[4].id | https://openalex.org/C153180895 |
| concepts[4].level | 2 |
| concepts[4].score | 0.5381814241409302 |
| concepts[4].wikidata | https://www.wikidata.org/wiki/Q7148389 |
| concepts[4].display_name | Pattern recognition (psychology) |
| concepts[5].id | https://openalex.org/C95623464 |
| concepts[5].level | 2 |
| concepts[5].score | 0.5095643997192383 |
| concepts[5].wikidata | https://www.wikidata.org/wiki/Q1096149 |
| concepts[5].display_name | Classifier (UML) |
| concepts[6].id | https://openalex.org/C31510193 |
| concepts[6].level | 3 |
| concepts[6].score | 0.4715716242790222 |
| concepts[6].wikidata | https://www.wikidata.org/wiki/Q1192553 |
| concepts[6].display_name | Facial recognition system |
| concepts[7].id | https://openalex.org/C115961682 |
| concepts[7].level | 2 |
| concepts[7].score | 0.43805402517318726 |
| concepts[7].wikidata | https://www.wikidata.org/wiki/Q860623 |
| concepts[7].display_name | Image (mathematics) |
| concepts[8].id | https://openalex.org/C31972630 |
| concepts[8].level | 1 |
| concepts[8].score | 0.433917760848999 |
| concepts[8].wikidata | https://www.wikidata.org/wiki/Q844240 |
| concepts[8].display_name | Computer vision |
| concepts[9].id | https://openalex.org/C11413529 |
| concepts[9].level | 1 |
| concepts[9].score | 0.0999341607093811 |
| concepts[9].wikidata | https://www.wikidata.org/wiki/Q8366 |
| concepts[9].display_name | Algorithm |
| keywords[0].id | https://openalex.org/keywords/computer-science |
| keywords[0].score | 0.7467710971832275 |
| keywords[0].display_name | Computer science |
| keywords[1].id | https://openalex.org/keywords/artificial-intelligence |
| keywords[1].score | 0.6736008524894714 |
| keywords[1].display_name | Artificial intelligence |
| keywords[2].id | https://openalex.org/keywords/residual |
| keywords[2].score | 0.6130399703979492 |
| keywords[2].display_name | Residual |
| keywords[3].id | https://openalex.org/keywords/feature-extraction |
| keywords[3].score | 0.5820748805999756 |
| keywords[3].display_name | Feature extraction |
| keywords[4].id | https://openalex.org/keywords/pattern-recognition |
| keywords[4].score | 0.5381814241409302 |
| keywords[4].display_name | Pattern recognition (psychology) |
| keywords[5].id | https://openalex.org/keywords/classifier |
| keywords[5].score | 0.5095643997192383 |
| keywords[5].display_name | Classifier (UML) |
| keywords[6].id | https://openalex.org/keywords/facial-recognition-system |
| keywords[6].score | 0.4715716242790222 |
| keywords[6].display_name | Facial recognition system |
| keywords[7].id | https://openalex.org/keywords/image |
| keywords[7].score | 0.43805402517318726 |
| keywords[7].display_name | Image (mathematics) |
| keywords[8].id | https://openalex.org/keywords/computer-vision |
| keywords[8].score | 0.433917760848999 |
| keywords[8].display_name | Computer vision |
| keywords[9].id | https://openalex.org/keywords/algorithm |
| keywords[9].score | 0.0999341607093811 |
| keywords[9].display_name | Algorithm |
| language | en |
| locations[0].id | doi:10.1109/access.2023.3287854 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S2485537415 |
| locations[0].source.issn | 2169-3536 |
| locations[0].source.type | journal |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | 2169-3536 |
| locations[0].source.is_core | True |
| locations[0].source.is_in_doaj | True |
| locations[0].source.display_name | IEEE Access |
| locations[0].source.host_organization | https://openalex.org/P4310319808 |
| locations[0].source.host_organization_name | Institute of Electrical and Electronics Engineers |
| locations[0].source.host_organization_lineage | https://openalex.org/P4310319808 |
| locations[0].source.host_organization_lineage_names | Institute of Electrical and Electronics Engineers |
| locations[0].license | |
| locations[0].pdf_url | https://ieeexplore.ieee.org/ielx7/6287639/10005208/10156812.pdf |
| locations[0].version | publishedVersion |
| locations[0].raw_type | journal-article |
| locations[0].license_id | |
| locations[0].is_accepted | True |
| locations[0].is_published | True |
| locations[0].raw_source_name | IEEE Access |
| locations[0].landing_page_url | https://doi.org/10.1109/access.2023.3287854 |
| locations[1].id | pmh:oai:doaj.org/article:768780d801424eb3bc2a16557065a866 |
| locations[1].is_oa | False |
| locations[1].source.id | https://openalex.org/S4306401280 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | False |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | DOAJ (DOAJ: Directory of Open Access Journals) |
| locations[1].source.host_organization | |
| locations[1].source.host_organization_name | |
| locations[1].license | |
| locations[1].pdf_url | |
| locations[1].version | submittedVersion |
| locations[1].raw_type | article |
| locations[1].license_id | |
| locations[1].is_accepted | False |
| locations[1].is_published | False |
| locations[1].raw_source_name | IEEE Access, Vol 11, Pp 61855-61865 (2023) |
| locations[1].landing_page_url | https://doaj.org/article/768780d801424eb3bc2a16557065a866 |
| indexed_in | crossref, doaj |
| authorships[0].author.id | https://openalex.org/A5078387621 |
| authorships[0].author.orcid | https://orcid.org/0009-0002-5839-5220 |
| authorships[0].author.display_name | Huazhe Wang |
| authorships[0].countries | CN |
| authorships[0].affiliations[0].institution_ids | https://openalex.org/I4210099388 |
| authorships[0].affiliations[0].raw_affiliation_string | College of Computer Engineering, Shangqiu Polytechnic, Shangqiu, China |
| authorships[0].institutions[0].id | https://openalex.org/I4210099388 |
| authorships[0].institutions[0].ror | https://ror.org/016qtng06 |
| authorships[0].institutions[0].type | education |
| authorships[0].institutions[0].lineage | https://openalex.org/I4210099388 |
| authorships[0].institutions[0].country_code | CN |
| authorships[0].institutions[0].display_name | Shangqiu Institute of Technology |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Huazhe Wang |
| authorships[0].is_corresponding | False |
| authorships[0].raw_affiliation_strings | College of Computer Engineering, Shangqiu Polytechnic, Shangqiu, China |
| authorships[1].author.id | https://openalex.org/A5100462130 |
| authorships[1].author.orcid | https://orcid.org/0000-0002-5353-3543 |
| authorships[1].author.display_name | Li Ma |
| authorships[1].countries | CN |
| authorships[1].affiliations[0].institution_ids | https://openalex.org/I4210099388 |
| authorships[1].affiliations[0].raw_affiliation_string | Soft Vocational Technology Institute, Shangqiu Polytechnic, Shangqiu, China |
| authorships[1].institutions[0].id | https://openalex.org/I4210099388 |
| authorships[1].institutions[0].ror | https://ror.org/016qtng06 |
| authorships[1].institutions[0].type | education |
| authorships[1].institutions[0].lineage | https://openalex.org/I4210099388 |
| authorships[1].institutions[0].country_code | CN |
| authorships[1].institutions[0].display_name | Shangqiu Institute of Technology |
| authorships[1].author_position | last |
| authorships[1].raw_author_name | Li Ma |
| authorships[1].is_corresponding | False |
| authorships[1].raw_affiliation_strings | Soft Vocational Technology Institute, Shangqiu Polytechnic, Shangqiu, China |
| has_content.pdf | True |
| has_content.grobid_xml | True |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://ieeexplore.ieee.org/ielx7/6287639/10005208/10156812.pdf |
| open_access.oa_status | gold |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | Image Generation and Recognition Technology Based on Attention Residual GAN |
| has_fulltext | True |
| is_retracted | False |
| updated_date | 2025-11-06T03:46:38.306776 |
| primary_topic.id | https://openalex.org/T10775 |
| primary_topic.field.id | https://openalex.org/fields/17 |
| primary_topic.field.display_name | Computer Science |
| primary_topic.score | 0.9927999973297119 |
| primary_topic.domain.id | https://openalex.org/domains/3 |
| primary_topic.domain.display_name | Physical Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/1707 |
| primary_topic.subfield.display_name | Computer Vision and Pattern Recognition |
| primary_topic.display_name | Generative Adversarial Networks and Image Synthesis |
| related_works | https://openalex.org/W2560215812, https://openalex.org/W2949601986, https://openalex.org/W2788972299, https://openalex.org/W2521347458, https://openalex.org/W2498789492, https://openalex.org/W2729981612, https://openalex.org/W2925692864, https://openalex.org/W4233449973, https://openalex.org/W4391013256, https://openalex.org/W2985118265 |
| cited_by_count | 10 |
| counts_by_year[0].year | 2025 |
| counts_by_year[0].cited_by_count | 3 |
| counts_by_year[1].year | 2024 |
| counts_by_year[1].cited_by_count | 7 |
| locations_count | 2 |
| best_oa_location.id | doi:10.1109/access.2023.3287854 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S2485537415 |
| best_oa_location.source.issn | 2169-3536 |
| best_oa_location.source.type | journal |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | 2169-3536 |
| best_oa_location.source.is_core | True |
| best_oa_location.source.is_in_doaj | True |
| best_oa_location.source.display_name | IEEE Access |
| best_oa_location.source.host_organization | https://openalex.org/P4310319808 |
| best_oa_location.source.host_organization_name | Institute of Electrical and Electronics Engineers |
| best_oa_location.source.host_organization_lineage | https://openalex.org/P4310319808 |
| best_oa_location.source.host_organization_lineage_names | Institute of Electrical and Electronics Engineers |
| best_oa_location.license | |
| best_oa_location.pdf_url | https://ieeexplore.ieee.org/ielx7/6287639/10005208/10156812.pdf |
| best_oa_location.version | publishedVersion |
| best_oa_location.raw_type | journal-article |
| best_oa_location.license_id | |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | True |
| best_oa_location.raw_source_name | IEEE Access |
| best_oa_location.landing_page_url | https://doi.org/10.1109/access.2023.3287854 |
| primary_location.id | doi:10.1109/access.2023.3287854 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S2485537415 |
| primary_location.source.issn | 2169-3536 |
| primary_location.source.type | journal |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | 2169-3536 |
| primary_location.source.is_core | True |
| primary_location.source.is_in_doaj | True |
| primary_location.source.display_name | IEEE Access |
| primary_location.source.host_organization | https://openalex.org/P4310319808 |
| primary_location.source.host_organization_name | Institute of Electrical and Electronics Engineers |
| primary_location.source.host_organization_lineage | https://openalex.org/P4310319808 |
| primary_location.source.host_organization_lineage_names | Institute of Electrical and Electronics Engineers |
| primary_location.license | |
| primary_location.pdf_url | https://ieeexplore.ieee.org/ielx7/6287639/10005208/10156812.pdf |
| primary_location.version | publishedVersion |
| primary_location.raw_type | journal-article |
| primary_location.license_id | |
| primary_location.is_accepted | True |
| primary_location.is_published | True |
| primary_location.raw_source_name | IEEE Access |
| primary_location.landing_page_url | https://doi.org/10.1109/access.2023.3287854 |
| publication_date | 2023-01-01 |
| publication_year | 2023 |
| referenced_works | https://openalex.org/W4205204006, https://openalex.org/W3126434301, https://openalex.org/W3191359713, https://openalex.org/W3109809751, https://openalex.org/W3141929194, https://openalex.org/W4224059121, https://openalex.org/W3160499679, https://openalex.org/W4306411981, https://openalex.org/W4211033763, https://openalex.org/W3122043357, https://openalex.org/W4210482911, https://openalex.org/W3151042244, https://openalex.org/W3164317619, https://openalex.org/W4310846220, https://openalex.org/W4313472426, https://openalex.org/W3155375169, https://openalex.org/W3108277758, https://openalex.org/W3034920607, https://openalex.org/W6794711119, https://openalex.org/W4313340187, https://openalex.org/W3173857193, https://openalex.org/W3161934848, https://openalex.org/W4210768519, https://openalex.org/W3216322097, https://openalex.org/W3193773061, https://openalex.org/W3165822876, https://openalex.org/W3198343608, https://openalex.org/W4297267242, https://openalex.org/W3175504132, https://openalex.org/W4220795864, https://openalex.org/W4212903198, https://openalex.org/W3031060177, https://openalex.org/W4214874138, https://openalex.org/W3163977797, https://openalex.org/W4206958764, https://openalex.org/W4229019212 |
| referenced_works_count | 36 |
| abstract_inverted_index.a | 13, 173 |
| abstract_inverted_index.24 | 149 |
| abstract_inverted_index.30 | 132 |
| abstract_inverted_index.In | 0, 116, 134 |
| abstract_inverted_index.as | 26 |
| abstract_inverted_index.in | 16, 95 |
| abstract_inverted_index.is | 12, 75, 125, 131 |
| abstract_inverted_index.it | 155 |
| abstract_inverted_index.of | 5, 37, 59, 92, 112, 122, 143, 181 |
| abstract_inverted_index.to | 54, 77, 109 |
| abstract_inverted_index.GAN | 22, 72 |
| abstract_inverted_index.The | 44, 70, 80 |
| abstract_inverted_index.and | 29, 41, 50, 61, 98, 103, 127, 140, 148, 165, 185, 190 |
| abstract_inverted_index.are | 101, 107, 146 |
| abstract_inverted_index.can | 156 |
| abstract_inverted_index.for | 177 |
| abstract_inverted_index.has | 23, 160, 166, 186 |
| abstract_inverted_index.new | 174 |
| abstract_inverted_index.the | 3, 35, 56, 86, 93, 110, 120, 123, 128, 138, 144, 178 |
| abstract_inverted_index.This | 152, 170 |
| abstract_inverted_index.data | 83 |
| abstract_inverted_index.game | 6 |
| abstract_inverted_index.iGAN | 94 |
| abstract_inverted_index.real | 113 |
| abstract_inverted_index.self | 51 |
| abstract_inverted_index.such | 25 |
| abstract_inverted_index.that | 85, 154 |
| abstract_inverted_index.then | 62 |
| abstract_inverted_index.time | 130, 142 |
| abstract_inverted_index.with | 2 |
| abstract_inverted_index.(FID) | 90 |
| abstract_inverted_index.(GAN) | 11 |
| abstract_inverted_index.77.68 | 102 |
| abstract_inverted_index.data. | 115 |
| abstract_inverted_index.final | 81 |
| abstract_inverted_index.image | 18, 42, 64, 67, 78, 114, 118, 182 |
| abstract_inverted_index.input | 38 |
| abstract_inverted_index.label | 68 |
| abstract_inverted_index.model | 15, 73, 124, 145, 171 |
| abstract_inverted_index.shows | 84 |
| abstract_inverted_index.study | 45 |
| abstract_inverted_index.which | 32, 106 |
| abstract_inverted_index.affect | 34 |
| abstract_inverted_index.closer | 108 |
| abstract_inverted_index.facial | 96, 135 |
| abstract_inverted_index.guides | 63 |
| abstract_inverted_index.higher | 167 |
| abstract_inverted_index.value. | 191 |
| abstract_inverted_index.values | 91 |
| abstract_inverted_index.weight | 57 |
| abstract_inverted_index.176.84, | 104 |
| abstract_inverted_index.Network | 10 |
| abstract_inverted_index.actions | 100 |
| abstract_inverted_index.applied | 76 |
| abstract_inverted_index.certain | 187 |
| abstract_inverted_index.concept | 4 |
| abstract_inverted_index.current | 17 |
| abstract_inverted_index.feature | 39, 162 |
| abstract_inverted_index.further | 179 |
| abstract_inverted_index.images, | 159 |
| abstract_inverted_index.network | 48 |
| abstract_inverted_index.popular | 14 |
| abstract_inverted_index.through | 66 |
| abstract_inverted_index.Distance | 89 |
| abstract_inverted_index.However, | 21 |
| abstract_inverted_index.accuracy | 121, 139 |
| abstract_inverted_index.generate | 157 |
| abstract_inverted_index.improved | 71 |
| abstract_inverted_index.problems | 24 |
| abstract_inverted_index.provides | 172 |
| abstract_inverted_index.required | 129 |
| abstract_inverted_index.residual | 47 |
| abstract_inverted_index.seconds, | 150 |
| abstract_inverted_index.seconds. | 133 |
| abstract_inverted_index.stronger | 161 |
| abstract_inverted_index.training | 28 |
| abstract_inverted_index.unstable | 27 |
| abstract_inverted_index.Inception | 88 |
| abstract_inverted_index.attention | 52 |
| abstract_inverted_index.calculate | 55 |
| abstract_inverted_index.difficult | 30 |
| abstract_inverted_index.features, | 60 |
| abstract_inverted_index.indicates | 153 |
| abstract_inverted_index.mechanism | 53 |
| abstract_inverted_index.potential | 189 |
| abstract_inverted_index.reference | 176 |
| abstract_inverted_index.seriously | 33 |
| abstract_inverted_index.structure | 49 |
| abstract_inverted_index.technical | 175 |
| abstract_inverted_index.Generative | 8 |
| abstract_inverted_index.accordance | 1 |
| abstract_inverted_index.behavioral | 99, 117 |
| abstract_inverted_index.classifier | 74 |
| abstract_inverted_index.expression | 136 |
| abstract_inverted_index.extraction | 40, 163 |
| abstract_inverted_index.generation | 19, 65 |
| abstract_inverted_index.introduces | 46 |
| abstract_inverted_index.parameters | 58 |
| abstract_inverted_index.processing | 183 |
| abstract_inverted_index.Adversarial | 9 |
| abstract_inverted_index.antagonism, | 7 |
| abstract_inverted_index.application | 188 |
| abstract_inverted_index.efficiency. | 169 |
| abstract_inverted_index.expressions | 97 |
| abstract_inverted_index.improvement | 180 |
| abstract_inverted_index.recognition | 141, 168 |
| abstract_inverted_index.technology, | 184 |
| abstract_inverted_index.technology. | 20 |
| abstract_inverted_index.90.1% | 147 |
| abstract_inverted_index.convergence, | 31 |
| abstract_inverted_index.distribution | 111 |
| abstract_inverted_index.experimental | 82 |
| abstract_inverted_index.high-quality | 158 |
| abstract_inverted_index.information. | 69 |
| abstract_inverted_index.recognition, | 119, 137 |
| abstract_inverted_index.recognition. | 43, 79 |
| abstract_inverted_index.96.8%, | 126 |
| abstract_inverted_index.capabilities, | 164 |
| abstract_inverted_index.effectiveness | 36 |
| abstract_inverted_index.respectively, | 105 |
| abstract_inverted_index.respectively. | 151 |
| abstract_inverted_index.Fréchet | 87 |
| cited_by_percentile_year.max | 99 |
| cited_by_percentile_year.min | 97 |
| countries_distinct_count | 1 |
| institutions_distinct_count | 2 |
| citation_normalized_percentile.value | 0.8409789 |
| citation_normalized_percentile.is_in_top_1_percent | False |
| citation_normalized_percentile.is_in_top_10_percent | False |
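The abstract_inverted_index rows above encode the abstract as a map from each token to the word positions where it occurs. A short sketch of turning that index back into plain text (assuming this token-to-positions structure) follows; applied to the full index, it reproduces the abstract shown at the top of the record.

```python
# Minimal sketch: rebuilding plain text from an OpenAlex-style
# abstract_inverted_index, which maps each token to the positions
# where it occurs in the abstract.
def reconstruct_abstract(inverted_index):
    positions = {}
    for token, indices in inverted_index.items():
        for i in indices:
            positions[i] = token
    return " ".join(token for _, token in sorted(positions.items()))

# Tiny slice of the index from the table above:
sample = {"In": [0], "accordance": [1], "with": [2], "the": [3], "concept": [4]}
print(reconstruct_abstract(sample))  # -> "In accordance with the concept"
```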