Multi-Style Shape Matching GAN for Text Images
2024 · Open Access
DOI: https://doi.org/10.1587/transinf.2023ihp0010
Deep learning techniques are used to transform the style of images and produce diverse images. In the field of text style transfer, many previous studies have attempted to generate stylized text using deep learning networks. However, to achieve multiple style transformations for text images, the methods proposed in previous studies either require training multiple networks or cannot be guided by style images. Thus, in this study, we focus on multi-style transformation of text images, using style images to guide the generated results. We propose a multiple-style transformation network for text style transfer, which we refer to as the Multi-Style Shape Matching GAN (Multi-Style SMGAN). The proposed method generates multiple styles of text images with a single model trained only once, and allows users to control the text style according to a style image. It introduces conditions into the network so that all styles can be distinguished effectively, and the generation of each styled text can be controlled by these conditions. The network is optimized so that this conditional information is transmitted effectively throughout it. The proposed method was evaluated experimentally on a large number of text images, and the results show that the trained model can generate multiple-style text in real time according to the style image. In addition, the results of a user survey indicate that the proposed method produces higher-quality results than existing methods.
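The abstract describes a single network that distinguishes styles via injected conditions. A common way to realize this kind of conditioning in conditional GANs is to tile a one-hot style code over the spatial dimensions and concatenate it to the image channels; the sketch below shows that generic approach with NumPy, not the paper's exact scheme.

```python
import numpy as np

def concat_style_condition(image: np.ndarray, style_idx: int, num_styles: int) -> np.ndarray:
    """Concatenate a spatially tiled one-hot style condition to an image tensor.

    image: array of shape (C, H, W). Returns shape (C + num_styles, H, W).
    Generic conditional-GAN input conditioning; illustrative only.
    """
    c, h, w = image.shape
    onehot = np.zeros((num_styles, h, w), dtype=image.dtype)
    onehot[style_idx] = 1.0  # mark the selected style channel everywhere
    return np.concatenate([image, onehot], axis=0)

# A 3-channel 8x8 input conditioned on style 2 of 4:
x = np.zeros((3, 8, 8), dtype=np.float32)
y = concat_style_condition(x, style_idx=2, num_styles=4)
print(y.shape)  # (7, 8, 8)
```

With this kind of input, one generator can be steered toward any of the learned styles at inference time simply by changing `style_idx`.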
Related Topics
- Type: article
- Language: en
- Landing Page: https://doi.org/10.1587/transinf.2023ihp0010
- PDF: https://www.jstage.jst.go.jp/article/transinf/E107.D/4/E107.D_2023IHP0010/_pdf
- OA Status: diamond
- Cited By: 2
- References: 28
- Related Works: 10
- OpenAlex ID: https://openalex.org/W4393352246
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4393352246 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.1587/transinf.2023ihp0010 (Digital Object Identifier)
- Title: Multi-Style Shape Matching GAN for Text Images (work title)
- Type: article (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2024 (year of publication)
- Publication date: 2024-03-31 (full publication date if available)
- Authors: Honghui Yuan, Keiji Yanai (list of authors in order)
- Landing page: https://doi.org/10.1587/transinf.2023ihp0010 (publisher landing page)
- PDF URL: https://www.jstage.jst.go.jp/article/transinf/E107.D/4/E107.D_2023IHP0010/_pdf (direct link to full-text PDF)
- Open access: Yes (whether a free full text is available)
- OA status: diamond (open access status per OpenAlex)
- OA URL: https://www.jstage.jst.go.jp/article/transinf/E107.D/4/E107.D_2023IHP0010/_pdf (direct OA link when available)
- Concepts: Computer science, Style (visual arts), Matching (statistics), Computer vision, Artificial intelligence, Pattern recognition (psychology), Art, Mathematics, Statistics, Visual arts (top concepts attached by OpenAlex)
- Cited by: 2 (total citation count in OpenAlex)
- Citations by year (recent): 2025: 1, 2024: 1 (per-year citation counts, last 5 years)
- References (count): 28 (number of works referenced by this work)
- Related works (count): 10 (other works algorithmically related by OpenAlex)
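The fields above can be retrieved live from the OpenAlex API, which serves each work at `https://api.openalex.org/works/{id}`. A minimal sketch using only the standard library; the network fetch is left commented out so the snippet runs offline:

```python
import json
from urllib.request import urlopen

def openalex_api_url(openalex_id: str) -> str:
    """Map a canonical OpenAlex work URL (or a bare work ID) to its API endpoint."""
    work_id = openalex_id.rsplit("/", 1)[-1]  # e.g. "W4393352246"
    return f"https://api.openalex.org/works/{work_id}"

url = openalex_api_url("https://openalex.org/W4393352246")
print(url)  # https://api.openalex.org/works/W4393352246

# Uncomment to fetch the live record (requires network access):
# with urlopen(url) as resp:
#     work = json.load(resp)
#     print(work["display_name"])
```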
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4393352246 |
| doi | https://doi.org/10.1587/transinf.2023ihp0010 |
| ids.doi | https://doi.org/10.1587/transinf.2023ihp0010 |
| ids.openalex | https://openalex.org/W4393352246 |
| fwci | 1.06031512 |
| type | article |
| title | Multi-Style Shape Matching GAN for Text Images |
| biblio.issue | 4 |
| biblio.volume | E107.D |
| biblio.last_page | 514 |
| biblio.first_page | 505 |
| topics[0].id | https://openalex.org/T10824 |
| topics[0].field.id | https://openalex.org/fields/17 |
| topics[0].field.display_name | Computer Science |
| topics[0].score | 0.9980000257492065 |
| topics[0].domain.id | https://openalex.org/domains/3 |
| topics[0].domain.display_name | Physical Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/1707 |
| topics[0].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[0].display_name | Image Retrieval and Classification Techniques |
| topics[1].id | https://openalex.org/T10601 |
| topics[1].field.id | https://openalex.org/fields/17 |
| topics[1].field.display_name | Computer Science |
| topics[1].score | 0.9923999905586243 |
| topics[1].domain.id | https://openalex.org/domains/3 |
| topics[1].domain.display_name | Physical Sciences |
| topics[1].subfield.id | https://openalex.org/subfields/1707 |
| topics[1].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[1].display_name | Handwritten Text Recognition Techniques |
| topics[2].id | https://openalex.org/T10627 |
| topics[2].field.id | https://openalex.org/fields/17 |
| topics[2].field.display_name | Computer Science |
| topics[2].score | 0.9837999939918518 |
| topics[2].domain.id | https://openalex.org/domains/3 |
| topics[2].domain.display_name | Physical Sciences |
| topics[2].subfield.id | https://openalex.org/subfields/1707 |
| topics[2].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[2].display_name | Advanced Image and Video Retrieval Techniques |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| concepts[0].id | https://openalex.org/C41008148 |
| concepts[0].level | 0 |
| concepts[0].score | 0.8467285633087158 |
| concepts[0].wikidata | https://www.wikidata.org/wiki/Q21198 |
| concepts[0].display_name | Computer science |
| concepts[1].id | https://openalex.org/C2776445246 |
| concepts[1].level | 2 |
| concepts[1].score | 0.694674551486969 |
| concepts[1].wikidata | https://www.wikidata.org/wiki/Q1792644 |
| concepts[1].display_name | Style (visual arts) |
| concepts[2].id | https://openalex.org/C165064840 |
| concepts[2].level | 2 |
| concepts[2].score | 0.620708703994751 |
| concepts[2].wikidata | https://www.wikidata.org/wiki/Q1321061 |
| concepts[2].display_name | Matching (statistics) |
| concepts[3].id | https://openalex.org/C31972630 |
| concepts[3].level | 1 |
| concepts[3].score | 0.4791121482849121 |
| concepts[3].wikidata | https://www.wikidata.org/wiki/Q844240 |
| concepts[3].display_name | Computer vision |
| concepts[4].id | https://openalex.org/C154945302 |
| concepts[4].level | 1 |
| concepts[4].score | 0.47413942217826843 |
| concepts[4].wikidata | https://www.wikidata.org/wiki/Q11660 |
| concepts[4].display_name | Artificial intelligence |
| concepts[5].id | https://openalex.org/C153180895 |
| concepts[5].level | 2 |
| concepts[5].score | 0.32508206367492676 |
| concepts[5].wikidata | https://www.wikidata.org/wiki/Q7148389 |
| concepts[5].display_name | Pattern recognition (psychology) |
| concepts[6].id | https://openalex.org/C142362112 |
| concepts[6].level | 0 |
| concepts[6].score | 0.08383002877235413 |
| concepts[6].wikidata | https://www.wikidata.org/wiki/Q735 |
| concepts[6].display_name | Art |
| concepts[7].id | https://openalex.org/C33923547 |
| concepts[7].level | 0 |
| concepts[7].score | 0.06567621231079102 |
| concepts[7].wikidata | https://www.wikidata.org/wiki/Q395 |
| concepts[7].display_name | Mathematics |
| concepts[8].id | https://openalex.org/C105795698 |
| concepts[8].level | 1 |
| concepts[8].score | 0.05123540759086609 |
| concepts[8].wikidata | https://www.wikidata.org/wiki/Q12483 |
| concepts[8].display_name | Statistics |
| concepts[9].id | https://openalex.org/C153349607 |
| concepts[9].level | 1 |
| concepts[9].score | 0.04595762491226196 |
| concepts[9].wikidata | https://www.wikidata.org/wiki/Q36649 |
| concepts[9].display_name | Visual arts |
| keywords[0].id | https://openalex.org/keywords/computer-science |
| keywords[0].score | 0.8467285633087158 |
| keywords[0].display_name | Computer science |
| keywords[1].id | https://openalex.org/keywords/style |
| keywords[1].score | 0.694674551486969 |
| keywords[1].display_name | Style (visual arts) |
| keywords[2].id | https://openalex.org/keywords/matching |
| keywords[2].score | 0.620708703994751 |
| keywords[2].display_name | Matching (statistics) |
| keywords[3].id | https://openalex.org/keywords/computer-vision |
| keywords[3].score | 0.4791121482849121 |
| keywords[3].display_name | Computer vision |
| keywords[4].id | https://openalex.org/keywords/artificial-intelligence |
| keywords[4].score | 0.47413942217826843 |
| keywords[4].display_name | Artificial intelligence |
| keywords[5].id | https://openalex.org/keywords/pattern-recognition |
| keywords[5].score | 0.32508206367492676 |
| keywords[5].display_name | Pattern recognition (psychology) |
| keywords[6].id | https://openalex.org/keywords/art |
| keywords[6].score | 0.08383002877235413 |
| keywords[6].display_name | Art |
| keywords[7].id | https://openalex.org/keywords/mathematics |
| keywords[7].score | 0.06567621231079102 |
| keywords[7].display_name | Mathematics |
| keywords[8].id | https://openalex.org/keywords/statistics |
| keywords[8].score | 0.05123540759086609 |
| keywords[8].display_name | Statistics |
| keywords[9].id | https://openalex.org/keywords/visual-arts |
| keywords[9].score | 0.04595762491226196 |
| keywords[9].display_name | Visual arts |
| language | en |
| locations[0].id | doi:10.1587/transinf.2023ihp0010 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S2486202937 |
| locations[0].source.issn | 0916-8532, 1745-1361 |
| locations[0].source.type | journal |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | 0916-8532 |
| locations[0].source.is_core | True |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | IEICE Transactions on Information and Systems |
| locations[0].source.host_organization | https://openalex.org/P4320800604 |
| locations[0].source.host_organization_name | Institute of Electronics, Information and Communication Engineers |
| locations[0].source.host_organization_lineage | https://openalex.org/P4320800604 |
| locations[0].source.host_organization_lineage_names | Institute of Electronics, Information and Communication Engineers |
| locations[0].license | |
| locations[0].pdf_url | https://www.jstage.jst.go.jp/article/transinf/E107.D/4/E107.D_2023IHP0010/_pdf |
| locations[0].version | publishedVersion |
| locations[0].raw_type | journal-article |
| locations[0].license_id | |
| locations[0].is_accepted | True |
| locations[0].is_published | True |
| locations[0].raw_source_name | IEICE Transactions on Information and Systems |
| locations[0].landing_page_url | https://doi.org/10.1587/transinf.2023ihp0010 |
| indexed_in | crossref |
| authorships[0].author.id | https://openalex.org/A5101325557 |
| authorships[0].author.orcid | https://orcid.org/0009-0001-4334-9363 |
| authorships[0].author.display_name | Honghui Yuan |
| authorships[0].countries | JP |
| authorships[0].affiliations[0].institution_ids | https://openalex.org/I20529979 |
| authorships[0].affiliations[0].raw_affiliation_string | Department of Informatics, The University of Electro-Communications |
| authorships[0].institutions[0].id | https://openalex.org/I20529979 |
| authorships[0].institutions[0].ror | https://ror.org/02x73b849 |
| authorships[0].institutions[0].type | education |
| authorships[0].institutions[0].lineage | https://openalex.org/I20529979 |
| authorships[0].institutions[0].country_code | JP |
| authorships[0].institutions[0].display_name | University of Electro-Communications |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Honghui YUAN |
| authorships[0].is_corresponding | False |
| authorships[0].raw_affiliation_strings | Department of Informatics, The University of Electro-Communications |
| authorships[1].author.id | https://openalex.org/A5054600485 |
| authorships[1].author.orcid | https://orcid.org/0000-0002-0431-183X |
| authorships[1].author.display_name | Keiji Yanai |
| authorships[1].countries | JP |
| authorships[1].affiliations[0].institution_ids | https://openalex.org/I20529979 |
| authorships[1].affiliations[0].raw_affiliation_string | Department of Informatics, The University of Electro-Communications |
| authorships[1].institutions[0].id | https://openalex.org/I20529979 |
| authorships[1].institutions[0].ror | https://ror.org/02x73b849 |
| authorships[1].institutions[0].type | education |
| authorships[1].institutions[0].lineage | https://openalex.org/I20529979 |
| authorships[1].institutions[0].country_code | JP |
| authorships[1].institutions[0].display_name | University of Electro-Communications |
| authorships[1].author_position | last |
| authorships[1].raw_author_name | Keiji YANAI |
| authorships[1].is_corresponding | False |
| authorships[1].raw_affiliation_strings | Department of Informatics, The University of Electro-Communications |
| has_content.pdf | True |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://www.jstage.jst.go.jp/article/transinf/E107.D/4/E107.D_2023IHP0010/_pdf |
| open_access.oa_status | diamond |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | Multi-Style Shape Matching GAN for Text Images |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-06T03:46:38.306776 |
| primary_topic.id | https://openalex.org/T10824 |
| primary_topic.field.id | https://openalex.org/fields/17 |
| primary_topic.field.display_name | Computer Science |
| primary_topic.score | 0.9980000257492065 |
| primary_topic.domain.id | https://openalex.org/domains/3 |
| primary_topic.domain.display_name | Physical Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/1707 |
| primary_topic.subfield.display_name | Computer Vision and Pattern Recognition |
| primary_topic.display_name | Image Retrieval and Classification Techniques |
| related_works | https://openalex.org/W2058170566, https://openalex.org/W2755342338, https://openalex.org/W2772917594, https://openalex.org/W2775347418, https://openalex.org/W2166024367, https://openalex.org/W3116076068, https://openalex.org/W2229312674, https://openalex.org/W2951359407, https://openalex.org/W2079911747, https://openalex.org/W1969923398 |
| cited_by_count | 2 |
| counts_by_year[0].year | 2025 |
| counts_by_year[0].cited_by_count | 1 |
| counts_by_year[1].year | 2024 |
| counts_by_year[1].cited_by_count | 1 |
| locations_count | 1 |
| best_oa_location.id | doi:10.1587/transinf.2023ihp0010 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S2486202937 |
| best_oa_location.source.issn | 0916-8532, 1745-1361 |
| best_oa_location.source.type | journal |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | 0916-8532 |
| best_oa_location.source.is_core | True |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | IEICE Transactions on Information and Systems |
| best_oa_location.source.host_organization | https://openalex.org/P4320800604 |
| best_oa_location.source.host_organization_name | Institute of Electronics, Information and Communication Engineers |
| best_oa_location.source.host_organization_lineage | https://openalex.org/P4320800604 |
| best_oa_location.source.host_organization_lineage_names | Institute of Electronics, Information and Communication Engineers |
| best_oa_location.license | |
| best_oa_location.pdf_url | https://www.jstage.jst.go.jp/article/transinf/E107.D/4/E107.D_2023IHP0010/_pdf |
| best_oa_location.version | publishedVersion |
| best_oa_location.raw_type | journal-article |
| best_oa_location.license_id | |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | True |
| best_oa_location.raw_source_name | IEICE Transactions on Information and Systems |
| best_oa_location.landing_page_url | https://doi.org/10.1587/transinf.2023ihp0010 |
| primary_location.id | doi:10.1587/transinf.2023ihp0010 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S2486202937 |
| primary_location.source.issn | 0916-8532, 1745-1361 |
| primary_location.source.type | journal |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | 0916-8532 |
| primary_location.source.is_core | True |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | IEICE Transactions on Information and Systems |
| primary_location.source.host_organization | https://openalex.org/P4320800604 |
| primary_location.source.host_organization_name | Institute of Electronics, Information and Communication Engineers |
| primary_location.source.host_organization_lineage | https://openalex.org/P4320800604 |
| primary_location.source.host_organization_lineage_names | Institute of Electronics, Information and Communication Engineers |
| primary_location.license | |
| primary_location.pdf_url | https://www.jstage.jst.go.jp/article/transinf/E107.D/4/E107.D_2023IHP0010/_pdf |
| primary_location.version | publishedVersion |
| primary_location.raw_type | journal-article |
| primary_location.license_id | |
| primary_location.is_accepted | True |
| primary_location.is_published | True |
| primary_location.raw_source_name | IEICE Transactions on Information and Systems |
| primary_location.landing_page_url | https://doi.org/10.1587/transinf.2023ihp0010 |
| publication_date | 2024-03-31 |
| publication_year | 2024 |
| referenced_works | https://openalex.org/W2983541695, https://openalex.org/W2962974533, https://openalex.org/W2990269423, https://openalex.org/W3210481694, https://openalex.org/W2963073614, https://openalex.org/W2962793481, https://openalex.org/W2768959015, https://openalex.org/W2592480533, https://openalex.org/W2963890275, https://openalex.org/W2963767194, https://openalex.org/W2125389028, https://openalex.org/W2475287302, https://openalex.org/W2331128040, https://openalex.org/W2603777577, https://openalex.org/W3040462728, https://openalex.org/W2962770929, https://openalex.org/W4313029666, https://openalex.org/W3204777666, https://openalex.org/W3135367836, https://openalex.org/W3216156094, https://openalex.org/W4214926101, https://openalex.org/W2905469793, https://openalex.org/W2998185443, https://openalex.org/W2968456405, https://openalex.org/W3034446980, https://openalex.org/W2962968458, https://openalex.org/W2948930600, https://openalex.org/W3106333289 |
| referenced_works_count | 28 |
| abstract_inverted_index.a | 82, 112, 190, 220 |
| abstract_inverted_index.In | 15, 215 |
| abstract_inverted_index.We | 80 |
| abstract_inverted_index.as | 94 |
| abstract_inverted_index.be | 54, 146, 160, 177 |
| abstract_inverted_index.by | 56, 115 |
| abstract_inverted_index.in | 45, 60, 149, 208 |
| abstract_inverted_index.is | 169 |
| abstract_inverted_index.of | 9, 68, 78, 108, 155, 193, 219 |
| abstract_inverted_index.on | 65, 189 |
| abstract_inverted_index.or | 52 |
| abstract_inverted_index.to | 5, 25, 34, 74, 93, 124, 130, 138, 163, 211, 234 |
| abstract_inverted_index.we | 63, 91 |
| abstract_inverted_index.GAN | 99 |
| abstract_inverted_index.The | 102, 133, 166, 183 |
| abstract_inverted_index.all | 143 |
| abstract_inverted_index.and | 11, 121, 152, 196 |
| abstract_inverted_index.are | 3 |
| abstract_inverted_index.can | 145, 159, 176, 204 |
| abstract_inverted_index.for | 39, 86 |
| abstract_inverted_index.the | 7, 16, 42, 76, 95, 117, 126, 139, 150, 153, 173, 181, 197, 201, 212, 217, 226 |
| abstract_inverted_index.was | 186 |
| abstract_inverted_index.Deep | 0 |
| abstract_inverted_index.deep | 30 |
| abstract_inverted_index.each | 156 |
| abstract_inverted_index.many | 21 |
| abstract_inverted_index.only | 119 |
| abstract_inverted_index.show | 199 |
| abstract_inverted_index.such | 141, 171 |
| abstract_inverted_index.text | 17, 28, 40, 69, 87, 109, 127, 158, 194, 207 |
| abstract_inverted_index.that | 142, 172, 200, 225 |
| abstract_inverted_index.this | 61 |
| abstract_inverted_index.used | 4 |
| abstract_inverted_index.user | 221 |
| abstract_inverted_index.Shape | 97 |
| abstract_inverted_index.Thus, | 59 |
| abstract_inverted_index.guide | 75 |
| abstract_inverted_index.large | 191 |
| abstract_inverted_index.model | 114, 118, 203 |
| abstract_inverted_index.once, | 120 |
| abstract_inverted_index.refer | 92 |
| abstract_inverted_index.study | 62, 223 |
| abstract_inverted_index.style | 8, 18, 37, 57, 72, 88, 128, 131, 213 |
| abstract_inverted_index.these | 164 |
| abstract_inverted_index.users | 123 |
| abstract_inverted_index.using | 29, 71, 111 |
| abstract_inverted_index.which | 90 |
| abstract_inverted_index.allows | 122 |
| abstract_inverted_index.cannot | 53 |
| abstract_inverted_index.field, | 20 |
| abstract_inverted_index.guided | 55 |
| abstract_inverted_index.higher | 230 |
| abstract_inverted_index.image. | 214 |
| abstract_inverted_index.images | 10, 70, 73, 110 |
| abstract_inverted_index.method | 104, 135, 185, 228 |
| abstract_inverted_index.number | 192 |
| abstract_inverted_index.single | 113 |
| abstract_inverted_index.styled | 157 |
| abstract_inverted_index.styles | 107, 144 |
| abstract_inverted_index.survey | 222 |
| abstract_inverted_index.SMGAN). | 101 |
| abstract_inverted_index.achieve | 35 |
| abstract_inverted_index.control | 125 |
| abstract_inverted_index.diverse | 13 |
| abstract_inverted_index.focused | 64 |
| abstract_inverted_index.images, | 41, 195 |
| abstract_inverted_index.images. | 14, 58, 132 |
| abstract_inverted_index.methods | 43 |
| abstract_inverted_index.network | 85, 140, 168 |
| abstract_inverted_index.produce | 12 |
| abstract_inverted_index.propose | 81 |
| abstract_inverted_index.quality | 231 |
| abstract_inverted_index.require | 48 |
| abstract_inverted_index.results | 198, 218, 232 |
| abstract_inverted_index.studies | 23, 47 |
| abstract_inverted_index.trained | 202 |
| abstract_inverted_index.However, | 33 |
| abstract_inverted_index.Matching | 98 |
| abstract_inverted_index.compared | 233 |
| abstract_inverted_index.existing | 235 |
| abstract_inverted_index.generate | 26, 205 |
| abstract_inverted_index.indicate | 224 |
| abstract_inverted_index.learning | 1, 31, 49 |
| abstract_inverted_index.methods. | 236 |
| abstract_inverted_index.multiple | 36, 50, 106 |
| abstract_inverted_index.network, | 151 |
| abstract_inverted_index.network. | 182 |
| abstract_inverted_index.networks | 51 |
| abstract_inverted_index.previous | 22, 46 |
| abstract_inverted_index.produces | 229 |
| abstract_inverted_index.proposed | 44, 103, 134, 167, 184, 227 |
| abstract_inverted_index.realtime | 209 |
| abstract_inverted_index.results. | 79 |
| abstract_inverted_index.stylized | 27 |
| abstract_inverted_index.training | 116 |
| abstract_inverted_index.according | 129, 162, 210 |
| abstract_inverted_index.addition, | 216 |
| abstract_inverted_index.attempted | 24 |
| abstract_inverted_index.evaluated | 187 |
| abstract_inverted_index.generates | 105 |
| abstract_inverted_index.networks. | 32 |
| abstract_inverted_index.optimized | 170 |
| abstract_inverted_index.transfer, | 89 |
| abstract_inverted_index.transform | 6 |
| abstract_inverted_index.conditions | 137 |
| abstract_inverted_index.controlled | 161 |
| abstract_inverted_index.generation | 77, 154 |
| abstract_inverted_index.implements | 136 |
| abstract_inverted_index.multistyle | 66 |
| abstract_inverted_index.techniques | 2 |
| abstract_inverted_index.throughout | 180 |
| abstract_inverted_index.Multi-Style | 96 |
| abstract_inverted_index.conditional | 174 |
| abstract_inverted_index.conditions. | 165 |
| abstract_inverted_index.effectively | 148, 179 |
| abstract_inverted_index.information | 175 |
| abstract_inverted_index.transmitted | 178 |
| abstract_inverted_index.(Multi-Style | 100 |
| abstract_inverted_index.distinguished | 147 |
| abstract_inverted_index.experimentally | 188 |
| abstract_inverted_index.multiple-style | 83, 206 |
| abstract_inverted_index.transformation | 19, 67, 84 |
| abstract_inverted_index.transformations | 38 |
| cited_by_percentile_year.max | 95 |
| cited_by_percentile_year.min | 90 |
| countries_distinct_count | 1 |
| institutions_distinct_count | 2 |
| sustainable_development_goals[0].id | https://metadata.un.org/sdg/4 |
| sustainable_development_goals[0].score | 0.7599999904632568 |
| sustainable_development_goals[0].display_name | Quality Education |
| citation_normalized_percentile.value | 0.67993439 |
| citation_normalized_percentile.is_in_top_1_percent | False |
| citation_normalized_percentile.is_in_top_10_percent | False |
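The `abstract_inverted_index.*` rows in the payload map each word to its positions in the abstract; the readable text can be rebuilt by sorting words on those positions. A minimal sketch (the toy index below covers only the abstract's first words, not the full index above):

```python
def reconstruct_abstract(inverted_index: dict[str, list[int]]) -> str:
    """Rebuild a plain-text abstract from an OpenAlex abstract_inverted_index."""
    positions = []
    for word, indices in inverted_index.items():
        for i in indices:
            positions.append((i, word))  # a word may occur at several positions
    positions.sort()
    return " ".join(word for _, word in positions)

toy = {"Deep": [0], "learning": [1], "techniques": [2], "are": [3], "used": [4]}
print(reconstruct_abstract(toy))  # Deep learning techniques are used
```

Applied to the full index in the payload, this yields exactly the abstract shown near the top of the page.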