LongT5: Efficient Text-To-Text Transformer for Long Sequences
2022 · Open Access
· DOI: https://doi.org/10.18653/v1/2022.findings-naacl.55
Recent work has shown that either (1) increasing the input length or (2) increasing model size can improve the performance of Transformer-based neural models. In this paper, we present LongT5, a new model that explores the effects of scaling both the input length and model size at the same time. Specifically, we integrate attention ideas from long-input transformers (ETC), and adopt pre-training strategies from summarization pre-training (PEGASUS) into the scalable T5 architecture. The result is a new attention mechanism we call Transient Global (TGlobal), which mimics ETC’s local/global attention mechanism, but without requiring additional side-inputs. We are able to achieve state-of-the-art results on several summarization and question answering tasks, as well as outperform the original T5 models on these tasks. We have open sourced our architecture and training code, as well as our pre-trained model checkpoints.
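The TGlobal mechanism described in the abstract combines a sliding local window with "transient" global tokens summarizing fixed-size blocks of the input, so attention cost stays linear in sequence length. The sketch below builds only the boolean attention mask for this pattern; it is an illustration of the idea, not the paper's implementation, and the `radius`/`block` values are toy-sized (the released models use much larger windows).

```python
import numpy as np

def tglobal_mask(n, radius=2, block=4):
    """Boolean mask: True where attention is allowed.

    Each of the n tokens attends to (a) tokens within `radius`
    positions on either side (local attention) and (b) every
    block-summary token, of which there are ceil(n / block)
    (transient global attention). Summary tokens are modeled as
    extra key positions appended after the n input positions.
    """
    n_blocks = -(-n // block)  # ceiling division
    mask = np.zeros((n, n + n_blocks), dtype=bool)
    idx = np.arange(n)
    # local window: |i - j| <= radius
    mask[:, :n] = np.abs(idx[None, :] - idx[:, None]) <= radius
    # every token may attend to every transient global token
    mask[:, n:] = True
    return mask

m = tglobal_mask(8, radius=1, block=4)
# interior tokens see 3 local neighbours + 2 block summaries
print(m.sum(axis=1))  # [4 5 5 5 5 5 5 4]
```

Per-token attended positions grow with `radius` and `n / block` rather than with `n`, which is what makes the pattern scalable to long inputs.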
Metadata
- Type: article
- Language: en
- Landing Page: https://doi.org/10.18653/v1/2022.findings-naacl.55
- PDF: https://aclanthology.org/2022.findings-naacl.55.pdf
- OA Status: hybrid
- Cited By: 183
- References: 39
- Related Works: 10
- OpenAlex ID: https://openalex.org/W4225727438
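The OpenAlex ID above also resolves through OpenAlex's public REST API at `api.openalex.org`. A small sketch that derives the API endpoint from the canonical work URL; the network call itself is kept under the main guard and simply returns the same JSON record mirrored in the payload below.

```python
import json
import urllib.request

API_BASE = "https://api.openalex.org/works/"

def api_url(openalex_id: str) -> str:
    """Turn a canonical OpenAlex work URL (or a bare ID like
    'W4225727438') into its REST API endpoint."""
    work_id = openalex_id.rstrip("/").rsplit("/", 1)[-1]
    return API_BASE + work_id

if __name__ == "__main__":
    url = api_url("https://openalex.org/W4225727438")
    with urllib.request.urlopen(url) as resp:
        work = json.load(resp)
    print(work["display_name"])
```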
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4225727438 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.18653/v1/2022.findings-naacl.55 (Digital Object Identifier)
- Title: LongT5: Efficient Text-To-Text Transformer for Long Sequences (work title)
- Type: article (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2022
- Publication date: 2022-01-01 (full publication date if available)
- Authors: Mandy Guo, Joshua Ainslie, David Uthus, Santiago Ontañón, Jianmo Ni, Yun-Hsuan Sung, Yinfei Yang (in order)
- Landing page: https://doi.org/10.18653/v1/2022.findings-naacl.55 (publisher landing page)
- PDF URL: https://aclanthology.org/2022.findings-naacl.55.pdf (direct link to full-text PDF)
- Open access: Yes (whether a free full text is available)
- OA status: hybrid (open access status per OpenAlex)
- OA URL: https://aclanthology.org/2022.findings-naacl.55.pdf (direct OA link when available)
- Concepts: Automatic summarization, Computer science, Transformer, Scalability, Architecture, Artificial intelligence, Question answering, Machine learning, Natural language processing, Database, Engineering, Art, Visual arts, Electrical engineering, Voltage (top concepts attached by OpenAlex)
- Cited by: 183 (total citation count in OpenAlex)
- Citations by year (recent): 2025: 44, 2024: 33, 2023: 79, 2022: 25, 2019: 1 (per-year citation counts)
- References (count): 39 (number of works referenced by this work)
- Related works (count): 10 (other works algorithmically related by OpenAlex)
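The per-year citation counts can be cross-checked against the total. A quick sketch using the numbers from this record's full payload, including the pre-publication years (2019 and 2013) that appear there, which look like an OpenAlex data quirk:

```python
# Per-year citation counts from the OpenAlex record for this work.
counts_by_year = {2025: 44, 2024: 33, 2023: 79, 2022: 25, 2019: 1, 2013: 1}

def total_citations(counts: dict) -> int:
    """Sum the per-year counts; should match cited_by_count."""
    return sum(counts.values())

print(total_citations(counts_by_year))  # 183, matching cited_by_count
```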
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4225727438 |
| doi | https://doi.org/10.18653/v1/2022.findings-naacl.55 |
| ids.doi | https://doi.org/10.18653/v1/2022.findings-naacl.55 |
| ids.openalex | https://openalex.org/W4225727438 |
| fwci | 21.2774622 |
| type | article |
| title | LongT5: Efficient Text-To-Text Transformer for Long Sequences |
| biblio.issue | |
| biblio.volume | |
| biblio.last_page | 736 |
| biblio.first_page | 724 |
| topics[0].id | https://openalex.org/T10028 |
| topics[0].field.id | https://openalex.org/fields/17 |
| topics[0].field.display_name | Computer Science |
| topics[0].score | 0.9986000061035156 |
| topics[0].domain.id | https://openalex.org/domains/3 |
| topics[0].domain.display_name | Physical Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/1702 |
| topics[0].subfield.display_name | Artificial Intelligence |
| topics[0].display_name | Topic Modeling |
| topics[1].id | https://openalex.org/T10181 |
| topics[1].field.id | https://openalex.org/fields/17 |
| topics[1].field.display_name | Computer Science |
| topics[1].score | 0.9957000017166138 |
| topics[1].domain.id | https://openalex.org/domains/3 |
| topics[1].domain.display_name | Physical Sciences |
| topics[1].subfield.id | https://openalex.org/subfields/1702 |
| topics[1].subfield.display_name | Artificial Intelligence |
| topics[1].display_name | Natural Language Processing Techniques |
| topics[2].id | https://openalex.org/T13650 |
| topics[2].field.id | https://openalex.org/fields/17 |
| topics[2].field.display_name | Computer Science |
| topics[2].score | 0.9639000296592712 |
| topics[2].domain.id | https://openalex.org/domains/3 |
| topics[2].domain.display_name | Physical Sciences |
| topics[2].subfield.id | https://openalex.org/subfields/1702 |
| topics[2].subfield.display_name | Artificial Intelligence |
| topics[2].display_name | Computational Physics and Python Applications |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| concepts[0].id | https://openalex.org/C170858558 |
| concepts[0].level | 2 |
| concepts[0].score | 0.9284830093383789 |
| concepts[0].wikidata | https://www.wikidata.org/wiki/Q1394144 |
| concepts[0].display_name | Automatic summarization |
| concepts[1].id | https://openalex.org/C41008148 |
| concepts[1].level | 0 |
| concepts[1].score | 0.8147361278533936 |
| concepts[1].wikidata | https://www.wikidata.org/wiki/Q21198 |
| concepts[1].display_name | Computer science |
| concepts[2].id | https://openalex.org/C66322947 |
| concepts[2].level | 3 |
| concepts[2].score | 0.7684248685836792 |
| concepts[2].wikidata | https://www.wikidata.org/wiki/Q11658 |
| concepts[2].display_name | Transformer |
| concepts[3].id | https://openalex.org/C48044578 |
| concepts[3].level | 2 |
| concepts[3].score | 0.7323483228683472 |
| concepts[3].wikidata | https://www.wikidata.org/wiki/Q727490 |
| concepts[3].display_name | Scalability |
| concepts[4].id | https://openalex.org/C123657996 |
| concepts[4].level | 2 |
| concepts[4].score | 0.621179461479187 |
| concepts[4].wikidata | https://www.wikidata.org/wiki/Q12271 |
| concepts[4].display_name | Architecture |
| concepts[5].id | https://openalex.org/C154945302 |
| concepts[5].level | 1 |
| concepts[5].score | 0.4995849132537842 |
| concepts[5].wikidata | https://www.wikidata.org/wiki/Q11660 |
| concepts[5].display_name | Artificial intelligence |
| concepts[6].id | https://openalex.org/C44291984 |
| concepts[6].level | 2 |
| concepts[6].score | 0.4368782341480255 |
| concepts[6].wikidata | https://www.wikidata.org/wiki/Q1074173 |
| concepts[6].display_name | Question answering |
| concepts[7].id | https://openalex.org/C119857082 |
| concepts[7].level | 1 |
| concepts[7].score | 0.36599624156951904 |
| concepts[7].wikidata | https://www.wikidata.org/wiki/Q2539 |
| concepts[7].display_name | Machine learning |
| concepts[8].id | https://openalex.org/C204321447 |
| concepts[8].level | 1 |
| concepts[8].score | 0.35352760553359985 |
| concepts[8].wikidata | https://www.wikidata.org/wiki/Q30642 |
| concepts[8].display_name | Natural language processing |
| concepts[9].id | https://openalex.org/C77088390 |
| concepts[9].level | 1 |
| concepts[9].score | 0.09438613057136536 |
| concepts[9].wikidata | https://www.wikidata.org/wiki/Q8513 |
| concepts[9].display_name | Database |
| concepts[10].id | https://openalex.org/C127413603 |
| concepts[10].level | 0 |
| concepts[10].score | 0.09324902296066284 |
| concepts[10].wikidata | https://www.wikidata.org/wiki/Q11023 |
| concepts[10].display_name | Engineering |
| concepts[11].id | https://openalex.org/C142362112 |
| concepts[11].level | 0 |
| concepts[11].score | 0.0 |
| concepts[11].wikidata | https://www.wikidata.org/wiki/Q735 |
| concepts[11].display_name | Art |
| concepts[12].id | https://openalex.org/C153349607 |
| concepts[12].level | 1 |
| concepts[12].score | 0.0 |
| concepts[12].wikidata | https://www.wikidata.org/wiki/Q36649 |
| concepts[12].display_name | Visual arts |
| concepts[13].id | https://openalex.org/C119599485 |
| concepts[13].level | 1 |
| concepts[13].score | 0.0 |
| concepts[13].wikidata | https://www.wikidata.org/wiki/Q43035 |
| concepts[13].display_name | Electrical engineering |
| concepts[14].id | https://openalex.org/C165801399 |
| concepts[14].level | 2 |
| concepts[14].score | 0.0 |
| concepts[14].wikidata | https://www.wikidata.org/wiki/Q25428 |
| concepts[14].display_name | Voltage |
| keywords[0].id | https://openalex.org/keywords/automatic-summarization |
| keywords[0].score | 0.9284830093383789 |
| keywords[0].display_name | Automatic summarization |
| keywords[1].id | https://openalex.org/keywords/computer-science |
| keywords[1].score | 0.8147361278533936 |
| keywords[1].display_name | Computer science |
| keywords[2].id | https://openalex.org/keywords/transformer |
| keywords[2].score | 0.7684248685836792 |
| keywords[2].display_name | Transformer |
| keywords[3].id | https://openalex.org/keywords/scalability |
| keywords[3].score | 0.7323483228683472 |
| keywords[3].display_name | Scalability |
| keywords[4].id | https://openalex.org/keywords/architecture |
| keywords[4].score | 0.621179461479187 |
| keywords[4].display_name | Architecture |
| keywords[5].id | https://openalex.org/keywords/artificial-intelligence |
| keywords[5].score | 0.4995849132537842 |
| keywords[5].display_name | Artificial intelligence |
| keywords[6].id | https://openalex.org/keywords/question-answering |
| keywords[6].score | 0.4368782341480255 |
| keywords[6].display_name | Question answering |
| keywords[7].id | https://openalex.org/keywords/machine-learning |
| keywords[7].score | 0.36599624156951904 |
| keywords[7].display_name | Machine learning |
| keywords[8].id | https://openalex.org/keywords/natural-language-processing |
| keywords[8].score | 0.35352760553359985 |
| keywords[8].display_name | Natural language processing |
| keywords[9].id | https://openalex.org/keywords/database |
| keywords[9].score | 0.09438613057136536 |
| keywords[9].display_name | Database |
| keywords[10].id | https://openalex.org/keywords/engineering |
| keywords[10].score | 0.09324902296066284 |
| keywords[10].display_name | Engineering |
| language | en |
| locations[0].id | doi:10.18653/v1/2022.findings-naacl.55 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S4363605604 |
| locations[0].source.issn | |
| locations[0].source.type | conference |
| locations[0].source.is_oa | False |
| locations[0].source.issn_l | |
| locations[0].source.is_core | False |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | Findings of the Association for Computational Linguistics: NAACL 2022 |
| locations[0].source.host_organization | |
| locations[0].source.host_organization_name | |
| locations[0].license | cc-by |
| locations[0].pdf_url | https://aclanthology.org/2022.findings-naacl.55.pdf |
| locations[0].version | publishedVersion |
| locations[0].raw_type | proceedings-article |
| locations[0].license_id | https://openalex.org/licenses/cc-by |
| locations[0].is_accepted | True |
| locations[0].is_published | True |
| locations[0].raw_source_name | Findings of the Association for Computational Linguistics: NAACL 2022 |
| locations[0].landing_page_url | https://doi.org/10.18653/v1/2022.findings-naacl.55 |
| indexed_in | crossref |
| authorships[0].author.id | https://openalex.org/A5004864409 |
| authorships[0].author.orcid | |
| authorships[0].author.display_name | Mandy Guo |
| authorships[0].countries | US |
| authorships[0].affiliations[0].institution_ids | https://openalex.org/I1291425158 |
| authorships[0].affiliations[0].raw_affiliation_string | Google Research {xyguo, jainslie, |
| authorships[0].institutions[0].id | https://openalex.org/I1291425158 |
| authorships[0].institutions[0].ror | https://ror.org/00njsd438 |
| authorships[0].institutions[0].type | company |
| authorships[0].institutions[0].lineage | https://openalex.org/I1291425158, https://openalex.org/I4210128969 |
| authorships[0].institutions[0].country_code | US |
| authorships[0].institutions[0].display_name | Google (United States) |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Mandy Guo |
| authorships[0].is_corresponding | False |
| authorships[0].raw_affiliation_strings | Google Research {xyguo, jainslie, |
| authorships[1].author.id | https://openalex.org/A5072605113 |
| authorships[1].author.orcid | |
| authorships[1].author.display_name | Joshua Ainslie |
| authorships[1].countries | US |
| authorships[1].affiliations[0].institution_ids | https://openalex.org/I1291425158 |
| authorships[1].affiliations[0].raw_affiliation_string | Google Research {xyguo, jainslie, |
| authorships[1].institutions[0].id | https://openalex.org/I1291425158 |
| authorships[1].institutions[0].ror | https://ror.org/00njsd438 |
| authorships[1].institutions[0].type | company |
| authorships[1].institutions[0].lineage | https://openalex.org/I1291425158, https://openalex.org/I4210128969 |
| authorships[1].institutions[0].country_code | US |
| authorships[1].institutions[0].display_name | Google (United States) |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Joshua Ainslie |
| authorships[1].is_corresponding | False |
| authorships[1].raw_affiliation_strings | Google Research {xyguo, jainslie, |
| authorships[2].author.id | https://openalex.org/A5026906002 |
| authorships[2].author.orcid | |
| authorships[2].author.display_name | David Uthus |
| authorships[2].countries | US |
| authorships[2].affiliations[0].institution_ids | https://openalex.org/I1291425158 |
| authorships[2].affiliations[0].raw_affiliation_string | Google Research {xyguo, jainslie, |
| authorships[2].institutions[0].id | https://openalex.org/I1291425158 |
| authorships[2].institutions[0].ror | https://ror.org/00njsd438 |
| authorships[2].institutions[0].type | company |
| authorships[2].institutions[0].lineage | https://openalex.org/I1291425158, https://openalex.org/I4210128969 |
| authorships[2].institutions[0].country_code | US |
| authorships[2].institutions[0].display_name | Google (United States) |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | David Uthus |
| authorships[2].is_corresponding | False |
| authorships[2].raw_affiliation_strings | Google Research {xyguo, jainslie, |
| authorships[3].author.id | https://openalex.org/A5038686010 |
| authorships[3].author.orcid | https://orcid.org/0000-0002-9616-2981 |
| authorships[3].author.display_name | Santiago Ontañón |
| authorships[3].countries | US |
| authorships[3].affiliations[0].institution_ids | https://openalex.org/I1291425158 |
| authorships[3].affiliations[0].raw_affiliation_string | Google Research {xyguo, jainslie, |
| authorships[3].institutions[0].id | https://openalex.org/I1291425158 |
| authorships[3].institutions[0].ror | https://ror.org/00njsd438 |
| authorships[3].institutions[0].type | company |
| authorships[3].institutions[0].lineage | https://openalex.org/I1291425158, https://openalex.org/I4210128969 |
| authorships[3].institutions[0].country_code | US |
| authorships[3].institutions[0].display_name | Google (United States) |
| authorships[3].author_position | middle |
| authorships[3].raw_author_name | Santiago Ontanon |
| authorships[3].is_corresponding | False |
| authorships[3].raw_affiliation_strings | Google Research {xyguo, jainslie, |
| authorships[4].author.id | https://openalex.org/A5077817759 |
| authorships[4].author.orcid | https://orcid.org/0000-0002-6863-8073 |
| authorships[4].author.display_name | Jianmo Ni |
| authorships[4].countries | US |
| authorships[4].affiliations[0].institution_ids | https://openalex.org/I1291425158 |
| authorships[4].affiliations[0].raw_affiliation_string | Google Research {xyguo, jainslie, |
| authorships[4].institutions[0].id | https://openalex.org/I1291425158 |
| authorships[4].institutions[0].ror | https://ror.org/00njsd438 |
| authorships[4].institutions[0].type | company |
| authorships[4].institutions[0].lineage | https://openalex.org/I1291425158, https://openalex.org/I4210128969 |
| authorships[4].institutions[0].country_code | US |
| authorships[4].institutions[0].display_name | Google (United States) |
| authorships[4].author_position | middle |
| authorships[4].raw_author_name | Jianmo Ni |
| authorships[4].is_corresponding | False |
| authorships[4].raw_affiliation_strings | Google Research {xyguo, jainslie, |
| authorships[5].author.id | https://openalex.org/A5113807382 |
| authorships[5].author.orcid | |
| authorships[5].author.display_name | Yun-Hsuan Sung |
| authorships[5].countries | US |
| authorships[5].affiliations[0].institution_ids | https://openalex.org/I1291425158 |
| authorships[5].affiliations[0].raw_affiliation_string | Google Research {xyguo, jainslie, |
| authorships[5].institutions[0].id | https://openalex.org/I1291425158 |
| authorships[5].institutions[0].ror | https://ror.org/00njsd438 |
| authorships[5].institutions[0].type | company |
| authorships[5].institutions[0].lineage | https://openalex.org/I1291425158, https://openalex.org/I4210128969 |
| authorships[5].institutions[0].country_code | US |
| authorships[5].institutions[0].display_name | Google (United States) |
| authorships[5].author_position | middle |
| authorships[5].raw_author_name | Yun-Hsuan Sung |
| authorships[5].is_corresponding | False |
| authorships[5].raw_affiliation_strings | Google Research {xyguo, jainslie, |
| authorships[6].author.id | https://openalex.org/A5112656212 |
| authorships[6].author.orcid | |
| authorships[6].author.display_name | Yinfei Yang |
| authorships[6].countries | US |
| authorships[6].affiliations[0].institution_ids | https://openalex.org/I1291425158 |
| authorships[6].affiliations[0].raw_affiliation_string | Google Research {xyguo, jainslie, |
| authorships[6].institutions[0].id | https://openalex.org/I1291425158 |
| authorships[6].institutions[0].ror | https://ror.org/00njsd438 |
| authorships[6].institutions[0].type | company |
| authorships[6].institutions[0].lineage | https://openalex.org/I1291425158, https://openalex.org/I4210128969 |
| authorships[6].institutions[0].country_code | US |
| authorships[6].institutions[0].display_name | Google (United States) |
| authorships[6].author_position | last |
| authorships[6].raw_author_name | Yinfei Yang |
| authorships[6].is_corresponding | False |
| authorships[6].raw_affiliation_strings | Google Research {xyguo, jainslie, |
| has_content.pdf | True |
| has_content.grobid_xml | True |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://aclanthology.org/2022.findings-naacl.55.pdf |
| open_access.oa_status | hybrid |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | LongT5: Efficient Text-To-Text Transformer for Long Sequences |
| has_fulltext | True |
| is_retracted | False |
| updated_date | 2025-11-06T03:46:38.306776 |
| primary_topic.id | https://openalex.org/T10028 |
| primary_topic.field.id | https://openalex.org/fields/17 |
| primary_topic.field.display_name | Computer Science |
| primary_topic.score | 0.9986000061035156 |
| primary_topic.domain.id | https://openalex.org/domains/3 |
| primary_topic.domain.display_name | Physical Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/1702 |
| primary_topic.subfield.display_name | Artificial Intelligence |
| primary_topic.display_name | Topic Modeling |
| related_works | https://openalex.org/W3207693618, https://openalex.org/W4308478176, https://openalex.org/W4377164402, https://openalex.org/W4221140906, https://openalex.org/W2367661848, https://openalex.org/W3105439152, https://openalex.org/W4316012698, https://openalex.org/W3120390996, https://openalex.org/W207304934, https://openalex.org/W2747680751 |
| cited_by_count | 183 |
| counts_by_year[0].year | 2025 |
| counts_by_year[0].cited_by_count | 44 |
| counts_by_year[1].year | 2024 |
| counts_by_year[1].cited_by_count | 33 |
| counts_by_year[2].year | 2023 |
| counts_by_year[2].cited_by_count | 79 |
| counts_by_year[3].year | 2022 |
| counts_by_year[3].cited_by_count | 25 |
| counts_by_year[4].year | 2019 |
| counts_by_year[4].cited_by_count | 1 |
| counts_by_year[5].year | 2013 |
| counts_by_year[5].cited_by_count | 1 |
| locations_count | 1 |
| best_oa_location.id | doi:10.18653/v1/2022.findings-naacl.55 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S4363605604 |
| best_oa_location.source.issn | |
| best_oa_location.source.type | conference |
| best_oa_location.source.is_oa | False |
| best_oa_location.source.issn_l | |
| best_oa_location.source.is_core | False |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | Findings of the Association for Computational Linguistics: NAACL 2022 |
| best_oa_location.source.host_organization | |
| best_oa_location.source.host_organization_name | |
| best_oa_location.license | cc-by |
| best_oa_location.pdf_url | https://aclanthology.org/2022.findings-naacl.55.pdf |
| best_oa_location.version | publishedVersion |
| best_oa_location.raw_type | proceedings-article |
| best_oa_location.license_id | https://openalex.org/licenses/cc-by |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | True |
| best_oa_location.raw_source_name | Findings of the Association for Computational Linguistics: NAACL 2022 |
| best_oa_location.landing_page_url | https://doi.org/10.18653/v1/2022.findings-naacl.55 |
| primary_location.id | doi:10.18653/v1/2022.findings-naacl.55 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S4363605604 |
| primary_location.source.issn | |
| primary_location.source.type | conference |
| primary_location.source.is_oa | False |
| primary_location.source.issn_l | |
| primary_location.source.is_core | False |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | Findings of the Association for Computational Linguistics: NAACL 2022 |
| primary_location.source.host_organization | |
| primary_location.source.host_organization_name | |
| primary_location.license | cc-by |
| primary_location.pdf_url | https://aclanthology.org/2022.findings-naacl.55.pdf |
| primary_location.version | publishedVersion |
| primary_location.raw_type | proceedings-article |
| primary_location.license_id | https://openalex.org/licenses/cc-by |
| primary_location.is_accepted | True |
| primary_location.is_published | True |
| primary_location.raw_source_name | Findings of the Association for Computational Linguistics: NAACL 2022 |
| primary_location.landing_page_url | https://doi.org/10.18653/v1/2022.findings-naacl.55 |
| publication_date | 2022-01-01 |
| publication_year | 2022 |
| referenced_works | https://openalex.org/W3123615524, https://openalex.org/W3156789018, https://openalex.org/W2953280096, https://openalex.org/W3169012807, https://openalex.org/W4287019748, https://openalex.org/W3001279689, https://openalex.org/W2963926728, https://openalex.org/W3034999214, https://openalex.org/W2786148476, https://openalex.org/W4385245566, https://openalex.org/W2963339397, https://openalex.org/W2963341956, https://openalex.org/W2154652894, https://openalex.org/W4309793872, https://openalex.org/W2953356739, https://openalex.org/W2891534142, https://openalex.org/W2963204221, https://openalex.org/W4287704453, https://openalex.org/W2970392338, https://openalex.org/W3011411500, https://openalex.org/W3098960752, https://openalex.org/W2470673105, https://openalex.org/W3006983028, https://openalex.org/W3206557162, https://openalex.org/W3015468748, https://openalex.org/W3033529678, https://openalex.org/W2965373594, https://openalex.org/W4288089799, https://openalex.org/W2963929190, https://openalex.org/W4205897796, https://openalex.org/W2912924812, https://openalex.org/W4287667694, https://openalex.org/W2996264288, https://openalex.org/W4301633306, https://openalex.org/W3171494313, https://openalex.org/W3103682594, https://openalex.org/W3169942382, https://openalex.org/W3155147984, https://openalex.org/W2734330123 |
| referenced_works_count | 39 |
| abstract_inverted_index.a | 30, 75 |
| abstract_inverted_index.In | 24 |
| abstract_inverted_index.T5 | 70, 115 |
| abstract_inverted_index.We | 95, 120 |
| abstract_inverted_index.as | 109, 111, 129, 131 |
| abstract_inverted_index.at | 46 |
| abstract_inverted_index.is | 74 |
| abstract_inverted_index.of | 20, 37 |
| abstract_inverted_index.on | 102, 117 |
| abstract_inverted_index.or | 11 |
| abstract_inverted_index.to | 98 |
| abstract_inverted_index.we | 27, 51, 79 |
| abstract_inverted_index.(1) | 6 |
| abstract_inverted_index.(2) | 12 |
| abstract_inverted_index.The | 72 |
| abstract_inverted_index.and | 43, 59, 105, 126 |
| abstract_inverted_index.are | 96 |
| abstract_inverted_index.but | 90 |
| abstract_inverted_index.can | 16 |
| abstract_inverted_index.has | 2 |
| abstract_inverted_index.new | 31, 76 |
| abstract_inverted_index.our | 124, 132 |
| abstract_inverted_index.the | 8, 18, 35, 40, 47, 68, 113 |
| abstract_inverted_index.able | 97 |
| abstract_inverted_index.both | 39 |
| abstract_inverted_index.call | 80 |
| abstract_inverted_index.from | 55, 63 |
| abstract_inverted_index.have | 121 |
| abstract_inverted_index.into | 67 |
| abstract_inverted_index.open | 122 |
| abstract_inverted_index.same | 48 |
| abstract_inverted_index.size | 15, 45 |
| abstract_inverted_index.that | 4, 33 |
| abstract_inverted_index.this | 25 |
| abstract_inverted_index.well | 110, 130 |
| abstract_inverted_index.work | 1 |
| abstract_inverted_index.adopt | 60 |
| abstract_inverted_index.code, | 128 |
| abstract_inverted_index.ideas | 54 |
| abstract_inverted_index.input | 9, 41 |
| abstract_inverted_index.model | 14, 32, 44, 134 |
| abstract_inverted_index.shown | 3 |
| abstract_inverted_index.these | 118 |
| abstract_inverted_index.time. | 49 |
| abstract_inverted_index.which | 84 |
| abstract_inverted_index.(ETC), | 58 |
| abstract_inverted_index.Global | 82 |
| abstract_inverted_index.Recent | 0 |
| abstract_inverted_index.either | 5 |
| abstract_inverted_index.length | 10, 42 |
| abstract_inverted_index.mimics | 85 |
| abstract_inverted_index.models | 116 |
| abstract_inverted_index.neural | 22 |
| abstract_inverted_index.paper, | 26 |
| abstract_inverted_index.result | 73 |
| abstract_inverted_index.tasks, | 108 |
| abstract_inverted_index.tasks. | 119 |
| abstract_inverted_index.ETC’s | 86 |
| abstract_inverted_index.LongT5, | 29 |
| abstract_inverted_index.achieve | 99 |
| abstract_inverted_index.effects | 36 |
| abstract_inverted_index.improve | 17 |
| abstract_inverted_index.models. | 23 |
| abstract_inverted_index.present | 28 |
| abstract_inverted_index.results | 101 |
| abstract_inverted_index.scaling | 38 |
| abstract_inverted_index.several | 103 |
| abstract_inverted_index.sourced | 123 |
| abstract_inverted_index.without | 91 |
| abstract_inverted_index.explores | 34 |
| abstract_inverted_index.original | 114 |
| abstract_inverted_index.question | 106 |
| abstract_inverted_index.scalable | 69 |
| abstract_inverted_index.training | 127 |
| abstract_inverted_index.(PEGASUS) | 66 |
| abstract_inverted_index.Transient | 81 |
| abstract_inverted_index.answering | 107 |
| abstract_inverted_index.attention | 53, 77, 88 |
| abstract_inverted_index.integrate | 52 |
| abstract_inverted_index.mechanism | 78 |
| abstract_inverted_index.requiring | 92 |
| abstract_inverted_index.(TGlobal), | 83 |
| abstract_inverted_index.additional | 93 |
| abstract_inverted_index.increasing | 7, 13 |
| abstract_inverted_index.long-input | 56 |
| abstract_inverted_index.mechanism, | 89 |
| abstract_inverted_index.outperform | 112 |
| abstract_inverted_index.strategies | 62 |
| abstract_inverted_index.performance | 19 |
| abstract_inverted_index.pre-trained | 133 |
| abstract_inverted_index.architecture | 125 |
| abstract_inverted_index.checkpoints. | 135 |
| abstract_inverted_index.local/global | 87 |
| abstract_inverted_index.pre-training | 61, 65 |
| abstract_inverted_index.side-inputs. | 94 |
| abstract_inverted_index.transformers | 57 |
| abstract_inverted_index.Specifically, | 50 |
| abstract_inverted_index.architecture. | 71 |
| abstract_inverted_index.summarization | 64, 104 |
| abstract_inverted_index.state-of-the-art | 100 |
| abstract_inverted_index.Transformer-based | 21 |
| cited_by_percentile_year.max | 100 |
| cited_by_percentile_year.min | 89 |
| countries_distinct_count | 1 |
| institutions_distinct_count | 7 |
| citation_normalized_percentile.value | 0.99660451 |
| citation_normalized_percentile.is_in_top_1_percent | True |
| citation_normalized_percentile.is_in_top_10_percent | True |
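The `abstract_inverted_index.*` fields above store the abstract as a word-to-positions mapping rather than plain text. Reconstructing the text is a matter of sorting words by position; the sketch below demonstrates this on a toy index covering the abstract's opening words rather than the full one.

```python
def uninvert(index: dict) -> str:
    """Rebuild plain text from an OpenAlex-style inverted index
    mapping each word to the list of positions where it occurs."""
    positions = sorted(
        (pos, word) for word, locs in index.items() for pos in locs
    )
    return " ".join(word for _, word in positions)

# Toy index covering the first words of this paper's abstract.
toy = {"Recent": [0], "work": [1], "has": [2], "shown": [3], "that": [4]}
print(uninvert(toy))  # Recent work has shown that
```

Words that occur more than once (like "the" or "attention" in the index above) simply list several positions, so the same flattening handles repeats with no special casing.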