Multi-Channel 2D-CNN And Attention-Based BiLSTM Method for Sentiment Analysis on Low-Resource Ewe Language
2022 · Open Access · DOI: https://doi.org/10.21203/rs.3.rs-2221141/v1
The unavailability of an annotated dataset for the low-resource Ewe language makes it difficult to develop automated systems that appropriately evaluate public opinion on events, news, policies, and regulations expressed in the language. In this study, we collected and preprocessed a document-level Ewe sentiment dataset built from social media comments on five different topics. We additionally trained three word-embedding models on the dataset, GloVe (global vectors), word2vec, and continuous bag-of-words (CBOW), to exploit sentiment representations. We further propose a novel multi-channel two-dimensional (2D) convolutional neural network fused with an attention-based bidirectional long short-term memory network (MC2D-CNN+BiLSTM-Attn) to detect the exact sentiment features in Ewe documents. The proposed method efficiently recognizes the following emotions in the newly developed dataset: anger, annoyance, happiness, surprise, and sadness. Extensive experiments indicate that the MC2D-CNN+BiLSTM-Attn method marginally outperforms other known state-of-the-art methods. The results show that, in detecting precise sentiments from raw Ewe text, a BiLSTM with GloVe embeddings outperforms its word2vec and CBOW counterparts with an accuracy of 0.72714. Furthermore, the Attn+BiLSTM and multi-channel CNN methods with a word2vec embedding layer outperform their GloVe and CBOW variants, with accuracies of 0.8483 and 0.8965 respectively, while our proposed technique with the same word2vec embedding recorded 0.9493.
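As a rough illustration of the architecture summarized above, the sketch below wires a multi-channel 2D-convolution branch and an attention-based BiLSTM branch over a shared embedding layer and fuses them for five-class emotion classification. It is a hedged approximation in tf.keras, not the authors' implementation: the sequence length, vocabulary size, kernel heights, filter counts, and concatenation-based fusion are illustrative assumptions, and the `Embedding` layer is where pretrained GloVe/word2vec/CBOW vectors would be plugged in.

```python
# Hedged sketch of a multi-channel 2D-CNN fused with an attention-based
# BiLSTM (MC2D-CNN+BiLSTM-Attn-style); all sizes are illustrative.
from tensorflow.keras import layers, Model

MAX_LEN, VOCAB, EMB_DIM, N_CLASSES = 100, 20000, 300, 5  # assumed shapes

tokens = layers.Input(shape=(MAX_LEN,), dtype="int32")
# Pretrained GloVe/word2vec/CBOW vectors could be loaded via `weights=`.
emb = layers.Embedding(VOCAB, EMB_DIM)(tokens)

# Multi-channel 2D convolutions over the (sequence x embedding) matrix.
emb_2d = layers.Reshape((MAX_LEN, EMB_DIM, 1))(emb)
channels = []
for k in (3, 4, 5):  # assumed kernel heights (n-gram sizes)
    c = layers.Conv2D(64, (k, EMB_DIM), activation="relu")(emb_2d)
    c = layers.MaxPooling2D(pool_size=(MAX_LEN - k + 1, 1))(c)
    channels.append(layers.Flatten()(c))
cnn_feats = layers.Concatenate()(channels)

# Attention-based BiLSTM branch over the same embedded sequence.
h = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(emb)
attn = layers.Attention()([h, h])          # self-attention over BiLSTM states
rnn_feats = layers.GlobalAveragePooling1D()(attn)

# Fuse both branches and classify into the five emotion labels.
fused = layers.Concatenate()([cnn_feats, rnn_feats])
outputs = layers.Dense(N_CLASSES, activation="softmax")(fused)

model = Model(tokens, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

In this layout each `Conv2D` kernel spans the full embedding dimension, so every channel acts as an n-gram detector of a different height, while the `Attention` layer reweights the BiLSTM hidden states before pooling; the two feature vectors are then concatenated and classified.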
- Type: preprint
- Language: en
- Landing Page: https://doi.org/10.21203/rs.3.rs-2221141/v1
- PDF: https://www.researchsquare.com/article/rs-2221141/latest.pdf
- OA Status: green
- Cited By: 1
- Related Works: 10
- OpenAlex ID: https://openalex.org/W4308271542
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4308271542 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.21203/rs.3.rs-2221141/v1 (Digital Object Identifier)
- Title: Multi-Channel 2D-CNN And Attention-Based BiLSTM Method for Sentiment Analysis on Low-Resource Ewe Language (work title)
- Type: preprint (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2022 (year of publication)
- Publication date: 2022-11-04 (full publication date if available)
- Authors: Victor Kwaku Agbesi, Wenyu Chen, Chiagoziem C. Ukwuoma, Noble Arden Kuadey, Judith A. Browne, Isaac Osei Agyemang (list of authors in order)
- Landing page: https://doi.org/10.21203/rs.3.rs-2221141/v1 (publisher landing page)
- PDF URL: https://www.researchsquare.com/article/rs-2221141/latest.pdf (direct link to full-text PDF)
- Open access: Yes (whether a free full text is available)
- OA status: green (open access status per OpenAlex)
- OA URL: https://www.researchsquare.com/article/rs-2221141/latest.pdf (direct OA link when available)
- Concepts: Word2vec, Computer science, Sentiment analysis, Word embedding, Artificial intelligence, Embedding, Convolutional neural network, Sadness, Bag-of-words model, Natural language processing, Machine learning, Anger, Psychology, Psychiatry (top concepts/fields attached by OpenAlex)
- Cited by: 1 (total citation count in OpenAlex)
- Citations by year (recent): 2024: 1 (per-year citation counts, last 5 years)
- Related works (count): 10 (other works algorithmically related by OpenAlex)
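The fields above mirror the OpenAlex works record for this preprint. A minimal sketch of retrieving the same record from the public OpenAlex API, assuming the standard works endpoint and the third-party `requests` package (field names follow the payload in the next section):

```python
# Hedged example: fetch the OpenAlex record for this work and print a
# few of the fields summarized above.
import requests

WORK_ID = "W4308271542"
resp = requests.get(f"https://api.openalex.org/works/{WORK_ID}", timeout=30)
resp.raise_for_status()
work = resp.json()

print(work["display_name"])                       # title
print(work["doi"], work["publication_date"])      # DOI and date
print(work["open_access"]["oa_status"],           # e.g. "green"
      work["open_access"]["oa_url"])
print([a["author"]["display_name"] for a in work["authorships"]])
```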
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4308271542 |
| doi | https://doi.org/10.21203/rs.3.rs-2221141/v1 |
| ids.doi | https://doi.org/10.21203/rs.3.rs-2221141/v1 |
| ids.openalex | https://openalex.org/W4308271542 |
| fwci | 0.19579882 |
| type | preprint |
| title | Multi-Channel 2D-CNN And Attention-Based BiLSTM Method for Sentiment Analysis on Low-Resource Ewe Language |
| biblio.issue | |
| biblio.volume | |
| biblio.last_page | |
| biblio.first_page | |
| topics[0].id | https://openalex.org/T10664 |
| topics[0].field.id | https://openalex.org/fields/17 |
| topics[0].field.display_name | Computer Science |
| topics[0].score | 0.9824000000953674 |
| topics[0].domain.id | https://openalex.org/domains/3 |
| topics[0].domain.display_name | Physical Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/1702 |
| topics[0].subfield.display_name | Artificial Intelligence |
| topics[0].display_name | Sentiment Analysis and Opinion Mining |
| topics[1].id | https://openalex.org/T11550 |
| topics[1].field.id | https://openalex.org/fields/17 |
| topics[1].field.display_name | Computer Science |
| topics[1].score | 0.9160000085830688 |
| topics[1].domain.id | https://openalex.org/domains/3 |
| topics[1].domain.display_name | Physical Sciences |
| topics[1].subfield.id | https://openalex.org/subfields/1702 |
| topics[1].subfield.display_name | Artificial Intelligence |
| topics[1].display_name | Text and Document Classification Technologies |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| concepts[0].id | https://openalex.org/C2776461190 |
| concepts[0].level | 3 |
| concepts[0].score | 0.8336836099624634 |
| concepts[0].wikidata | https://www.wikidata.org/wiki/Q22673982 |
| concepts[0].display_name | Word2vec |
| concepts[1].id | https://openalex.org/C41008148 |
| concepts[1].level | 0 |
| concepts[1].score | 0.8266609907150269 |
| concepts[1].wikidata | https://www.wikidata.org/wiki/Q21198 |
| concepts[1].display_name | Computer science |
| concepts[2].id | https://openalex.org/C66402592 |
| concepts[2].level | 2 |
| concepts[2].score | 0.7392762899398804 |
| concepts[2].wikidata | https://www.wikidata.org/wiki/Q2271421 |
| concepts[2].display_name | Sentiment analysis |
| concepts[3].id | https://openalex.org/C2777462759 |
| concepts[3].level | 3 |
| concepts[3].score | 0.7360892295837402 |
| concepts[3].wikidata | https://www.wikidata.org/wiki/Q18395344 |
| concepts[3].display_name | Word embedding |
| concepts[4].id | https://openalex.org/C154945302 |
| concepts[4].level | 1 |
| concepts[4].score | 0.6564350128173828 |
| concepts[4].wikidata | https://www.wikidata.org/wiki/Q11660 |
| concepts[4].display_name | Artificial intelligence |
| concepts[5].id | https://openalex.org/C41608201 |
| concepts[5].level | 2 |
| concepts[5].score | 0.5210009217262268 |
| concepts[5].wikidata | https://www.wikidata.org/wiki/Q980509 |
| concepts[5].display_name | Embedding |
| concepts[6].id | https://openalex.org/C81363708 |
| concepts[6].level | 2 |
| concepts[6].score | 0.5100749135017395 |
| concepts[6].wikidata | https://www.wikidata.org/wiki/Q17084460 |
| concepts[6].display_name | Convolutional neural network |
| concepts[7].id | https://openalex.org/C2779812673 |
| concepts[7].level | 3 |
| concepts[7].score | 0.4955815076828003 |
| concepts[7].wikidata | https://www.wikidata.org/wiki/Q169251 |
| concepts[7].display_name | Sadness |
| concepts[8].id | https://openalex.org/C13672336 |
| concepts[8].level | 2 |
| concepts[8].score | 0.4345704913139343 |
| concepts[8].wikidata | https://www.wikidata.org/wiki/Q3460803 |
| concepts[8].display_name | Bag-of-words model |
| concepts[9].id | https://openalex.org/C204321447 |
| concepts[9].level | 1 |
| concepts[9].score | 0.41374799609184265 |
| concepts[9].wikidata | https://www.wikidata.org/wiki/Q30642 |
| concepts[9].display_name | Natural language processing |
| concepts[10].id | https://openalex.org/C119857082 |
| concepts[10].level | 1 |
| concepts[10].score | 0.3331787586212158 |
| concepts[10].wikidata | https://www.wikidata.org/wiki/Q2539 |
| concepts[10].display_name | Machine learning |
| concepts[11].id | https://openalex.org/C2779302386 |
| concepts[11].level | 2 |
| concepts[11].score | 0.1920783519744873 |
| concepts[11].wikidata | https://www.wikidata.org/wiki/Q79871 |
| concepts[11].display_name | Anger |
| concepts[12].id | https://openalex.org/C15744967 |
| concepts[12].level | 0 |
| concepts[12].score | 0.0 |
| concepts[12].wikidata | https://www.wikidata.org/wiki/Q9418 |
| concepts[12].display_name | Psychology |
| concepts[13].id | https://openalex.org/C118552586 |
| concepts[13].level | 1 |
| concepts[13].score | 0.0 |
| concepts[13].wikidata | https://www.wikidata.org/wiki/Q7867 |
| concepts[13].display_name | Psychiatry |
| keywords[0].id | https://openalex.org/keywords/word2vec |
| keywords[0].score | 0.8336836099624634 |
| keywords[0].display_name | Word2vec |
| keywords[1].id | https://openalex.org/keywords/computer-science |
| keywords[1].score | 0.8266609907150269 |
| keywords[1].display_name | Computer science |
| keywords[2].id | https://openalex.org/keywords/sentiment-analysis |
| keywords[2].score | 0.7392762899398804 |
| keywords[2].display_name | Sentiment analysis |
| keywords[3].id | https://openalex.org/keywords/word-embedding |
| keywords[3].score | 0.7360892295837402 |
| keywords[3].display_name | Word embedding |
| keywords[4].id | https://openalex.org/keywords/artificial-intelligence |
| keywords[4].score | 0.6564350128173828 |
| keywords[4].display_name | Artificial intelligence |
| keywords[5].id | https://openalex.org/keywords/embedding |
| keywords[5].score | 0.5210009217262268 |
| keywords[5].display_name | Embedding |
| keywords[6].id | https://openalex.org/keywords/convolutional-neural-network |
| keywords[6].score | 0.5100749135017395 |
| keywords[6].display_name | Convolutional neural network |
| keywords[7].id | https://openalex.org/keywords/sadness |
| keywords[7].score | 0.4955815076828003 |
| keywords[7].display_name | Sadness |
| keywords[8].id | https://openalex.org/keywords/bag-of-words-model |
| keywords[8].score | 0.4345704913139343 |
| keywords[8].display_name | Bag-of-words model |
| keywords[9].id | https://openalex.org/keywords/natural-language-processing |
| keywords[9].score | 0.41374799609184265 |
| keywords[9].display_name | Natural language processing |
| keywords[10].id | https://openalex.org/keywords/machine-learning |
| keywords[10].score | 0.3331787586212158 |
| keywords[10].display_name | Machine learning |
| keywords[11].id | https://openalex.org/keywords/anger |
| keywords[11].score | 0.1920783519744873 |
| keywords[11].display_name | Anger |
| language | en |
| locations[0].id | doi:10.21203/rs.3.rs-2221141/v1 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S4306402450 |
| locations[0].source.issn | |
| locations[0].source.type | repository |
| locations[0].source.is_oa | False |
| locations[0].source.issn_l | |
| locations[0].source.is_core | False |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | Research Square (Research Square) |
| locations[0].source.host_organization | https://openalex.org/I4210096694 |
| locations[0].source.host_organization_name | Research Square (United States) |
| locations[0].source.host_organization_lineage | https://openalex.org/I4210096694 |
| locations[0].license | cc-by |
| locations[0].pdf_url | https://www.researchsquare.com/article/rs-2221141/latest.pdf |
| locations[0].version | acceptedVersion |
| locations[0].raw_type | posted-content |
| locations[0].license_id | https://openalex.org/licenses/cc-by |
| locations[0].is_accepted | True |
| locations[0].is_published | False |
| locations[0].raw_source_name | |
| locations[0].landing_page_url | https://doi.org/10.21203/rs.3.rs-2221141/v1 |
| indexed_in | crossref |
| authorships[0].author.id | https://openalex.org/A5017252813 |
| authorships[0].author.orcid | https://orcid.org/0000-0003-0723-5008 |
| authorships[0].author.display_name | Victor Kwaku Agbesi |
| authorships[0].countries | CN |
| authorships[0].affiliations[0].institution_ids | https://openalex.org/I150229711 |
| authorships[0].affiliations[0].raw_affiliation_string | University of Electronic Science and Technology of China (UESTC) |
| authorships[0].institutions[0].id | https://openalex.org/I150229711 |
| authorships[0].institutions[0].ror | https://ror.org/04qr3zq92 |
| authorships[0].institutions[0].type | education |
| authorships[0].institutions[0].lineage | https://openalex.org/I150229711 |
| authorships[0].institutions[0].country_code | CN |
| authorships[0].institutions[0].display_name | University of Electronic Science and Technology of China |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Victor Kwaku Agbesi |
| authorships[0].is_corresponding | False |
| authorships[0].raw_affiliation_strings | University of Electronic Science and Technology of China (UESTC) |
| authorships[1].author.id | https://openalex.org/A5100687323 |
| authorships[1].author.orcid | https://orcid.org/0000-0002-9933-8014 |
| authorships[1].author.display_name | Wenyu Chen |
| authorships[1].countries | CN |
| authorships[1].affiliations[0].institution_ids | https://openalex.org/I150229711 |
| authorships[1].affiliations[0].raw_affiliation_string | University of Electronic Science and Technology of China (UESTC) |
| authorships[1].institutions[0].id | https://openalex.org/I150229711 |
| authorships[1].institutions[0].ror | https://ror.org/04qr3zq92 |
| authorships[1].institutions[0].type | education |
| authorships[1].institutions[0].lineage | https://openalex.org/I150229711 |
| authorships[1].institutions[0].country_code | CN |
| authorships[1].institutions[0].display_name | University of Electronic Science and Technology of China |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Chen Wenyu |
| authorships[1].is_corresponding | False |
| authorships[1].raw_affiliation_strings | University of Electronic Science and Technology of China (UESTC) |
| authorships[2].author.id | https://openalex.org/A5023088189 |
| authorships[2].author.orcid | https://orcid.org/0000-0002-4532-6026 |
| authorships[2].author.display_name | Chiagoziem C. Ukwuoma |
| authorships[2].countries | CN |
| authorships[2].affiliations[0].institution_ids | https://openalex.org/I150229711 |
| authorships[2].affiliations[0].raw_affiliation_string | University of Electronic Science and Technology of China (UESTC) |
| authorships[2].institutions[0].id | https://openalex.org/I150229711 |
| authorships[2].institutions[0].ror | https://ror.org/04qr3zq92 |
| authorships[2].institutions[0].type | education |
| authorships[2].institutions[0].lineage | https://openalex.org/I150229711 |
| authorships[2].institutions[0].country_code | CN |
| authorships[2].institutions[0].display_name | University of Electronic Science and Technology of China |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | Chiagoziem C. Ukwuoma |
| authorships[2].is_corresponding | False |
| authorships[2].raw_affiliation_strings | University of Electronic Science and Technology of China (UESTC) |
| authorships[3].author.id | https://openalex.org/A5015553254 |
| authorships[3].author.orcid | https://orcid.org/0000-0001-5346-8553 |
| authorships[3].author.display_name | Noble Arden Kuadey |
| authorships[3].countries | CN |
| authorships[3].affiliations[0].institution_ids | https://openalex.org/I150229711 |
| authorships[3].affiliations[0].raw_affiliation_string | University of Electronic Science and Technology of China (UESTC) |
| authorships[3].institutions[0].id | https://openalex.org/I150229711 |
| authorships[3].institutions[0].ror | https://ror.org/04qr3zq92 |
| authorships[3].institutions[0].type | education |
| authorships[3].institutions[0].lineage | https://openalex.org/I150229711 |
| authorships[3].institutions[0].country_code | CN |
| authorships[3].institutions[0].display_name | University of Electronic Science and Technology of China |
| authorships[3].author_position | middle |
| authorships[3].raw_author_name | Noble A. Kuadey |
| authorships[3].is_corresponding | False |
| authorships[3].raw_affiliation_strings | University of Electronic Science and Technology of China (UESTC) |
| authorships[4].author.id | https://openalex.org/A5074263488 |
| authorships[4].author.orcid | |
| authorships[4].author.display_name | Judith A. Browne |
| authorships[4].countries | CN |
| authorships[4].affiliations[0].institution_ids | https://openalex.org/I150229711 |
| authorships[4].affiliations[0].raw_affiliation_string | University of Electronic Science and Technology of China (UESTC) |
| authorships[4].institutions[0].id | https://openalex.org/I150229711 |
| authorships[4].institutions[0].ror | https://ror.org/04qr3zq92 |
| authorships[4].institutions[0].type | education |
| authorships[4].institutions[0].lineage | https://openalex.org/I150229711 |
| authorships[4].institutions[0].country_code | CN |
| authorships[4].institutions[0].display_name | University of Electronic Science and Technology of China |
| authorships[4].author_position | middle |
| authorships[4].raw_author_name | Judith A. Browne |
| authorships[4].is_corresponding | False |
| authorships[4].raw_affiliation_strings | University of Electronic Science and Technology of China (UESTC) |
| authorships[5].author.id | https://openalex.org/A5085751106 |
| authorships[5].author.orcid | https://orcid.org/0000-0002-3559-7746 |
| authorships[5].author.display_name | Isaac Osei Agyemang |
| authorships[5].countries | CN |
| authorships[5].affiliations[0].institution_ids | https://openalex.org/I150229711 |
| authorships[5].affiliations[0].raw_affiliation_string | University of Electronic Science and Technology of China (UESTC) |
| authorships[5].institutions[0].id | https://openalex.org/I150229711 |
| authorships[5].institutions[0].ror | https://ror.org/04qr3zq92 |
| authorships[5].institutions[0].type | education |
| authorships[5].institutions[0].lineage | https://openalex.org/I150229711 |
| authorships[5].institutions[0].country_code | CN |
| authorships[5].institutions[0].display_name | University of Electronic Science and Technology of China |
| authorships[5].author_position | last |
| authorships[5].raw_author_name | Isaac Osei Agyemang |
| authorships[5].is_corresponding | False |
| authorships[5].raw_affiliation_strings | University of Electronic Science and Technology of China (UESTC) |
| has_content.pdf | True |
| has_content.grobid_xml | True |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://www.researchsquare.com/article/rs-2221141/latest.pdf |
| open_access.oa_status | green |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | Multi-Channel 2D-CNN And Attention-Based BiLSTM Method for Sentiment Analysis on Low-Resource Ewe Language |
| has_fulltext | True |
| is_retracted | False |
| updated_date | 2025-11-06T03:46:38.306776 |
| primary_topic.id | https://openalex.org/T10664 |
| primary_topic.field.id | https://openalex.org/fields/17 |
| primary_topic.field.display_name | Computer Science |
| primary_topic.score | 0.9824000000953674 |
| primary_topic.domain.id | https://openalex.org/domains/3 |
| primary_topic.domain.display_name | Physical Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/1702 |
| primary_topic.subfield.display_name | Artificial Intelligence |
| primary_topic.display_name | Sentiment Analysis and Opinion Mining |
| related_works | https://openalex.org/W4210823838, https://openalex.org/W2896498353, https://openalex.org/W3133567596, https://openalex.org/W3191135439, https://openalex.org/W3080191145, https://openalex.org/W4205948734, https://openalex.org/W3034080962, https://openalex.org/W4308271542, https://openalex.org/W4307074408, https://openalex.org/W4296004246 |
| cited_by_count | 1 |
| counts_by_year[0].year | 2024 |
| counts_by_year[0].cited_by_count | 1 |
| locations_count | 1 |
| best_oa_location.id | doi:10.21203/rs.3.rs-2221141/v1 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S4306402450 |
| best_oa_location.source.issn | |
| best_oa_location.source.type | repository |
| best_oa_location.source.is_oa | False |
| best_oa_location.source.issn_l | |
| best_oa_location.source.is_core | False |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | Research Square (Research Square) |
| best_oa_location.source.host_organization | https://openalex.org/I4210096694 |
| best_oa_location.source.host_organization_name | Research Square (United States) |
| best_oa_location.source.host_organization_lineage | https://openalex.org/I4210096694 |
| best_oa_location.license | cc-by |
| best_oa_location.pdf_url | https://www.researchsquare.com/article/rs-2221141/latest.pdf |
| best_oa_location.version | acceptedVersion |
| best_oa_location.raw_type | posted-content |
| best_oa_location.license_id | https://openalex.org/licenses/cc-by |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | False |
| best_oa_location.raw_source_name | |
| best_oa_location.landing_page_url | https://doi.org/10.21203/rs.3.rs-2221141/v1 |
| primary_location.id | doi:10.21203/rs.3.rs-2221141/v1 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S4306402450 |
| primary_location.source.issn | |
| primary_location.source.type | repository |
| primary_location.source.is_oa | False |
| primary_location.source.issn_l | |
| primary_location.source.is_core | False |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | Research Square (Research Square) |
| primary_location.source.host_organization | https://openalex.org/I4210096694 |
| primary_location.source.host_organization_name | Research Square (United States) |
| primary_location.source.host_organization_lineage | https://openalex.org/I4210096694 |
| primary_location.license | cc-by |
| primary_location.pdf_url | https://www.researchsquare.com/article/rs-2221141/latest.pdf |
| primary_location.version | acceptedVersion |
| primary_location.raw_type | posted-content |
| primary_location.license_id | https://openalex.org/licenses/cc-by |
| primary_location.is_accepted | True |
| primary_location.is_published | False |
| primary_location.raw_source_name | |
| primary_location.landing_page_url | https://doi.org/10.21203/rs.3.rs-2221141/v1 |
| publication_date | 2022-11-04 |
| publication_year | 2022 |
| referenced_works_count | 0 |
| abstract_inverted_index | (token-to-position index of the abstract; the readable abstract text is given above) |
| cited_by_percentile_year.max | 94 |
| cited_by_percentile_year.min | 90 |
| countries_distinct_count | 1 |
| institutions_distinct_count | 6 |
| citation_normalized_percentile.value | 0.55008945 |
| citation_normalized_percentile.is_in_top_1_percent | False |
| citation_normalized_percentile.is_in_top_10_percent | False |
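The `abstract_inverted_index` field summarized in the payload stores the abstract as a map from each token to the positions where it occurs. A minimal sketch of flattening such an index back into running text (the toy index below is illustrative, not the record's full index):

```python
# Rebuild readable text from an OpenAlex-style abstract_inverted_index,
# i.e. a {token: [positions, ...]} mapping.
def rebuild_abstract(inverted_index):
    pairs = [(pos, word)
             for word, positions in inverted_index.items()
             for pos in positions]
    return " ".join(word for _, word in sorted(pairs))

toy_index = {"Abstract": [0], "The": [1], "unavailability": [2], "of": [3]}
print(rebuild_abstract(toy_index))  # -> "Abstract The unavailability of"
```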