A Systematic Review on Affective Computing: Emotion Models, Databases, and Recent Advances
2022 · Open Access · DOI: https://doi.org/10.48550/arxiv.2203.06935
Affective computing plays a key role in human-computer interaction, entertainment, teaching, safe driving, and multimedia integration. Major breakthroughs have recently been made in affective computing (i.e., emotion recognition and sentiment analysis). Affective computing is realized based on unimodal or multimodal data, primarily consisting of physical information (e.g., textual, audio, and visual data) and physiological signals (e.g., EEG and ECG signals). Physical-based affect recognition attracts more researchers owing to its many public databases. However, it is hard to reveal inner emotions that are purposely hidden behind facial expressions, audio tones, body gestures, etc. Physiological signals can generate more precise and reliable emotional results; yet, the difficulty of acquiring physiological signals also hinders their practical application. Thus, the fusion of physical information and physiological signals can provide useful features of emotional states and lead to higher accuracy. Instead of focusing on one specific field of affective analysis, we systematically review recent advances in affective computing and taxonomize unimodal affect recognition as well as multimodal affective analysis. Firstly, we introduce two typical emotion models, followed by commonly used databases for affective computing. Next, we survey and taxonomize state-of-the-art unimodal affect recognition and multimodal affective analysis in terms of their detailed architectures and performance. Finally, we discuss some important aspects of affective computing and its applications, and conclude this review by indicating the most promising future directions, such as the establishment of baseline datasets, fusion strategies for multimodal affective analysis, and unsupervised learning models.
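The fusion idea discussed in the abstract can be made concrete with a toy feature-level (early) fusion example. The sketch below is illustrative only and not the paper's method: it assumes hypothetical pre-extracted visual and EEG feature vectors (here just random arrays) and trains a simple classifier on their concatenation.

```python
# Illustrative early-fusion sketch (not from the paper): concatenate two
# modalities' feature vectors and classify discrete emotions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_visual, n_eeg = 200, 64, 32            # synthetic sizes, not from the paper

X_visual = rng.normal(size=(n_samples, n_visual))   # e.g. facial-expression embeddings
X_eeg = rng.normal(size=(n_samples, n_eeg))         # e.g. EEG band-power features
y = rng.integers(0, 4, size=n_samples)              # 4 hypothetical emotion classes

# Feature-level fusion: simple concatenation of the two modalities.
X_fused = np.concatenate([X_visual, X_eeg], axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(X_fused, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```

Decision-level (late) fusion, also surveyed in the paper, would instead train one model per modality and combine their predictions.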
- Type: review
- Language: en
- Landing Page: http://arxiv.org/abs/2203.06935
- PDF URL: https://arxiv.org/pdf/2203.06935
- OA Status: green
- Cited By: 2
- Related Works: 10
- OpenAlex ID: https://openalex.org/W4221141542
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4221141542 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.48550/arxiv.2203.06935 (Digital Object Identifier)
- Title: A Systematic Review on Affective Computing: Emotion Models, Databases, and Recent Advances (work title)
- Type: review (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2022 (year of publication)
- Publication date: 2022-03-14 (full publication date if available)
- Authors: Yan Wang, Wei Song, Wei Tao, Antonio Liotta, Dawei Yang, Xinlei Li, Shuyong Gao, Yixuan Sun, Weifeng Ge, Wei Zhang, Wenqiang Zhang (list of authors in order)
- Landing page: https://arxiv.org/abs/2203.06935 (publisher landing page)
- PDF URL: https://arxiv.org/pdf/2203.06935 (direct link to full-text PDF)
- Open access: Yes (whether a free full text is available)
- OA status: green (open access status per OpenAlex)
- OA URL: https://arxiv.org/pdf/2203.06935 (direct OA link when available)
- Concepts: Affective computing, Affect (linguistics), Computer science, Field (mathematics), Gesture, Emotion recognition, Sentiment analysis, Multimodality, Emotion classification, Facial expression, Artificial intelligence, Human–computer interaction, Psychology, World Wide Web, Communication, Mathematics, Pure mathematics (top concepts/fields attached by OpenAlex)
- Cited by: 2 (total citation count in OpenAlex)
- Citations by year (recent): 2023: 1, 2022: 1 (per-year citation counts for the last 5 years)
- Related works (count): 10 (other works algorithmically related by OpenAlex)
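The fields summarized above mirror the work object returned by the public OpenAlex REST API. As a minimal sketch (endpoint and field names follow the OpenAlex docs at the time of writing; the `requests` dependency is an assumption), the record for this work can be fetched and a few of these fields printed:

```python
# Fetch this work's record from the OpenAlex API and print selected fields.
import requests

work_id = "W4221141542"
resp = requests.get(f"https://api.openalex.org/works/{work_id}", timeout=30)
resp.raise_for_status()
work = resp.json()

print(work["display_name"])
print(work["publication_date"], work["open_access"]["oa_status"])
print([a["author"]["display_name"] for a in work["authorships"]])
```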
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4221141542 |
| doi | https://doi.org/10.48550/arxiv.2203.06935 |
| ids.doi | https://doi.org/10.48550/arxiv.2203.06935 |
| ids.openalex | https://openalex.org/W4221141542 |
| fwci | |
| type | review |
| title | A Systematic Review on Affective Computing: Emotion Models, Databases, and Recent Advances |
| biblio.issue | |
| biblio.volume | |
| biblio.last_page | |
| biblio.first_page | |
| topics[0].id | https://openalex.org/T10667 |
| topics[0].field.id | https://openalex.org/fields/32 |
| topics[0].field.display_name | Psychology |
| topics[0].score | 0.9534000158309937 |
| topics[0].domain.id | https://openalex.org/domains/2 |
| topics[0].domain.display_name | Social Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/3205 |
| topics[0].subfield.display_name | Experimental and Cognitive Psychology |
| topics[0].display_name | Emotion and Mood Recognition |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| concepts[0].id | https://openalex.org/C6438553 |
| concepts[0].level | 2 |
| concepts[0].score | 0.8843011260032654 |
| concepts[0].wikidata | https://www.wikidata.org/wiki/Q1185804 |
| concepts[0].display_name | Affective computing |
| concepts[1].id | https://openalex.org/C2776035688 |
| concepts[1].level | 2 |
| concepts[1].score | 0.6716889142990112 |
| concepts[1].wikidata | https://www.wikidata.org/wiki/Q1606558 |
| concepts[1].display_name | Affect (linguistics) |
| concepts[2].id | https://openalex.org/C41008148 |
| concepts[2].level | 0 |
| concepts[2].score | 0.6491525769233704 |
| concepts[2].wikidata | https://www.wikidata.org/wiki/Q21198 |
| concepts[2].display_name | Computer science |
| concepts[3].id | https://openalex.org/C9652623 |
| concepts[3].level | 2 |
| concepts[3].score | 0.5977230072021484 |
| concepts[3].wikidata | https://www.wikidata.org/wiki/Q190109 |
| concepts[3].display_name | Field (mathematics) |
| concepts[4].id | https://openalex.org/C207347870 |
| concepts[4].level | 2 |
| concepts[4].score | 0.577089250087738 |
| concepts[4].wikidata | https://www.wikidata.org/wiki/Q371174 |
| concepts[4].display_name | Gesture |
| concepts[5].id | https://openalex.org/C2777438025 |
| concepts[5].level | 2 |
| concepts[5].score | 0.5612120032310486 |
| concepts[5].wikidata | https://www.wikidata.org/wiki/Q1339090 |
| concepts[5].display_name | Emotion recognition |
| concepts[6].id | https://openalex.org/C66402592 |
| concepts[6].level | 2 |
| concepts[6].score | 0.4986386299133301 |
| concepts[6].wikidata | https://www.wikidata.org/wiki/Q2271421 |
| concepts[6].display_name | Sentiment analysis |
| concepts[7].id | https://openalex.org/C2780910867 |
| concepts[7].level | 2 |
| concepts[7].score | 0.48010945320129395 |
| concepts[7].wikidata | https://www.wikidata.org/wiki/Q1952416 |
| concepts[7].display_name | Multimodality |
| concepts[8].id | https://openalex.org/C206310091 |
| concepts[8].level | 2 |
| concepts[8].score | 0.44793710112571716 |
| concepts[8].wikidata | https://www.wikidata.org/wiki/Q750859 |
| concepts[8].display_name | Emotion classification |
| concepts[9].id | https://openalex.org/C195704467 |
| concepts[9].level | 2 |
| concepts[9].score | 0.44556158781051636 |
| concepts[9].wikidata | https://www.wikidata.org/wiki/Q327968 |
| concepts[9].display_name | Facial expression |
| concepts[10].id | https://openalex.org/C154945302 |
| concepts[10].level | 1 |
| concepts[10].score | 0.3657371699810028 |
| concepts[10].wikidata | https://www.wikidata.org/wiki/Q11660 |
| concepts[10].display_name | Artificial intelligence |
| concepts[11].id | https://openalex.org/C107457646 |
| concepts[11].level | 1 |
| concepts[11].score | 0.3256242275238037 |
| concepts[11].wikidata | https://www.wikidata.org/wiki/Q207434 |
| concepts[11].display_name | Human–computer interaction |
| concepts[12].id | https://openalex.org/C15744967 |
| concepts[12].level | 0 |
| concepts[12].score | 0.17204442620277405 |
| concepts[12].wikidata | https://www.wikidata.org/wiki/Q9418 |
| concepts[12].display_name | Psychology |
| concepts[13].id | https://openalex.org/C136764020 |
| concepts[13].level | 1 |
| concepts[13].score | 0.09145790338516235 |
| concepts[13].wikidata | https://www.wikidata.org/wiki/Q466 |
| concepts[13].display_name | World Wide Web |
| concepts[14].id | https://openalex.org/C46312422 |
| concepts[14].level | 1 |
| concepts[14].score | 0.08741962909698486 |
| concepts[14].wikidata | https://www.wikidata.org/wiki/Q11024 |
| concepts[14].display_name | Communication |
| concepts[15].id | https://openalex.org/C33923547 |
| concepts[15].level | 0 |
| concepts[15].score | 0.0 |
| concepts[15].wikidata | https://www.wikidata.org/wiki/Q395 |
| concepts[15].display_name | Mathematics |
| concepts[16].id | https://openalex.org/C202444582 |
| concepts[16].level | 1 |
| concepts[16].score | 0.0 |
| concepts[16].wikidata | https://www.wikidata.org/wiki/Q837863 |
| concepts[16].display_name | Pure mathematics |
| keywords[0].id | https://openalex.org/keywords/affective-computing |
| keywords[0].score | 0.8843011260032654 |
| keywords[0].display_name | Affective computing |
| keywords[1].id | https://openalex.org/keywords/affect |
| keywords[1].score | 0.6716889142990112 |
| keywords[1].display_name | Affect (linguistics) |
| keywords[2].id | https://openalex.org/keywords/computer-science |
| keywords[2].score | 0.6491525769233704 |
| keywords[2].display_name | Computer science |
| keywords[3].id | https://openalex.org/keywords/field |
| keywords[3].score | 0.5977230072021484 |
| keywords[3].display_name | Field (mathematics) |
| keywords[4].id | https://openalex.org/keywords/gesture |
| keywords[4].score | 0.577089250087738 |
| keywords[4].display_name | Gesture |
| keywords[5].id | https://openalex.org/keywords/emotion-recognition |
| keywords[5].score | 0.5612120032310486 |
| keywords[5].display_name | Emotion recognition |
| keywords[6].id | https://openalex.org/keywords/sentiment-analysis |
| keywords[6].score | 0.4986386299133301 |
| keywords[6].display_name | Sentiment analysis |
| keywords[7].id | https://openalex.org/keywords/multimodality |
| keywords[7].score | 0.48010945320129395 |
| keywords[7].display_name | Multimodality |
| keywords[8].id | https://openalex.org/keywords/emotion-classification |
| keywords[8].score | 0.44793710112571716 |
| keywords[8].display_name | Emotion classification |
| keywords[9].id | https://openalex.org/keywords/facial-expression |
| keywords[9].score | 0.44556158781051636 |
| keywords[9].display_name | Facial expression |
| keywords[10].id | https://openalex.org/keywords/artificial-intelligence |
| keywords[10].score | 0.3657371699810028 |
| keywords[10].display_name | Artificial intelligence |
| keywords[11].id | https://openalex.org/keywords/human–computer-interaction |
| keywords[11].score | 0.3256242275238037 |
| keywords[11].display_name | Human–computer interaction |
| keywords[12].id | https://openalex.org/keywords/psychology |
| keywords[12].score | 0.17204442620277405 |
| keywords[12].display_name | Psychology |
| keywords[13].id | https://openalex.org/keywords/world-wide-web |
| keywords[13].score | 0.09145790338516235 |
| keywords[13].display_name | World Wide Web |
| keywords[14].id | https://openalex.org/keywords/communication |
| keywords[14].score | 0.08741962909698486 |
| keywords[14].display_name | Communication |
| language | en |
| locations[0].id | pmh:oai:arXiv.org:2203.06935 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S4306400194 |
| locations[0].source.issn | |
| locations[0].source.type | repository |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | |
| locations[0].source.is_core | False |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | arXiv (Cornell University) |
| locations[0].source.host_organization | https://openalex.org/I205783295 |
| locations[0].source.host_organization_name | Cornell University |
| locations[0].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[0].license | |
| locations[0].pdf_url | https://arxiv.org/pdf/2203.06935 |
| locations[0].version | submittedVersion |
| locations[0].raw_type | text |
| locations[0].license_id | |
| locations[0].is_accepted | False |
| locations[0].is_published | False |
| locations[0].raw_source_name | |
| locations[0].landing_page_url | http://arxiv.org/abs/2203.06935 |
| locations[1].id | doi:10.48550/arxiv.2203.06935 |
| locations[1].is_oa | True |
| locations[1].source.id | https://openalex.org/S4306400194 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | True |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | arXiv (Cornell University) |
| locations[1].source.host_organization | https://openalex.org/I205783295 |
| locations[1].source.host_organization_name | Cornell University |
| locations[1].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[1].license | |
| locations[1].pdf_url | |
| locations[1].version | |
| locations[1].raw_type | article |
| locations[1].license_id | |
| locations[1].is_accepted | False |
| locations[1].is_published | |
| locations[1].raw_source_name | |
| locations[1].landing_page_url | https://doi.org/10.48550/arxiv.2203.06935 |
| indexed_in | arxiv, datacite |
| authorships[0].author.id | https://openalex.org/A5100711836 |
| authorships[0].author.orcid | https://orcid.org/0000-0003-3664-7933 |
| authorships[0].author.display_name | Yan Wang |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Wang, Yan |
| authorships[0].is_corresponding | False |
| authorships[1].author.id | https://openalex.org/A5100636955 |
| authorships[1].author.orcid | https://orcid.org/0000-0002-8512-0663 |
| authorships[1].author.display_name | Wei Song |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Song, Wei |
| authorships[1].is_corresponding | False |
| authorships[2].author.id | https://openalex.org/A5100660697 |
| authorships[2].author.orcid | https://orcid.org/0000-0002-4277-3728 |
| authorships[2].author.display_name | Wei Tao |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | Tao, Wei |
| authorships[2].is_corresponding | False |
| authorships[3].author.id | https://openalex.org/A5026941307 |
| authorships[3].author.orcid | https://orcid.org/0000-0002-2773-4421 |
| authorships[3].author.display_name | Antonio Liotta |
| authorships[3].author_position | middle |
| authorships[3].raw_author_name | Liotta, Antonio |
| authorships[3].is_corresponding | False |
| authorships[4].author.id | https://openalex.org/A5107958561 |
| authorships[4].author.orcid | https://orcid.org/0009-0002-0619-5059 |
| authorships[4].author.display_name | Dawei Yang |
| authorships[4].author_position | middle |
| authorships[4].raw_author_name | Yang, Dawei |
| authorships[4].is_corresponding | False |
| authorships[5].author.id | https://openalex.org/A5100664392 |
| authorships[5].author.orcid | https://orcid.org/0000-0001-5333-7676 |
| authorships[5].author.display_name | Xinlei Li |
| authorships[5].author_position | middle |
| authorships[5].raw_author_name | Li, Xinlei |
| authorships[5].is_corresponding | False |
| authorships[6].author.id | https://openalex.org/A5048271189 |
| authorships[6].author.orcid | https://orcid.org/0000-0002-8992-0756 |
| authorships[6].author.display_name | Shuyong Gao |
| authorships[6].author_position | middle |
| authorships[6].raw_author_name | Gao, Shuyong |
| authorships[6].is_corresponding | False |
| authorships[7].author.id | https://openalex.org/A5029261205 |
| authorships[7].author.orcid | https://orcid.org/0000-0003-1109-3380 |
| authorships[7].author.display_name | Yixuan Sun |
| authorships[7].author_position | middle |
| authorships[7].raw_author_name | Sun, Yixuan |
| authorships[7].is_corresponding | False |
| authorships[8].author.id | https://openalex.org/A5101647216 |
| authorships[8].author.orcid | https://orcid.org/0009-0000-6627-5101 |
| authorships[8].author.display_name | Weifeng Ge |
| authorships[8].author_position | middle |
| authorships[8].raw_author_name | Ge, Weifeng |
| authorships[8].is_corresponding | False |
| authorships[9].author.id | https://openalex.org/A5100441703 |
| authorships[9].author.orcid | https://orcid.org/0000-0002-8208-3342 |
| authorships[9].author.display_name | Wei Zhang |
| authorships[9].author_position | middle |
| authorships[9].raw_author_name | Zhang, Wei |
| authorships[9].is_corresponding | False |
| authorships[10].author.id | https://openalex.org/A5100669255 |
| authorships[10].author.orcid | https://orcid.org/0000-0002-3339-8751 |
| authorships[10].author.display_name | Wenqiang Zhang |
| authorships[10].author_position | last |
| authorships[10].raw_author_name | Zhang, Wenqiang |
| authorships[10].is_corresponding | False |
| has_content.pdf | False |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://arxiv.org/pdf/2203.06935 |
| open_access.oa_status | green |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | A Systematic Review on Affective Computing: Emotion Models, Databases, and Recent Advances |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-06T06:51:31.235846 |
| primary_topic.id | https://openalex.org/T10667 |
| primary_topic.field.id | https://openalex.org/fields/32 |
| primary_topic.field.display_name | Psychology |
| primary_topic.score | 0.9534000158309937 |
| primary_topic.domain.id | https://openalex.org/domains/2 |
| primary_topic.domain.display_name | Social Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/3205 |
| primary_topic.subfield.display_name | Experimental and Cognitive Psychology |
| primary_topic.display_name | Emotion and Mood Recognition |
| related_works | https://openalex.org/W2945121592, https://openalex.org/W3080495370, https://openalex.org/W2584926856, https://openalex.org/W2075935902, https://openalex.org/W3000867607, https://openalex.org/W2014713986, https://openalex.org/W3214419959, https://openalex.org/W4380370144, https://openalex.org/W2787157782, https://openalex.org/W2903515201 |
| cited_by_count | 2 |
| counts_by_year[0].year | 2023 |
| counts_by_year[0].cited_by_count | 1 |
| counts_by_year[1].year | 2022 |
| counts_by_year[1].cited_by_count | 1 |
| locations_count | 2 |
| best_oa_location.id | pmh:oai:arXiv.org:2203.06935 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S4306400194 |
| best_oa_location.source.issn | |
| best_oa_location.source.type | repository |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | |
| best_oa_location.source.is_core | False |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | arXiv (Cornell University) |
| best_oa_location.source.host_organization | https://openalex.org/I205783295 |
| best_oa_location.source.host_organization_name | Cornell University |
| best_oa_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| best_oa_location.license | |
| best_oa_location.pdf_url | https://arxiv.org/pdf/2203.06935 |
| best_oa_location.version | submittedVersion |
| best_oa_location.raw_type | text |
| best_oa_location.license_id | |
| best_oa_location.is_accepted | False |
| best_oa_location.is_published | False |
| best_oa_location.raw_source_name | |
| best_oa_location.landing_page_url | http://arxiv.org/abs/2203.06935 |
| primary_location.id | pmh:oai:arXiv.org:2203.06935 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S4306400194 |
| primary_location.source.issn | |
| primary_location.source.type | repository |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | |
| primary_location.source.is_core | False |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | arXiv (Cornell University) |
| primary_location.source.host_organization | https://openalex.org/I205783295 |
| primary_location.source.host_organization_name | Cornell University |
| primary_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| primary_location.license | |
| primary_location.pdf_url | https://arxiv.org/pdf/2203.06935 |
| primary_location.version | submittedVersion |
| primary_location.raw_type | text |
| primary_location.license_id | |
| primary_location.is_accepted | False |
| primary_location.is_published | False |
| primary_location.raw_source_name | |
| primary_location.landing_page_url | http://arxiv.org/abs/2203.06935 |
| publication_date | 2022-03-14 |
| publication_year | 2022 |
| referenced_works_count | 0 |
| abstract_inverted_index.a | 3 |
| abstract_inverted_index.an | 220 |
| abstract_inverted_index.as | 161, 163, 229 |
| abstract_inverted_index.by | 175 |
| abstract_inverted_index.in | 6, 22, 107, 152, 195 |
| abstract_inverted_index.is | 36, 77 |
| abstract_inverted_index.it | 76 |
| abstract_inverted_index.of | 25, 46, 119, 129, 138, 144, 197, 222, 232 |
| abstract_inverted_index.on | 39, 140, 209 |
| abstract_inverted_index.or | 41 |
| abstract_inverted_index.to | 67, 71, 79, 134 |
| abstract_inverted_index.we | 147, 168, 183, 204 |
| abstract_inverted_index.ECG | 61 |
| abstract_inverted_index.EEG | 59 |
| abstract_inverted_index.and | 13, 31, 52, 55, 60, 100, 122, 132, 156, 185, 191, 201, 212, 215, 241 |
| abstract_inverted_index.can | 96, 125 |
| abstract_inverted_index.due | 70 |
| abstract_inverted_index.for | 179, 237 |
| abstract_inverted_index.key | 4 |
| abstract_inverted_index.one | 141 |
| abstract_inverted_index.the | 23, 105, 117, 153, 223, 230 |
| abstract_inverted_index.two | 170 |
| abstract_inverted_index.also | 111 |
| abstract_inverted_index.been | 19 |
| abstract_inverted_index.body | 91 |
| abstract_inverted_index.etc. | 93 |
| abstract_inverted_index.from | 86 |
| abstract_inverted_index.hard | 78 |
| abstract_inverted_index.have | 18 |
| abstract_inverted_index.lead | 133 |
| abstract_inverted_index.made | 20 |
| abstract_inverted_index.more | 68, 98 |
| abstract_inverted_index.most | 224 |
| abstract_inverted_index.role | 5 |
| abstract_inverted_index.safe | 11 |
| abstract_inverted_index.some | 206 |
| abstract_inverted_index.such | 228 |
| abstract_inverted_index.this | 217 |
| abstract_inverted_index.used | 177 |
| abstract_inverted_index.well | 162 |
| abstract_inverted_index.with | 219 |
| abstract_inverted_index.yet, | 104 |
| abstract_inverted_index.Major | 16 |
| abstract_inverted_index.Next, | 182 |
| abstract_inverted_index.Thus, | 116 |
| abstract_inverted_index.areas | 24 |
| abstract_inverted_index.audio | 89 |
| abstract_inverted_index.based | 38 |
| abstract_inverted_index.data) | 54 |
| abstract_inverted_index.data, | 43 |
| abstract_inverted_index.field | 143 |
| abstract_inverted_index.inner | 82 |
| abstract_inverted_index.one's | 81 |
| abstract_inverted_index.plays | 2 |
| abstract_inverted_index.terms | 196 |
| abstract_inverted_index.their | 113, 198, 213 |
| abstract_inverted_index.(e.g., | 49, 58 |
| abstract_inverted_index.(i.e., | 28 |
| abstract_inverted_index.affect | 64, 159, 189 |
| abstract_inverted_index.audio, | 51 |
| abstract_inverted_index.caters | 66 |
| abstract_inverted_index.facial | 87 |
| abstract_inverted_index.fusion | 118, 235 |
| abstract_inverted_index.future | 226 |
| abstract_inverted_index.hidden | 84 |
| abstract_inverted_index.higher | 135 |
| abstract_inverted_index.models | 173 |
| abstract_inverted_index.public | 73 |
| abstract_inverted_index.recent | 150 |
| abstract_inverted_index.reveal | 80 |
| abstract_inverted_index.review | 149, 218 |
| abstract_inverted_index.states | 131 |
| abstract_inverted_index.survey | 184 |
| abstract_inverted_index.tones, | 90 |
| abstract_inverted_index.useful | 127 |
| abstract_inverted_index.visual | 53 |
| abstract_inverted_index.Instead | 137 |
| abstract_inverted_index.aspects | 208 |
| abstract_inverted_index.discuss | 205 |
| abstract_inverted_index.emotion | 29, 83, 172 |
| abstract_inverted_index.hinders | 112 |
| abstract_inverted_index.models. | 244 |
| abstract_inverted_index.precise | 99 |
| abstract_inverted_index.provide | 126 |
| abstract_inverted_index.signals | 57, 95, 110, 124 |
| abstract_inverted_index.typical | 171 |
| abstract_inverted_index.Finally, | 203 |
| abstract_inverted_index.Firstly, | 167 |
| abstract_inverted_index.However, | 75 |
| abstract_inverted_index.advances | 151 |
| abstract_inverted_index.analysis | 194 |
| abstract_inverted_index.baseline | 233 |
| abstract_inverted_index.commonly | 176 |
| abstract_inverted_index.conclude | 216 |
| abstract_inverted_index.dataset, | 234 |
| abstract_inverted_index.detailed | 199 |
| abstract_inverted_index.driving, | 12 |
| abstract_inverted_index.features | 128 |
| abstract_inverted_index.focusing | 139 |
| abstract_inverted_index.followed | 174 |
| abstract_inverted_index.generate | 97 |
| abstract_inverted_index.learning | 243 |
| abstract_inverted_index.multiple | 72 |
| abstract_inverted_index.physical | 47, 120 |
| abstract_inverted_index.realized | 37 |
| abstract_inverted_index.recently | 21 |
| abstract_inverted_index.reliable | 101 |
| abstract_inverted_index.results; | 103 |
| abstract_inverted_index.specific | 142 |
| abstract_inverted_index.textual, | 50 |
| abstract_inverted_index.unimodal | 40, 158, 188 |
| abstract_inverted_index.Affective | 0, 34 |
| abstract_inverted_index.accuracy. | 136 |
| abstract_inverted_index.acquiring | 108 |
| abstract_inverted_index.affective | 26, 145, 154, 165, 180, 193, 210, 239 |
| abstract_inverted_index.analysis, | 146, 240 |
| abstract_inverted_index.analysis. | 166 |
| abstract_inverted_index.computing | 1, 27, 35, 211 |
| abstract_inverted_index.databases | 178 |
| abstract_inverted_index.emotional | 102, 130 |
| abstract_inverted_index.gestures, | 92 |
| abstract_inverted_index.important | 207 |
| abstract_inverted_index.introduce | 169 |
| abstract_inverted_index.practical | 114 |
| abstract_inverted_index.primarily | 44 |
| abstract_inverted_index.promising | 225 |
| abstract_inverted_index.purposely | 85 |
| abstract_inverted_index.sentiment | 32 |
| abstract_inverted_index.signals). | 62 |
| abstract_inverted_index.teaching, | 10 |
| abstract_inverted_index.analysis). | 33 |
| abstract_inverted_index.computing, | 155 |
| abstract_inverted_index.computing. | 181 |
| abstract_inverted_index.consisting | 45 |
| abstract_inverted_index.databases. | 74 |
| abstract_inverted_index.difficulty | 106 |
| abstract_inverted_index.indication | 221 |
| abstract_inverted_index.multimedia | 14 |
| abstract_inverted_index.multimodal | 42, 164, 192, 238 |
| abstract_inverted_index.strategies | 236 |
| abstract_inverted_index.taxonomize | 157, 186 |
| abstract_inverted_index.directions, | 227 |
| abstract_inverted_index.information | 48, 121 |
| abstract_inverted_index.recognition | 30, 65, 160, 190 |
| abstract_inverted_index.researchers | 69 |
| abstract_inverted_index.application. | 115 |
| abstract_inverted_index.applications | 214 |
| abstract_inverted_index.expressions, | 88 |
| abstract_inverted_index.integration. | 15 |
| abstract_inverted_index.unsupervised | 242 |
| abstract_inverted_index.Physiological | 94 |
| abstract_inverted_index.architectures | 200 |
| abstract_inverted_index.breakthroughs | 17 |
| abstract_inverted_index.establishment | 231 |
| abstract_inverted_index.interactions, | 8 |
| abstract_inverted_index.performances. | 202 |
| abstract_inverted_index.physiological | 56, 109, 123 |
| abstract_inverted_index.Physical-based | 63 |
| abstract_inverted_index.entertainment, | 9 |
| abstract_inverted_index.human-computer | 7 |
| abstract_inverted_index.systematically | 148 |
| abstract_inverted_index.state-of-the-art | 187 |
| cited_by_percentile_year | |
| countries_distinct_count | 0 |
| institutions_distinct_count | 11 |
| sustainable_development_goals[0].id | https://metadata.un.org/sdg/4 |
| sustainable_development_goals[0].score | 0.6399999856948853 |
| sustainable_development_goals[0].display_name | Quality Education |
| citation_normalized_percentile |
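One quirk of the payload above is that OpenAlex stores the abstract as an `abstract_inverted_index`, a mapping from each token to the positions at which it occurs. A small, illustrative helper (assuming the index has already been loaded, e.g. via the API call sketched earlier) can rebuild the plain-text abstract:

```python
# Rebuild a plain-text abstract from an OpenAlex abstract_inverted_index,
# which maps each token to the list of positions where it appears.
def abstract_from_inverted_index(inverted_index: dict[str, list[int]]) -> str:
    positions = {}
    for token, idxs in inverted_index.items():
        for i in idxs:
            positions[i] = token
    return " ".join(positions[i] for i in sorted(positions))

# Tiny excerpt of the index shown in the payload table above:
sample = {"Affective": [0], "computing": [1], "plays": [2], "a": [3], "key": [4], "role": [5]}
print(abstract_from_inverted_index(sample))  # -> "Affective computing plays a key role"
```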