Analyzing Continuous-Time and Sentence-Level Annotations for Speech Emotion Recognition
2024 · Open Access · DOI: https://doi.org/10.1109/taffc.2024.3372380
The emotional content of several databases is annotated with continuous-time (CT) annotations, providing traces with frame-by-frame scores describing the instantaneous value of an emotional attribute. However, having a single score describing the global emotion of a short segment is more convenient for several emotion recognition formulations. A common approach is to derive sentence-level (SL) labels from CT annotations by aggregating the values of the emotional traces across time and annotators. How similar are these aggregated SL labels to labels originally collected at the sentence level? The release of the MSP-Podcast (SL annotations) and MSP-Conversation (CT annotations) corpora provides the resources to explore the validity of aggregating SL labels from CT annotations. There are 2,884 speech segments that belong to both corpora. Using this set, this study (1) compares both types of annotations using statistical metrics, (2) evaluates their inter-evaluator agreements, and (3) explores the effect of these SL labels on speech emotion recognition (SER) tasks. The analysis reveals benefits of using SL labels derived from CT annotations in the estimation of valence. It also provides insights into how the two types of labels differ and how that could affect a model.
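As context for the aggregation the abstract describes, below is a minimal sketch of one common way to collapse CT annotation traces into a single SL score for a segment (mean over the segment's frames, then mean across annotators). The function name, frame rate, and pooling choice are illustrative assumptions, not necessarily the paper's exact procedure.

```python
import numpy as np

def ct_to_sl(traces, seg_start, seg_end, frame_rate=60.0):
    """Collapse continuous-time (CT) traces into one sentence-level (SL) score.

    traces     : list of 1-D arrays, one per annotator, holding frame-by-frame
                 scores of an emotional attribute (e.g., valence) for a recording.
    seg_start  : segment start time in seconds.
    seg_end    : segment end time in seconds.
    frame_rate : annotation frames per second (illustrative value).
    """
    lo, hi = int(seg_start * frame_rate), int(seg_end * frame_rate)
    # Pool over time within the segment for each annotator, then pool
    # across annotators to obtain a single SL label.
    per_annotator = [np.mean(t[lo:hi]) for t in traces if len(t) >= hi]
    return float(np.mean(per_annotator))
```

Other pooling choices (median, trimmed mean, annotator weighting) are equally plausible; the point is only that one number replaces the frame-level trace.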
- Type: article
- Language: en
- Landing Page: https://doi.org/10.1109/taffc.2024.3372380
- PDF: https://ieeexplore.ieee.org/ielx7/5165369/5520654/10457571.pdf
- OA Status: hybrid
- Cited By: 10
- References: 63
- Related Works: 10
- OpenAlex ID: https://openalex.org/W4392397287
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4392397287 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.1109/taffc.2024.3372380 (Digital Object Identifier)
- Title: Analyzing Continuous-Time and Sentence-Level Annotations for Speech Emotion Recognition (work title)
- Type: article (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2024 (year of publication)
- Publication date: 2024-03-01 (full publication date if available)
- Authors: Luz Martinez-Lucas, Wei-Cheng Lin, Carlos Busso (list of authors in order)
- Landing page: https://doi.org/10.1109/taffc.2024.3372380 (publisher landing page)
- PDF URL: https://ieeexplore.ieee.org/ielx7/5165369/5520654/10457571.pdf (direct link to full-text PDF)
- Open access: Yes (whether a free full text is available)
- OA status: hybrid (open access status per OpenAlex)
- OA URL: https://ieeexplore.ieee.org/ielx7/5165369/5520654/10457571.pdf (direct OA link when available)
- Concepts: Speech recognition, Emotion recognition, Sentence, Natural language processing, Computer science, Artificial intelligence, Psychology (top concepts attached by OpenAlex)
- Cited by: 10 (total citation count in OpenAlex)
- Citations by year (recent): 2025: 5, 2024: 5 (per-year citation counts, last 5 years)
- References (count): 63 (number of works referenced by this work)
- Related works (count): 10 (other works algorithmically related by OpenAlex)
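The fields above mirror an OpenAlex work record. A minimal sketch of retrieving the same record from the public OpenAlex API follows, assuming the documented `https://api.openalex.org/works/{id}` endpoint; the `mailto` address is a placeholder, and the printed keys are fields that appear in the payload below.

```python
import requests

# OpenAlex ID of this work (from the record above).
work_id = "W4392397287"
resp = requests.get(
    f"https://api.openalex.org/works/{work_id}",
    params={"mailto": "you@example.com"},  # optional courtesy parameter
    timeout=30,
)
resp.raise_for_status()
work = resp.json()

print(work["display_name"])           # work title
print(work["doi"])                    # DOI URL
print(work["cited_by_count"])         # total citations
print(work["open_access"]["oa_url"])  # direct OA link, if any
```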
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4392397287 |
| doi | https://doi.org/10.1109/taffc.2024.3372380 |
| ids.doi | https://doi.org/10.1109/taffc.2024.3372380 |
| ids.openalex | https://openalex.org/W4392397287 |
| fwci | 10.96437584 |
| type | article |
| title | Analyzing Continuous-Time and Sentence-Level Annotations for Speech Emotion Recognition |
| awards[0].id | https://openalex.org/G8080363172 |
| awards[0].funder_id | https://openalex.org/F4320306076 |
| awards[0].display_name | |
| awards[0].funder_award_id | CNS-1823166 |
| awards[0].funder_display_name | National Science Foundation |
| awards[1].id | https://openalex.org/G8082180408 |
| awards[1].funder_id | https://openalex.org/F4320306076 |
| awards[1].display_name | |
| awards[1].funder_award_id | CNS-2016719 |
| awards[1].funder_display_name | National Science Foundation |
| biblio.issue | 3 |
| biblio.volume | 15 |
| biblio.last_page | 1768 |
| biblio.first_page | 1754 |
| topics[0].id | https://openalex.org/T10667 |
| topics[0].field.id | https://openalex.org/fields/32 |
| topics[0].field.display_name | Psychology |
| topics[0].score | 0.9998999834060669 |
| topics[0].domain.id | https://openalex.org/domains/2 |
| topics[0].domain.display_name | Social Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/3205 |
| topics[0].subfield.display_name | Experimental and Cognitive Psychology |
| topics[0].display_name | Emotion and Mood Recognition |
| topics[1].id | https://openalex.org/T11309 |
| topics[1].field.id | https://openalex.org/fields/17 |
| topics[1].field.display_name | Computer Science |
| topics[1].score | 0.9972000122070312 |
| topics[1].domain.id | https://openalex.org/domains/3 |
| topics[1].domain.display_name | Physical Sciences |
| topics[1].subfield.id | https://openalex.org/subfields/1711 |
| topics[1].subfield.display_name | Signal Processing |
| topics[1].display_name | Music and Audio Processing |
| topics[2].id | https://openalex.org/T10860 |
| topics[2].field.id | https://openalex.org/fields/17 |
| topics[2].field.display_name | Computer Science |
| topics[2].score | 0.9965999722480774 |
| topics[2].domain.id | https://openalex.org/domains/3 |
| topics[2].domain.display_name | Physical Sciences |
| topics[2].subfield.id | https://openalex.org/subfields/1711 |
| topics[2].subfield.display_name | Signal Processing |
| topics[2].display_name | Speech and Audio Processing |
| funders[0].id | https://openalex.org/F4320306076 |
| funders[0].ror | https://ror.org/021nxhr62 |
| funders[0].display_name | National Science Foundation |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| concepts[0].id | https://openalex.org/C28490314 |
| concepts[0].level | 1 |
| concepts[0].score | 0.7015487551689148 |
| concepts[0].wikidata | https://www.wikidata.org/wiki/Q189436 |
| concepts[0].display_name | Speech recognition |
| concepts[1].id | https://openalex.org/C2777438025 |
| concepts[1].level | 2 |
| concepts[1].score | 0.6757758855819702 |
| concepts[1].wikidata | https://www.wikidata.org/wiki/Q1339090 |
| concepts[1].display_name | Emotion recognition |
| concepts[2].id | https://openalex.org/C2777530160 |
| concepts[2].level | 2 |
| concepts[2].score | 0.6674063801765442 |
| concepts[2].wikidata | https://www.wikidata.org/wiki/Q41796 |
| concepts[2].display_name | Sentence |
| concepts[3].id | https://openalex.org/C204321447 |
| concepts[3].level | 1 |
| concepts[3].score | 0.5863706469535828 |
| concepts[3].wikidata | https://www.wikidata.org/wiki/Q30642 |
| concepts[3].display_name | Natural language processing |
| concepts[4].id | https://openalex.org/C41008148 |
| concepts[4].level | 0 |
| concepts[4].score | 0.5415059328079224 |
| concepts[4].wikidata | https://www.wikidata.org/wiki/Q21198 |
| concepts[4].display_name | Computer science |
| concepts[5].id | https://openalex.org/C154945302 |
| concepts[5].level | 1 |
| concepts[5].score | 0.43281030654907227 |
| concepts[5].wikidata | https://www.wikidata.org/wiki/Q11660 |
| concepts[5].display_name | Artificial intelligence |
| concepts[6].id | https://openalex.org/C15744967 |
| concepts[6].level | 0 |
| concepts[6].score | 0.3388650715351105 |
| concepts[6].wikidata | https://www.wikidata.org/wiki/Q9418 |
| concepts[6].display_name | Psychology |
| keywords[0].id | https://openalex.org/keywords/speech-recognition |
| keywords[0].score | 0.7015487551689148 |
| keywords[0].display_name | Speech recognition |
| keywords[1].id | https://openalex.org/keywords/emotion-recognition |
| keywords[1].score | 0.6757758855819702 |
| keywords[1].display_name | Emotion recognition |
| keywords[2].id | https://openalex.org/keywords/sentence |
| keywords[2].score | 0.6674063801765442 |
| keywords[2].display_name | Sentence |
| keywords[3].id | https://openalex.org/keywords/natural-language-processing |
| keywords[3].score | 0.5863706469535828 |
| keywords[3].display_name | Natural language processing |
| keywords[4].id | https://openalex.org/keywords/computer-science |
| keywords[4].score | 0.5415059328079224 |
| keywords[4].display_name | Computer science |
| keywords[5].id | https://openalex.org/keywords/artificial-intelligence |
| keywords[5].score | 0.43281030654907227 |
| keywords[5].display_name | Artificial intelligence |
| keywords[6].id | https://openalex.org/keywords/psychology |
| keywords[6].score | 0.3388650715351105 |
| keywords[6].display_name | Psychology |
| language | en |
| locations[0].id | doi:10.1109/taffc.2024.3372380 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S104780363 |
| locations[0].source.issn | 1949-3045, 2371-9850 |
| locations[0].source.type | journal |
| locations[0].source.is_oa | False |
| locations[0].source.issn_l | 1949-3045 |
| locations[0].source.is_core | True |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | IEEE Transactions on Affective Computing |
| locations[0].source.host_organization | https://openalex.org/P4310319808 |
| locations[0].source.host_organization_name | Institute of Electrical and Electronics Engineers |
| locations[0].source.host_organization_lineage | https://openalex.org/P4310319808 |
| locations[0].source.host_organization_lineage_names | Institute of Electrical and Electronics Engineers |
| locations[0].license | cc-by |
| locations[0].pdf_url | https://ieeexplore.ieee.org/ielx7/5165369/5520654/10457571.pdf |
| locations[0].version | publishedVersion |
| locations[0].raw_type | journal-article |
| locations[0].license_id | https://openalex.org/licenses/cc-by |
| locations[0].is_accepted | True |
| locations[0].is_published | True |
| locations[0].raw_source_name | IEEE Transactions on Affective Computing |
| locations[0].landing_page_url | https://doi.org/10.1109/taffc.2024.3372380 |
| indexed_in | crossref |
| authorships[0].author.id | https://openalex.org/A5072064784 |
| authorships[0].author.orcid | https://orcid.org/0009-0000-6368-7039 |
| authorships[0].author.display_name | Luz Martinez-Lucas |
| authorships[0].countries | US |
| authorships[0].affiliations[0].institution_ids | https://openalex.org/I162577319 |
| authorships[0].affiliations[0].raw_affiliation_string | Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, USA |
| authorships[0].institutions[0].id | https://openalex.org/I162577319 |
| authorships[0].institutions[0].ror | https://ror.org/049emcs32 |
| authorships[0].institutions[0].type | education |
| authorships[0].institutions[0].lineage | https://openalex.org/I162577319 |
| authorships[0].institutions[0].country_code | US |
| authorships[0].institutions[0].display_name | The University of Texas at Dallas |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Luz Martinez-Lucas |
| authorships[0].is_corresponding | False |
| authorships[0].raw_affiliation_strings | Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, USA |
| authorships[1].author.id | https://openalex.org/A5070819601 |
| authorships[1].author.orcid | https://orcid.org/0000-0003-1933-1590 |
| authorships[1].author.display_name | Wei-Cheng Lin |
| authorships[1].countries | US |
| authorships[1].affiliations[0].institution_ids | https://openalex.org/I162577319 |
| authorships[1].affiliations[0].raw_affiliation_string | Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, USA |
| authorships[1].institutions[0].id | https://openalex.org/I162577319 |
| authorships[1].institutions[0].ror | https://ror.org/049emcs32 |
| authorships[1].institutions[0].type | education |
| authorships[1].institutions[0].lineage | https://openalex.org/I162577319 |
| authorships[1].institutions[0].country_code | US |
| authorships[1].institutions[0].display_name | The University of Texas at Dallas |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Wei-Cheng Lin |
| authorships[1].is_corresponding | False |
| authorships[1].raw_affiliation_strings | Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, USA |
| authorships[2].author.id | https://openalex.org/A5040793194 |
| authorships[2].author.orcid | https://orcid.org/0000-0002-4075-4072 |
| authorships[2].author.display_name | Carlos Busso |
| authorships[2].countries | US |
| authorships[2].affiliations[0].institution_ids | https://openalex.org/I162577319 |
| authorships[2].affiliations[0].raw_affiliation_string | Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, USA |
| authorships[2].institutions[0].id | https://openalex.org/I162577319 |
| authorships[2].institutions[0].ror | https://ror.org/049emcs32 |
| authorships[2].institutions[0].type | education |
| authorships[2].institutions[0].lineage | https://openalex.org/I162577319 |
| authorships[2].institutions[0].country_code | US |
| authorships[2].institutions[0].display_name | The University of Texas at Dallas |
| authorships[2].author_position | last |
| authorships[2].raw_author_name | Carlos Busso |
| authorships[2].is_corresponding | False |
| authorships[2].raw_affiliation_strings | Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, USA |
| has_content.pdf | True |
| has_content.grobid_xml | True |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://ieeexplore.ieee.org/ielx7/5165369/5520654/10457571.pdf |
| open_access.oa_status | hybrid |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | Analyzing Continuous-Time and Sentence-Level Annotations for Speech Emotion Recognition |
| has_fulltext | True |
| is_retracted | False |
| updated_date | 2025-11-06T03:46:38.306776 |
| primary_topic.id | https://openalex.org/T10667 |
| primary_topic.field.id | https://openalex.org/fields/32 |
| primary_topic.field.display_name | Psychology |
| primary_topic.score | 0.9998999834060669 |
| primary_topic.domain.id | https://openalex.org/domains/2 |
| primary_topic.domain.display_name | Social Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/3205 |
| primary_topic.subfield.display_name | Experimental and Cognitive Psychology |
| primary_topic.display_name | Emotion and Mood Recognition |
| related_works | https://openalex.org/W2375873920, https://openalex.org/W2146114872, https://openalex.org/W2392060890, https://openalex.org/W2392760275, https://openalex.org/W2083530853, https://openalex.org/W2009831055, https://openalex.org/W2393172683, https://openalex.org/W3204019825, https://openalex.org/W3126677997, https://openalex.org/W1610857240 |
| cited_by_count | 10 |
| counts_by_year[0].year | 2025 |
| counts_by_year[0].cited_by_count | 5 |
| counts_by_year[1].year | 2024 |
| counts_by_year[1].cited_by_count | 5 |
| locations_count | 1 |
| best_oa_location.id | doi:10.1109/taffc.2024.3372380 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S104780363 |
| best_oa_location.source.issn | 1949-3045, 2371-9850 |
| best_oa_location.source.type | journal |
| best_oa_location.source.is_oa | False |
| best_oa_location.source.issn_l | 1949-3045 |
| best_oa_location.source.is_core | True |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | IEEE Transactions on Affective Computing |
| best_oa_location.source.host_organization | https://openalex.org/P4310319808 |
| best_oa_location.source.host_organization_name | Institute of Electrical and Electronics Engineers |
| best_oa_location.source.host_organization_lineage | https://openalex.org/P4310319808 |
| best_oa_location.source.host_organization_lineage_names | Institute of Electrical and Electronics Engineers |
| best_oa_location.license | cc-by |
| best_oa_location.pdf_url | https://ieeexplore.ieee.org/ielx7/5165369/5520654/10457571.pdf |
| best_oa_location.version | publishedVersion |
| best_oa_location.raw_type | journal-article |
| best_oa_location.license_id | https://openalex.org/licenses/cc-by |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | True |
| best_oa_location.raw_source_name | IEEE Transactions on Affective Computing |
| best_oa_location.landing_page_url | https://doi.org/10.1109/taffc.2024.3372380 |
| primary_location.id | doi:10.1109/taffc.2024.3372380 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S104780363 |
| primary_location.source.issn | 1949-3045, 2371-9850 |
| primary_location.source.type | journal |
| primary_location.source.is_oa | False |
| primary_location.source.issn_l | 1949-3045 |
| primary_location.source.is_core | True |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | IEEE Transactions on Affective Computing |
| primary_location.source.host_organization | https://openalex.org/P4310319808 |
| primary_location.source.host_organization_name | Institute of Electrical and Electronics Engineers |
| primary_location.source.host_organization_lineage | https://openalex.org/P4310319808 |
| primary_location.source.host_organization_lineage_names | Institute of Electrical and Electronics Engineers |
| primary_location.license | cc-by |
| primary_location.pdf_url | https://ieeexplore.ieee.org/ielx7/5165369/5520654/10457571.pdf |
| primary_location.version | publishedVersion |
| primary_location.raw_type | journal-article |
| primary_location.license_id | https://openalex.org/licenses/cc-by |
| primary_location.is_accepted | True |
| primary_location.is_published | True |
| primary_location.raw_source_name | IEEE Transactions on Affective Computing |
| primary_location.landing_page_url | https://doi.org/10.1109/taffc.2024.3372380 |
| publication_date | 2024-03-01 |
| publication_year | 2024 |
| referenced_works | https://openalex.org/W2550557083, https://openalex.org/W2146334809, https://openalex.org/W2097732741, https://openalex.org/W2742542661, https://openalex.org/W2030931454, https://openalex.org/W2054541702, https://openalex.org/W6628256265, https://openalex.org/W2095797119, https://openalex.org/W2132555391, https://openalex.org/W2910165986, https://openalex.org/W6777021087, https://openalex.org/W2403326047, https://openalex.org/W2130162821, https://openalex.org/W1589734697, https://openalex.org/W2404446881, https://openalex.org/W2161459043, https://openalex.org/W2527527138, https://openalex.org/W2899500748, https://openalex.org/W1608705073, https://openalex.org/W2985882473, https://openalex.org/W2964300796, https://openalex.org/W3115382905, https://openalex.org/W3202775307, https://openalex.org/W2145598468, https://openalex.org/W3095093565, https://openalex.org/W4237530378, https://openalex.org/W2134031328, https://openalex.org/W2156848952, https://openalex.org/W6608008786, https://openalex.org/W2963162726, https://openalex.org/W2401417847, https://openalex.org/W2525412388, https://openalex.org/W2177245949, https://openalex.org/W175750906, https://openalex.org/W2143350951, https://openalex.org/W2342475039, https://openalex.org/W1981950962, https://openalex.org/W1785074626, https://openalex.org/W2346454595, https://openalex.org/W2579555219, https://openalex.org/W2900358852, https://openalex.org/W2010700035, https://openalex.org/W2331249327, https://openalex.org/W2148146486, https://openalex.org/W2972935927, https://openalex.org/W2804664105, https://openalex.org/W3086923691, https://openalex.org/W4361994820, https://openalex.org/W2962839749, https://openalex.org/W3164582967, https://openalex.org/W2806649730, https://openalex.org/W2144005487, https://openalex.org/W2085662862, https://openalex.org/W2889100420, https://openalex.org/W2399733683, https://openalex.org/W2404863510, https://openalex.org/W3162840325, https://openalex.org/W4285111045, https://openalex.org/W349893698, https://openalex.org/W2072712810, https://openalex.org/W2407346284, https://openalex.org/W2099767163, https://openalex.org/W2032254851 |
| referenced_works_count | 63 |
| abstract_inverted_index | (word-position index that encodes the abstract shown above; collapsed here; see the reconstruction sketch after the table) |
| cited_by_percentile_year.max | 98 |
| cited_by_percentile_year.min | 97 |
| countries_distinct_count | 1 |
| institutions_distinct_count | 3 |
| citation_normalized_percentile.value | 0.96919878 |
| citation_normalized_percentile.is_in_top_1_percent | False |
| citation_normalized_percentile.is_in_top_10_percent | True |
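The abstract_inverted_index field (collapsed in the table above) stores each token of the abstract together with its word positions. Below is a minimal sketch of rebuilding plain text from such an index; the `sample` dict is a truncated illustration using the first few tokens of the abstract, not the full field.

```python
def rebuild_abstract(inverted_index):
    """Reconstruct abstract text from an OpenAlex-style inverted index
    mapping each token to the list of word positions where it occurs."""
    by_position = {}
    for token, positions in inverted_index.items():
        for pos in positions:
            by_position[pos] = token
    # Order tokens by position and join with single spaces.
    return " ".join(by_position[pos] for pos in sorted(by_position))

# Truncated example: the first six tokens of the abstract above.
sample = {"The": [0], "emotional": [1], "content": [2],
          "of": [3], "several": [4], "databases": [5]}
print(rebuild_abstract(sample))  # -> "The emotional content of several databases"
```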