Assessing random forest performance in low resource speech emotion recognition
2025 · Open Access · DOI: https://doi.org/10.1038/s41598-025-30511-6
In human-computer interaction (HCI), speech emotion recognition (SER) is a pivotal technology that enables machines to decipher human emotions. This study examines the efficacy of a random forest (RF) classifier for SER in Urdu, a notably low-resource language. Focusing on three primary emotions (happiness, sadness, and anger) and using Mel-frequency cepstral coefficients (MFCCs) for feature extraction, we built a model that achieves a validation accuracy of 94.53%. This result highlights the robustness of the RF classifier and represents a significant step towards empathetic artificial intelligence, particularly in improving digital user experiences through emotional understanding. Moreover, our analysis examines the key MFCC features and their crucial role in discriminating between emotions. The work demonstrates the viability and practicality of RF classifiers for building emotionally nuanced artificial intelligence (AI) systems, even under the constraints of resource-scarce languages such as Urdu. In future work, we hope to broaden the scope of our research to a wider spectrum of emotions and investigate how other datasets affect the model's performance, opening new avenues for the advancement of SER technology.
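For readers who want a concrete picture of the pipeline the abstract describes (MFCC features fed to a random forest classifier), the following is a minimal sketch, not the authors' implementation: the dataset layout under `urdu_ser/`, the emotion label names, and hyperparameters such as 13 MFCCs and 100 trees are illustrative assumptions.

```python
# Minimal sketch of an MFCC + random forest SER pipeline in the spirit of the paper.
# NOT the authors' implementation: dataset layout, label names, and hyperparameters
# (13 MFCCs, 100 trees, 80/20 split) are illustrative assumptions.
import glob

import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

EMOTIONS = ["happy", "sad", "angry"]  # the three emotions studied in the paper


def mfcc_features(path: str, n_mfcc: int = 13) -> np.ndarray:
    """Mean MFCC vector over all frames of one utterance."""
    y, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)  # shape: (n_mfcc, n_frames)
    return mfcc.mean(axis=1)


X, labels = [], []
for emotion in EMOTIONS:
    for path in glob.glob(f"urdu_ser/{emotion}/*.wav"):  # hypothetical folder layout
        X.append(mfcc_features(path))
        labels.append(emotion)

X_train, X_val, y_train, y_val = train_test_split(
    np.array(X), labels, test_size=0.2, stratify=labels, random_state=42
)
clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)
print("validation accuracy:", accuracy_score(y_val, clf.predict(X_val)))
```

Mean-pooling MFCC frames into one fixed-length vector per utterance is one common way to feed variable-length audio to a random forest; the paper's exact feature aggregation and hyperparameters may differ.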
Metadata
- Type: article
- Language: en
- Landing Page: https://doi.org/10.1038/s41598-025-30511-6
- PDF: https://www.nature.com/articles/s41598-025-30511-6_reference.pdf
- OA Status: gold
- References: 45
- OpenAlex ID: https://openalex.org/W4417200090
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4417200090 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.1038/s41598-025-30511-6 (Digital Object Identifier)
- Title: Assessing random forest performance in low resource speech emotion recognition
- Type: article (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2025
- Publication date: 2025-12-10
- Authors: Muhammad Adeel, Zhi-Yong Tao, Shu-Ya Jin, Chuanjie Guo, Mohammed Alsuhaibani (listed in order)
- Landing page: https://doi.org/10.1038/s41598-025-30511-6 (publisher landing page)
- PDF URL: https://www.nature.com/articles/s41598-025-30511-6_reference.pdf (direct link to full-text PDF)
- Open access: Yes (a free full text is available)
- OA status: gold (per OpenAlex)
- OA URL: https://www.nature.com/articles/s41598-025-30511-6_reference.pdf (direct OA link)
- Cited by: 0 (total citation count in OpenAlex)
- References (count): 45 (number of works referenced by this work)
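The record above can also be retrieved programmatically. Below is a minimal sketch, assuming the public OpenAlex REST API endpoint pattern `https://api.openalex.org/works/{id}` and the `requests` library; the field names follow the payload shown in the next section.

```python
# Minimal sketch: fetch the same record from the public OpenAlex REST API.
# Assumes the documented endpoint pattern https://api.openalex.org/works/{id}
# and the `requests` library; field names match the payload shown below.
import requests

OPENALEX_WORK_ID = "W4417200090"


def fetch_openalex_work(work_id: str) -> dict:
    """Return the OpenAlex work record as a Python dict."""
    resp = requests.get(f"https://api.openalex.org/works/{work_id}", timeout=30)
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    work = fetch_openalex_work(OPENALEX_WORK_ID)
    print(work["title"])
    print(work["open_access"]["oa_status"])   # gold
    print(work["cited_by_count"])             # 0 at the time this page was generated
```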
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4417200090 |
| doi | https://doi.org/10.1038/s41598-025-30511-6 |
| ids.doi | https://doi.org/10.1038/s41598-025-30511-6 |
| ids.pmid | https://pubmed.ncbi.nlm.nih.gov/41372327 |
| ids.openalex | https://openalex.org/W4417200090 |
| fwci | |
| type | article |
| title | Assessing random forest performance in low resource speech emotion recognition |
| biblio.issue | |
| biblio.volume | |
| biblio.last_page | |
| biblio.first_page | |
| is_xpac | False |
| apc_list.value | 1890 |
| apc_list.currency | EUR |
| apc_list.value_usd | 2190 |
| apc_paid.value | 1890 |
| apc_paid.currency | EUR |
| apc_paid.value_usd | 2190 |
| language | en |
| locations[0].id | doi:10.1038/s41598-025-30511-6 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S196734849 |
| locations[0].source.issn | 2045-2322 |
| locations[0].source.type | journal |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | 2045-2322 |
| locations[0].source.is_core | True |
| locations[0].source.is_in_doaj | True |
| locations[0].source.display_name | Scientific Reports |
| locations[0].source.host_organization | https://openalex.org/P4310319908 |
| locations[0].source.host_organization_name | Nature Portfolio |
| locations[0].source.host_organization_lineage | https://openalex.org/P4310319908, https://openalex.org/P4310319965 |
| locations[0].source.host_organization_lineage_names | Nature Portfolio, Springer Nature |
| locations[0].license | cc-by-nc-nd |
| locations[0].pdf_url | https://www.nature.com/articles/s41598-025-30511-6_reference.pdf |
| locations[0].version | publishedVersion |
| locations[0].raw_type | journal-article |
| locations[0].license_id | https://openalex.org/licenses/cc-by-nc-nd |
| locations[0].is_accepted | True |
| locations[0].is_published | True |
| locations[0].raw_source_name | Scientific Reports |
| locations[0].landing_page_url | https://doi.org/10.1038/s41598-025-30511-6 |
| locations[1].id | pmid:41372327 |
| locations[1].is_oa | False |
| locations[1].source.id | https://openalex.org/S4306525036 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | False |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | PubMed |
| locations[1].source.host_organization | https://openalex.org/I1299303238 |
| locations[1].source.host_organization_name | National Institutes of Health |
| locations[1].source.host_organization_lineage | https://openalex.org/I1299303238 |
| locations[1].license | |
| locations[1].pdf_url | |
| locations[1].version | publishedVersion |
| locations[1].raw_type | |
| locations[1].license_id | |
| locations[1].is_accepted | True |
| locations[1].is_published | True |
| locations[1].raw_source_name | Scientific reports |
| locations[1].landing_page_url | https://pubmed.ncbi.nlm.nih.gov/41372327 |
| indexed_in | crossref, doaj, pubmed |
| authorships[0].author.id | https://openalex.org/A5117489752 |
| authorships[0].author.orcid | https://orcid.org/0009-0008-0831-4908 |
| authorships[0].author.display_name | Muhammad Adeel |
| authorships[0].countries | CN |
| authorships[0].affiliations[0].institution_ids | https://openalex.org/I5343935 |
| authorships[0].affiliations[0].raw_affiliation_string | School of Information and Communication, Guilin University of Electronic Technology, Guilin, 541004, People's Republic of China. [email protected]. |
| authorships[0].institutions[0].id | https://openalex.org/I5343935 |
| authorships[0].institutions[0].ror | https://ror.org/05arjae42 |
| authorships[0].institutions[0].type | education |
| authorships[0].institutions[0].lineage | https://openalex.org/I5343935 |
| authorships[0].institutions[0].country_code | CN |
| authorships[0].institutions[0].display_name | Guilin University of Electronic Technology |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Muhammad Adeel |
| authorships[0].is_corresponding | False |
| authorships[0].raw_affiliation_strings | School of Information and Communication, Guilin University of Electronic Technology, Guilin, 541004, People's Republic of China. [email protected]. |
| authorships[1].author.id | https://openalex.org/A5037894772 |
| authorships[1].author.orcid | https://orcid.org/0000-0003-4732-5059 |
| authorships[1].author.display_name | Zhi-Yong Tao |
| authorships[1].countries | CN |
| authorships[1].affiliations[0].institution_ids | https://openalex.org/I5343935 |
| authorships[1].affiliations[0].raw_affiliation_string | Academy of Marine Information Technology, Guilin University of Electronic Technology, Beihai, 536000, People's Republic of China. |
| authorships[1].institutions[0].id | https://openalex.org/I5343935 |
| authorships[1].institutions[0].ror | https://ror.org/05arjae42 |
| authorships[1].institutions[0].type | education |
| authorships[1].institutions[0].lineage | https://openalex.org/I5343935 |
| authorships[1].institutions[0].country_code | CN |
| authorships[1].institutions[0].display_name | Guilin University of Electronic Technology |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Zhi-Yong Tao |
| authorships[1].is_corresponding | False |
| authorships[1].raw_affiliation_strings | Academy of Marine Information Technology, Guilin University of Electronic Technology, Beihai, 536000, People's Republic of China. |
| authorships[2].author.id | https://openalex.org/A5065821925 |
| authorships[2].author.orcid | https://orcid.org/0000-0003-4505-5526 |
| authorships[2].author.display_name | Shu-Ya Jin |
| authorships[2].countries | CN |
| authorships[2].affiliations[0].institution_ids | https://openalex.org/I5343935 |
| authorships[2].affiliations[0].raw_affiliation_string | Academy of Marine Information Technology, Guilin University of Electronic Technology, Beihai, 536000, People's Republic of China. |
| authorships[2].institutions[0].id | https://openalex.org/I5343935 |
| authorships[2].institutions[0].ror | https://ror.org/05arjae42 |
| authorships[2].institutions[0].type | education |
| authorships[2].institutions[0].lineage | https://openalex.org/I5343935 |
| authorships[2].institutions[0].country_code | CN |
| authorships[2].institutions[0].display_name | Guilin University of Electronic Technology |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | Shu-Ya Jin |
| authorships[2].is_corresponding | False |
| authorships[2].raw_affiliation_strings | Academy of Marine Information Technology, Guilin University of Electronic Technology, Beihai, 536000, People's Republic of China. |
| authorships[3].author.id | https://openalex.org/A5077042200 |
| authorships[3].author.orcid | |
| authorships[3].author.display_name | Chuanjie Guo |
| authorships[3].countries | CN |
| authorships[3].affiliations[0].institution_ids | https://openalex.org/I5343935 |
| authorships[3].affiliations[0].raw_affiliation_string | Academy of Marine Information Technology, Guilin University of Electronic Technology, Beihai, 536000, People's Republic of China. |
| authorships[3].institutions[0].id | https://openalex.org/I5343935 |
| authorships[3].institutions[0].ror | https://ror.org/05arjae42 |
| authorships[3].institutions[0].type | education |
| authorships[3].institutions[0].lineage | https://openalex.org/I5343935 |
| authorships[3].institutions[0].country_code | CN |
| authorships[3].institutions[0].display_name | Guilin University of Electronic Technology |
| authorships[3].author_position | middle |
| authorships[3].raw_author_name | Chuan-Jie Guo |
| authorships[3].is_corresponding | False |
| authorships[3].raw_affiliation_strings | Academy of Marine Information Technology, Guilin University of Electronic Technology, Beihai, 536000, People's Republic of China. |
| authorships[4].author.id | https://openalex.org/A5007714861 |
| authorships[4].author.orcid | https://orcid.org/0000-0001-6567-6413 |
| authorships[4].author.display_name | Mohammed Alsuhaibani |
| authorships[4].countries | SA |
| authorships[4].affiliations[0].institution_ids | https://openalex.org/I156216236 |
| authorships[4].affiliations[0].raw_affiliation_string | Department of Computer Science, College of Computer, Qassim University, Buraydah, 51452, Saudi Arabia. [email protected]. |
| authorships[4].institutions[0].id | https://openalex.org/I156216236 |
| authorships[4].institutions[0].ror | https://ror.org/01wsfe280 |
| authorships[4].institutions[0].type | education |
| authorships[4].institutions[0].lineage | https://openalex.org/I156216236 |
| authorships[4].institutions[0].country_code | SA |
| authorships[4].institutions[0].display_name | Qassim University |
| authorships[4].author_position | last |
| authorships[4].raw_author_name | Mohammed Alsuhaibani |
| authorships[4].is_corresponding | False |
| authorships[4].raw_affiliation_strings | Department of Computer Science, College of Computer, Qassim University, Buraydah, 51452, Saudi Arabia. [email protected]. |
| has_content.pdf | True |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://www.nature.com/articles/s41598-025-30511-6_reference.pdf |
| open_access.oa_status | gold |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-12-10T00:00:00 |
| display_name | Assessing random forest performance in low resource speech emotion recognition |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-12-11T23:09:37.256380 |
| primary_topic | |
| cited_by_count | 0 |
| locations_count | 2 |
| best_oa_location.id | doi:10.1038/s41598-025-30511-6 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S196734849 |
| best_oa_location.source.issn | 2045-2322 |
| best_oa_location.source.type | journal |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | 2045-2322 |
| best_oa_location.source.is_core | True |
| best_oa_location.source.is_in_doaj | True |
| best_oa_location.source.display_name | Scientific Reports |
| best_oa_location.source.host_organization | https://openalex.org/P4310319908 |
| best_oa_location.source.host_organization_name | Nature Portfolio |
| best_oa_location.source.host_organization_lineage | https://openalex.org/P4310319908, https://openalex.org/P4310319965 |
| best_oa_location.source.host_organization_lineage_names | Nature Portfolio, Springer Nature |
| best_oa_location.license | cc-by-nc-nd |
| best_oa_location.pdf_url | https://www.nature.com/articles/s41598-025-30511-6_reference.pdf |
| best_oa_location.version | publishedVersion |
| best_oa_location.raw_type | journal-article |
| best_oa_location.license_id | https://openalex.org/licenses/cc-by-nc-nd |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | True |
| best_oa_location.raw_source_name | Scientific Reports |
| best_oa_location.landing_page_url | https://doi.org/10.1038/s41598-025-30511-6 |
| primary_location.id | doi:10.1038/s41598-025-30511-6 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S196734849 |
| primary_location.source.issn | 2045-2322 |
| primary_location.source.type | journal |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | 2045-2322 |
| primary_location.source.is_core | True |
| primary_location.source.is_in_doaj | True |
| primary_location.source.display_name | Scientific Reports |
| primary_location.source.host_organization | https://openalex.org/P4310319908 |
| primary_location.source.host_organization_name | Nature Portfolio |
| primary_location.source.host_organization_lineage | https://openalex.org/P4310319908, https://openalex.org/P4310319965 |
| primary_location.source.host_organization_lineage_names | Nature Portfolio, Springer Nature |
| primary_location.license | cc-by-nc-nd |
| primary_location.pdf_url | https://www.nature.com/articles/s41598-025-30511-6_reference.pdf |
| primary_location.version | publishedVersion |
| primary_location.raw_type | journal-article |
| primary_location.license_id | https://openalex.org/licenses/cc-by-nc-nd |
| primary_location.is_accepted | True |
| primary_location.is_published | True |
| primary_location.raw_source_name | Scientific Reports |
| primary_location.landing_page_url | https://doi.org/10.1038/s41598-025-30511-6 |
| publication_date | 2025-12-10 |
| publication_year | 2025 |
| referenced_works | https://openalex.org/W4386397728, https://openalex.org/W2997399314, https://openalex.org/W4385240187, https://openalex.org/W4408459007, https://openalex.org/W4283366946, https://openalex.org/W4385322803, https://openalex.org/W4383426356, https://openalex.org/W2009465763, https://openalex.org/W4405761372, https://openalex.org/W4382399592, https://openalex.org/W3130521866, https://openalex.org/W4388821224, https://openalex.org/W4410193901, https://openalex.org/W4311970277, https://openalex.org/W4291237139, https://openalex.org/W3162742210, https://openalex.org/W2888790405, https://openalex.org/W4310720906, https://openalex.org/W3023491306, https://openalex.org/W4382489462, https://openalex.org/W3092190487, https://openalex.org/W2587299955, https://openalex.org/W4383551941, https://openalex.org/W4295867992, https://openalex.org/W4220829848, https://openalex.org/W3129285739, https://openalex.org/W3092468857, https://openalex.org/W4389058056, https://openalex.org/W4321496488, https://openalex.org/W2970405962, https://openalex.org/W2905903577, https://openalex.org/W2146334809, https://openalex.org/W175750906, https://openalex.org/W4410708102, https://openalex.org/W4312187627, https://openalex.org/W4401537068, https://openalex.org/W2602034649, https://openalex.org/W2972691009, https://openalex.org/W2896596895, https://openalex.org/W4403729532, https://openalex.org/W4413081861, https://openalex.org/W4393141088, https://openalex.org/W4399374391, https://openalex.org/W4413278144, https://openalex.org/W4410225487 |
| referenced_works_count | 45 |
| abstract_inverted_index | (token-to-position map of the abstract, reproduced as plain text at the top of this page; see the reconstruction sketch after this table) |
| cited_by_percentile_year | |
| countries_distinct_count | 2 |
| institutions_distinct_count | 5 |
| citation_normalized_percentile |
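The `abstract_inverted_index` field collapsed in the table above stores the abstract as a map from each token to the word positions at which it occurs. The following is a minimal sketch of how such an index can be turned back into plain text; the toy input reuses a few entries from this record, with position lists truncated for illustration.

```python
# Minimal sketch: rebuild a plain-text abstract from an OpenAlex abstract_inverted_index,
# which maps each token to the list of word positions at which it occurs.

def reconstruct_abstract(inverted_index: dict) -> str:
    """Place every token at its recorded positions, then join them in order."""
    positions = {}
    for token, idxs in inverted_index.items():
        for idx in idxs:
            positions[idx] = token
    return " ".join(positions[i] for i in sorted(positions))

# Toy input taken from the first few entries of this record (position lists truncated):
sample = {"In": [0], "human-computer": [1], "interaction": [2], "(HCI),": [3], "speech": [4]}
print(reconstruct_abstract(sample))  # -> "In human-computer interaction (HCI), speech"
```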