Exploring automated machine learning to develop facial expression recognition systems
2025 · Open Access · DOI: https://doi.org/10.1007/s42452-025-07728-1
With human-computer interaction on the rise, used for everything from mundane tasks like shopping, making travel arrangements and ordering food to matters of national importance such as healthcare and security, facial expression recognition (FER) has become crucial to our lives. Traditional FER systems rely on manual tuning by experts to achieve optimised results, and they face challenges such as heavy resource demands and the need for expert-level knowledge of a plethora of neural network architectures, which prevents the technology from being widely adopted. This paper demonstrates a practical application of automated machine learning (AutoML), using the AutoGluon framework to automate model selection and hyperparameter tuning for FER, achieving competitive accuracy (76.4%) on the challenging FER2013 dataset. It also provides a comprehensive analysis of four model configurations (Default, Low Resource, Balanced, High Quality), enabling users to balance computational cost against performance. These models outperform several state-of-the-art manually tuned models (RMN, ViT, MLFCC, etc.) and highlight AutoML's potential to democratize FER development for non-experts. The paper reports precision, recall and F1-score alongside accuracy to provide a more holistic understanding of the models' strengths and limitations, and it discusses ethical considerations and future directions, addressing critical issues such as bias mitigation and real-world deployment challenges. This work confirms the potential of AutoML to revolutionize facial expression recognition, making advanced facial expression analysis more accessible and effective.
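The abstract describes four AutoGluon configurations trading compute for accuracy. As a hedged sketch only: the paper's exact settings are not reproduced on this page, so the mapping below from its configuration names to AutoGluon `MultiModalPredictor` preset strings, the column names `image`/`expression`, and the helper `train_fer` are all our assumptions (the preset strings themselves are real AutoGluon presets).

```python
# Assumed mapping from the paper's named configurations to AutoGluon presets.
# "Default" is taken to mean presets=None, i.e. the framework's own default.
PRESETS = {
    "Low Resource": "medium_quality",
    "Balanced": "high_quality",
    "High Quality": "best_quality",
}

def train_fer(train_df, config=None, time_limit_s=3600):
    """Fit an image classifier on a DataFrame with an 'image' column of file
    paths and an 'expression' label column. The AutoGluon import is deferred
    so the sketch can be read and tested without the heavy dependency."""
    from autogluon.multimodal import MultiModalPredictor
    predictor = MultiModalPredictor(label="expression",
                                    presets=PRESETS.get(config))
    predictor.fit(train_df, time_limit=time_limit_s)
    return predictor
```

A caller would prepare a FER2013 DataFrame and pick a configuration by budget, e.g. `train_fer(df, "Low Resource", time_limit_s=600)` for a quick run.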
- Type: article
- Language: en
- Landing Page: https://doi.org/10.1007/s42452-025-07728-1
- PDF: https://link.springer.com/content/pdf/10.1007/s42452-025-07728-1.pdf
- OA Status: diamond
- References: 40
- OpenAlex ID: https://openalex.org/W4414994688
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4414994688 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.1007/s42452-025-07728-1 (Digital Object Identifier)
- Title: Exploring automated machine learning to develop facial expression recognition systems (work title)
- Type: article (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2025 (year of publication)
- Publication date: 2025-10-09 (full publication date if available)
- Authors: Amit Kumar Goel, Praneet Saurabh, Dhananjay Bisen, Rishav Dubey, Prathap Somu (list of authors in order)
- Landing page: https://doi.org/10.1007/s42452-025-07728-1 (publisher landing page)
- PDF URL: https://link.springer.com/content/pdf/10.1007/s42452-025-07728-1.pdf (direct link to full-text PDF)
- Open access: Yes (whether a free full text is available)
- OA status: diamond (open access status per OpenAlex)
- OA URL: https://link.springer.com/content/pdf/10.1007/s42452-025-07728-1.pdf (direct OA link when available)
- Cited by: 0 (total citation count in OpenAlex)
- References (count): 40 (number of works referenced by this work)
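The fields above can be re-fetched from the OpenAlex REST API, which serves each work at `https://api.openalex.org/works/{id}`. A minimal stdlib-only sketch (the helper names are ours; the network call runs only under the main guard):

```python
import json
import urllib.request

OPENALEX_API = "https://api.openalex.org/works/"

def work_api_url(openalex_id: str) -> str:
    """Accept either a bare ID ('W4414994688') or the full
    'https://openalex.org/W...' form and return the API endpoint."""
    work = openalex_id.rsplit("/", 1)[-1]
    return OPENALEX_API + work

def fetch_work(openalex_id: str) -> dict:
    """Download the work record as a JSON dict."""
    with urllib.request.urlopen(work_api_url(openalex_id)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    record = fetch_work("https://openalex.org/W4414994688")
    print(record["display_name"])
```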
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4414994688 |
| doi | https://doi.org/10.1007/s42452-025-07728-1 |
| ids.doi | https://doi.org/10.1007/s42452-025-07728-1 |
| ids.openalex | https://openalex.org/W4414994688 |
| fwci | 0.0 |
| type | article |
| title | Exploring automated machine learning to develop facial expression recognition systems |
| biblio.issue | 10 |
| biblio.volume | 7 |
| biblio.last_page | |
| biblio.first_page | |
| topics[0].id | https://openalex.org/T10667 |
| topics[0].field.id | https://openalex.org/fields/32 |
| topics[0].field.display_name | Psychology |
| topics[0].score | 0.9991000294685364 |
| topics[0].domain.id | https://openalex.org/domains/2 |
| topics[0].domain.display_name | Social Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/3205 |
| topics[0].subfield.display_name | Experimental and Cognitive Psychology |
| topics[0].display_name | Emotion and Mood Recognition |
| topics[1].id | https://openalex.org/T10057 |
| topics[1].field.id | https://openalex.org/fields/17 |
| topics[1].field.display_name | Computer Science |
| topics[1].score | 0.9987999796867371 |
| topics[1].domain.id | https://openalex.org/domains/3 |
| topics[1].domain.display_name | Physical Sciences |
| topics[1].subfield.id | https://openalex.org/subfields/1707 |
| topics[1].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[1].display_name | Face and Expression Recognition |
| topics[2].id | https://openalex.org/T11448 |
| topics[2].field.id | https://openalex.org/fields/17 |
| topics[2].field.display_name | Computer Science |
| topics[2].score | 0.9976999759674072 |
| topics[2].domain.id | https://openalex.org/domains/3 |
| topics[2].domain.display_name | Physical Sciences |
| topics[2].subfield.id | https://openalex.org/subfields/1707 |
| topics[2].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[2].display_name | Face recognition and analysis |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| language | en |
| locations[0].id | doi:10.1007/s42452-025-07728-1 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S5407042868 |
| locations[0].source.issn | 3004-9261 |
| locations[0].source.type | journal |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | 3004-9261 |
| locations[0].source.is_core | False |
| locations[0].source.is_in_doaj | True |
| locations[0].source.display_name | Discover Applied Sciences |
| locations[0].source.host_organization | |
| locations[0].source.host_organization_name | |
| locations[0].license | cc-by-nc-nd |
| locations[0].pdf_url | https://link.springer.com/content/pdf/10.1007/s42452-025-07728-1.pdf |
| locations[0].version | publishedVersion |
| locations[0].raw_type | journal-article |
| locations[0].license_id | https://openalex.org/licenses/cc-by-nc-nd |
| locations[0].is_accepted | True |
| locations[0].is_published | True |
| locations[0].raw_source_name | Discover Applied Sciences |
| locations[0].landing_page_url | https://doi.org/10.1007/s42452-025-07728-1 |
| locations[1].id | pmh:oai:doaj.org/article:11a408f3d81447ac80779685037665fd |
| locations[1].is_oa | False |
| locations[1].source.id | https://openalex.org/S4306401280 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | False |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | DOAJ (DOAJ: Directory of Open Access Journals) |
| locations[1].source.host_organization | |
| locations[1].source.host_organization_name | |
| locations[1].license | |
| locations[1].pdf_url | |
| locations[1].version | submittedVersion |
| locations[1].raw_type | article |
| locations[1].license_id | |
| locations[1].is_accepted | False |
| locations[1].is_published | False |
| locations[1].raw_source_name | Discover Applied Sciences, Vol 7, Iss 10, Pp 1-23 (2025) |
| locations[1].landing_page_url | https://doaj.org/article/11a408f3d81447ac80779685037665fd |
| indexed_in | crossref, doaj |
| authorships[0].author.id | https://openalex.org/A5101561198 |
| authorships[0].author.orcid | https://orcid.org/0000-0003-4204-926X |
| authorships[0].author.display_name | Amit Kumar Goel |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Atharv Goel |
| authorships[0].is_corresponding | False |
| authorships[1].author.id | https://openalex.org/A5046363887 |
| authorships[1].author.orcid | https://orcid.org/0000-0002-3782-4279 |
| authorships[1].author.display_name | Praneet Saurabh |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Praneet Saurabh |
| authorships[1].is_corresponding | False |
| authorships[2].author.id | https://openalex.org/A5012816906 |
| authorships[2].author.orcid | https://orcid.org/0000-0003-4165-3959 |
| authorships[2].author.display_name | Dhananjay Bisen |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | Dhananjay Bisen |
| authorships[2].is_corresponding | False |
| authorships[3].author.id | https://openalex.org/A5031674330 |
| authorships[3].author.orcid | https://orcid.org/0000-0001-8324-3152 |
| authorships[3].author.display_name | Rishav Dubey |
| authorships[3].author_position | middle |
| authorships[3].raw_author_name | Rishav Dubey |
| authorships[3].is_corresponding | False |
| authorships[4].author.id | https://openalex.org/A5119920539 |
| authorships[4].author.orcid | |
| authorships[4].author.display_name | Prathap Somu |
| authorships[4].author_position | last |
| authorships[4].raw_author_name | Prathap Somu |
| authorships[4].is_corresponding | False |
| has_content.pdf | True |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://link.springer.com/content/pdf/10.1007/s42452-025-07728-1.pdf |
| open_access.oa_status | diamond |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | Exploring automated machine learning to develop facial expression recognition systems |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-06T03:46:38.306776 |
| primary_topic.id | https://openalex.org/T10667 |
| primary_topic.field.id | https://openalex.org/fields/32 |
| primary_topic.field.display_name | Psychology |
| primary_topic.score | 0.9991000294685364 |
| primary_topic.domain.id | https://openalex.org/domains/2 |
| primary_topic.domain.display_name | Social Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/3205 |
| primary_topic.subfield.display_name | Experimental and Cognitive Psychology |
| primary_topic.display_name | Emotion and Mood Recognition |
| cited_by_count | 0 |
| locations_count | 2 |
| best_oa_location.id | doi:10.1007/s42452-025-07728-1 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S5407042868 |
| best_oa_location.source.issn | 3004-9261 |
| best_oa_location.source.type | journal |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | 3004-9261 |
| best_oa_location.source.is_core | False |
| best_oa_location.source.is_in_doaj | True |
| best_oa_location.source.display_name | Discover Applied Sciences |
| best_oa_location.source.host_organization | |
| best_oa_location.source.host_organization_name | |
| best_oa_location.license | cc-by-nc-nd |
| best_oa_location.pdf_url | https://link.springer.com/content/pdf/10.1007/s42452-025-07728-1.pdf |
| best_oa_location.version | publishedVersion |
| best_oa_location.raw_type | journal-article |
| best_oa_location.license_id | https://openalex.org/licenses/cc-by-nc-nd |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | True |
| best_oa_location.raw_source_name | Discover Applied Sciences |
| best_oa_location.landing_page_url | https://doi.org/10.1007/s42452-025-07728-1 |
| primary_location.id | doi:10.1007/s42452-025-07728-1 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S5407042868 |
| primary_location.source.issn | 3004-9261 |
| primary_location.source.type | journal |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | 3004-9261 |
| primary_location.source.is_core | False |
| primary_location.source.is_in_doaj | True |
| primary_location.source.display_name | Discover Applied Sciences |
| primary_location.source.host_organization | |
| primary_location.source.host_organization_name | |
| primary_location.license | cc-by-nc-nd |
| primary_location.pdf_url | https://link.springer.com/content/pdf/10.1007/s42452-025-07728-1.pdf |
| primary_location.version | publishedVersion |
| primary_location.raw_type | journal-article |
| primary_location.license_id | https://openalex.org/licenses/cc-by-nc-nd |
| primary_location.is_accepted | True |
| primary_location.is_published | True |
| primary_location.raw_source_name | Discover Applied Sciences |
| primary_location.landing_page_url | https://doi.org/10.1007/s42452-025-07728-1 |
| publication_date | 2025-10-09 |
| publication_year | 2025 |
| referenced_works | https://openalex.org/W4313244619, https://openalex.org/W4404135415, https://openalex.org/W4405511989, https://openalex.org/W2742928281, https://openalex.org/W4220828392, https://openalex.org/W2342264685, https://openalex.org/W4390063968, https://openalex.org/W4389640736, https://openalex.org/W4388906717, https://openalex.org/W4407631600, https://openalex.org/W4394719929, https://openalex.org/W4386161521, https://openalex.org/W3161346624, https://openalex.org/W2618530766, https://openalex.org/W2963726609, https://openalex.org/W2506506742, https://openalex.org/W2982372777, https://openalex.org/W2962200824, https://openalex.org/W2914778005, https://openalex.org/W4382776104, https://openalex.org/W4287083813, https://openalex.org/W4400642556, https://openalex.org/W4319996317, https://openalex.org/W4388668091, https://openalex.org/W4387019970, https://openalex.org/W4391650123, https://openalex.org/W3008425820, https://openalex.org/W4408503249, https://openalex.org/W4405778930, https://openalex.org/W4205518925, https://openalex.org/W2799041689, https://openalex.org/W4392232628, https://openalex.org/W4385951249, https://openalex.org/W4236325002, https://openalex.org/W4295122555, https://openalex.org/W3128633047, https://openalex.org/W4384557825, https://openalex.org/W3094502228, https://openalex.org/W4317728343, https://openalex.org/W3124054989 |
| referenced_works_count | 40 |
| abstract_inverted_index | (word-to-position map omitted; the abstract is reproduced in full above) |
| cited_by_percentile_year | |
| countries_distinct_count | 0 |
| institutions_distinct_count | 5 |
| citation_normalized_percentile.value | 0.51543983 |
| citation_normalized_percentile.is_in_top_1_percent | False |
| citation_normalized_percentile.is_in_top_10_percent | True |
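For reference, OpenAlex delivers abstracts as an `abstract_inverted_index`: a map from each word to the list of token positions where it occurs, rather than plain text. Reconstructing the prose is a matter of inverting that map and sorting by position; a small self-contained sketch with a toy index (the example index is ours, not from this record):

```python
def reconstruct_abstract(inverted_index: dict[str, list[int]]) -> str:
    """Rebuild abstract text from an OpenAlex-style inverted index."""
    positions: dict[int, str] = {}
    for word, idxs in inverted_index.items():
        for i in idxs:
            positions[i] = word
    # Join words in ascending token-position order.
    return " ".join(positions[i] for i in sorted(positions))

toy = {"AutoML": [0, 3], "helps": [1], "non-experts;": [2], "wins.": [4]}
print(reconstruct_abstract(toy))  # -> "AutoML helps non-experts; AutoML wins."
```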