Hands-Free Cursor Control Through Eyeball using Python
- 2025
- Open Access
- DOI: https://doi.org/10.36948/ijfmr.2025.v07i04.52038
This study proposes a Human-Computer Interaction (HCI) technique that allows a computer to be controlled with eye movements, with a focus on users with physical limitations. People with limited mobility are often unable to operate traditional input devices such as a mouse or keyboard. To address this, we built a vision-based system in Python that uses OpenCV and the MediaPipe library to detect and track pupil movement in real time. A camera captures the eyes, the pupil's center is located in each frame, and its coordinates are mapped to screen coordinates so that the user can move the mouse pointer across the screen. Eye blinks are translated into click events via PyAutoGUI, removing the need for any physical contact. The result is an eye-controlled mouse designed to help physically challenged persons interact with computers: a low-cost, accessible control method that promotes inclusiveness for users with motor disabilities. The system shows good accuracy and responsiveness, with potential applications in assistive technology, gaming, and touchless interfaces.
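The two core steps the abstract describes — mapping a normalized pupil-center coordinate to screen pixels, and treating a blink as a click — can be sketched as pure functions. This is a minimal illustration, not the authors' implementation: in the full system the normalized landmark coordinates would come from MediaPipe Face Mesh and the cursor/click actions from PyAutoGUI; the function names and the blink threshold here are illustrative assumptions.

```python
def map_gaze_to_screen(nx: float, ny: float,
                       screen_w: int, screen_h: int) -> tuple[int, int]:
    """Map a normalized pupil-center coordinate (0..1 in each axis) to
    screen pixels, clamping so the cursor never leaves the screen."""
    x = min(max(nx, 0.0), 1.0) * (screen_w - 1)
    y = min(max(ny, 0.0), 1.0) * (screen_h - 1)
    return round(x), round(y)


def is_blink(upper_lid_y: float, lower_lid_y: float,
             closed_thresh: float = 0.004) -> bool:
    """Treat the eye as closed when the vertical gap between the upper and
    lower eyelid landmarks drops below a small threshold (assumed value)."""
    return abs(lower_lid_y - upper_lid_y) < closed_thresh
```

In a live loop, each webcam frame would be passed through the face mesh, the pointer moved with something like `pyautogui.moveTo(*map_gaze_to_screen(nx, ny, w, h))`, and a click issued only when `is_blink(...)` holds for several consecutive frames, to debounce normal involuntary blinks.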
Work Metadata
- Type: article
- Language: en
- Landing Page: https://doi.org/10.36948/ijfmr.2025.v07i04.52038
- PDF: https://www.ijfmr.com/papers/2025/4/52038.pdf
- OA Status: hybrid
- Related Works: 10
- OpenAlex ID: https://openalex.org/W4412743258
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4412743258 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.36948/ijfmr.2025.v07i04.52038 (Digital Object Identifier)
- Title: Hands-Free Cursor Control Through Eyeball using Python (work title)
- Type: article (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2025 (year of publication)
- Publication date: 2025-07-25 (full publication date if available)
- Authors: Ashish Chandra, Pinnamraju T S Priya - (list of authors in order)
- Landing page: https://doi.org/10.36948/ijfmr.2025.v07i04.52038 (publisher landing page)
- PDF URL: https://www.ijfmr.com/papers/2025/4/52038.pdf (direct link to full text PDF)
- Open access: Yes (whether a free full text is available)
- OA status: hybrid (open access status per OpenAlex)
- OA URL: https://www.ijfmr.com/papers/2025/4/52038.pdf (direct OA link when available)
- Concepts: Cursor (databases), Python (programming language), Computer science, Computer graphics (images), Programming language, Artificial intelligence (top concepts attached by OpenAlex)
- Cited by: 0 (total citation count in OpenAlex)
- Related works (count): 10 (other works algorithmically related by OpenAlex)
Full payload
| id | https://openalex.org/W4412743258 |
|---|---|
| doi | https://doi.org/10.36948/ijfmr.2025.v07i04.52038 |
| ids.doi | https://doi.org/10.36948/ijfmr.2025.v07i04.52038 |
| ids.openalex | https://openalex.org/W4412743258 |
| fwci | 0.0 |
| type | article |
| title | Hands-Free Cursor Control Through Eyeball using Python |
| biblio.issue | 4 |
| biblio.volume | 7 |
| biblio.last_page | |
| biblio.first_page | |
| topics[0].id | https://openalex.org/T11707 |
| topics[0].field.id | https://openalex.org/fields/17 |
| topics[0].field.display_name | Computer Science |
| topics[0].score | 0.9923999905586243 |
| topics[0].domain.id | https://openalex.org/domains/3 |
| topics[0].domain.display_name | Physical Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/1709 |
| topics[0].subfield.display_name | Human-Computer Interaction |
| topics[0].display_name | Gaze Tracking and Assistive Technology |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| concepts[0].id | https://openalex.org/C2776990265 |
| concepts[0].level | 2 |
| concepts[0].score | 0.8824148178100586 |
| concepts[0].wikidata | https://www.wikidata.org/wiki/Q2998101 |
| concepts[0].display_name | Cursor (databases) |
| concepts[1].id | https://openalex.org/C519991488 |
| concepts[1].level | 2 |
| concepts[1].score | 0.690418004989624 |
| concepts[1].wikidata | https://www.wikidata.org/wiki/Q28865 |
| concepts[1].display_name | Python (programming language) |
| concepts[2].id | https://openalex.org/C41008148 |
| concepts[2].level | 0 |
| concepts[2].score | 0.6531381607055664 |
| concepts[2].wikidata | https://www.wikidata.org/wiki/Q21198 |
| concepts[2].display_name | Computer science |
| concepts[3].id | https://openalex.org/C121684516 |
| concepts[3].level | 1 |
| concepts[3].score | 0.48906952142715454 |
| concepts[3].wikidata | https://www.wikidata.org/wiki/Q7600677 |
| concepts[3].display_name | Computer graphics (images) |
| concepts[4].id | https://openalex.org/C199360897 |
| concepts[4].level | 1 |
| concepts[4].score | 0.3105581998825073 |
| concepts[4].wikidata | https://www.wikidata.org/wiki/Q9143 |
| concepts[4].display_name | Programming language |
| concepts[5].id | https://openalex.org/C154945302 |
| concepts[5].level | 1 |
| concepts[5].score | 0.2365035116672516 |
| concepts[5].wikidata | https://www.wikidata.org/wiki/Q11660 |
| concepts[5].display_name | Artificial intelligence |
| keywords[0].id | https://openalex.org/keywords/cursor |
| keywords[0].score | 0.8824148178100586 |
| keywords[0].display_name | Cursor (databases) |
| keywords[1].id | https://openalex.org/keywords/python |
| keywords[1].score | 0.690418004989624 |
| keywords[1].display_name | Python (programming language) |
| keywords[2].id | https://openalex.org/keywords/computer-science |
| keywords[2].score | 0.6531381607055664 |
| keywords[2].display_name | Computer science |
| keywords[3].id | https://openalex.org/keywords/computer-graphics |
| keywords[3].score | 0.48906952142715454 |
| keywords[3].display_name | Computer graphics (images) |
| keywords[4].id | https://openalex.org/keywords/programming-language |
| keywords[4].score | 0.3105581998825073 |
| keywords[4].display_name | Programming language |
| keywords[5].id | https://openalex.org/keywords/artificial-intelligence |
| keywords[5].score | 0.2365035116672516 |
| keywords[5].display_name | Artificial intelligence |
| language | en |
| locations[0].id | doi:10.36948/ijfmr.2025.v07i04.52038 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S4210207214 |
| locations[0].source.issn | 2582-2160 |
| locations[0].source.type | journal |
| locations[0].source.is_oa | False |
| locations[0].source.issn_l | 2582-2160 |
| locations[0].source.is_core | False |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | International Journal For Multidisciplinary Research |
| locations[0].source.host_organization | |
| locations[0].source.host_organization_name | |
| locations[0].license | cc-by-sa |
| locations[0].pdf_url | https://www.ijfmr.com/papers/2025/4/52038.pdf |
| locations[0].version | publishedVersion |
| locations[0].raw_type | journal-article |
| locations[0].license_id | https://openalex.org/licenses/cc-by-sa |
| locations[0].is_accepted | True |
| locations[0].is_published | True |
| locations[0].raw_source_name | International Journal For Multidisciplinary Research |
| locations[0].landing_page_url | https://doi.org/10.36948/ijfmr.2025.v07i04.52038 |
| indexed_in | crossref |
| authorships[0].author.id | https://openalex.org/A5063801494 |
| authorships[0].author.orcid | https://orcid.org/0000-0003-0339-0086 |
| authorships[0].author.display_name | Ashish Chandra |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Anapu Chandra |
| authorships[0].is_corresponding | False |
| authorships[1].author.id | https://openalex.org/A5119111716 |
| authorships[1].author.orcid | |
| authorships[1].author.display_name | Pinnamraju T S Priya - |
| authorships[1].author_position | last |
| authorships[1].raw_author_name | Pinnamraju T S Priya - |
| authorships[1].is_corresponding | False |
| has_content.pdf | True |
| has_content.grobid_xml | True |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://www.ijfmr.com/papers/2025/4/52038.pdf |
| open_access.oa_status | hybrid |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | Hands-Free Cursor Control Through Eyeball using Python |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-06T03:46:38.306776 |
| primary_topic.id | https://openalex.org/T11707 |
| primary_topic.field.id | https://openalex.org/fields/17 |
| primary_topic.field.display_name | Computer Science |
| primary_topic.score | 0.9923999905586243 |
| primary_topic.domain.id | https://openalex.org/domains/3 |
| primary_topic.domain.display_name | Physical Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/1709 |
| primary_topic.subfield.display_name | Human-Computer Interaction |
| primary_topic.display_name | Gaze Tracking and Assistive Technology |
| related_works | https://openalex.org/W4391375266, https://openalex.org/W2899084033, https://openalex.org/W2748952813, https://openalex.org/W2341492732, https://openalex.org/W3187193180, https://openalex.org/W106542691, https://openalex.org/W1699080303, https://openalex.org/W2482827754, https://openalex.org/W2207495067, https://openalex.org/W1906486629 |
| cited_by_count | 0 |
| locations_count | 1 |
| best_oa_location.id | doi:10.36948/ijfmr.2025.v07i04.52038 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S4210207214 |
| best_oa_location.source.issn | 2582-2160 |
| best_oa_location.source.type | journal |
| best_oa_location.source.is_oa | False |
| best_oa_location.source.issn_l | 2582-2160 |
| best_oa_location.source.is_core | False |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | International Journal For Multidisciplinary Research |
| best_oa_location.source.host_organization | |
| best_oa_location.source.host_organization_name | |
| best_oa_location.license | cc-by-sa |
| best_oa_location.pdf_url | https://www.ijfmr.com/papers/2025/4/52038.pdf |
| best_oa_location.version | publishedVersion |
| best_oa_location.raw_type | journal-article |
| best_oa_location.license_id | https://openalex.org/licenses/cc-by-sa |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | True |
| best_oa_location.raw_source_name | International Journal For Multidisciplinary Research |
| best_oa_location.landing_page_url | https://doi.org/10.36948/ijfmr.2025.v07i04.52038 |
| primary_location.id | doi:10.36948/ijfmr.2025.v07i04.52038 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S4210207214 |
| primary_location.source.issn | 2582-2160 |
| primary_location.source.type | journal |
| primary_location.source.is_oa | False |
| primary_location.source.issn_l | 2582-2160 |
| primary_location.source.is_core | False |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | International Journal For Multidisciplinary Research |
| primary_location.source.host_organization | |
| primary_location.source.host_organization_name | |
| primary_location.license | cc-by-sa |
| primary_location.pdf_url | https://www.ijfmr.com/papers/2025/4/52038.pdf |
| primary_location.version | publishedVersion |
| primary_location.raw_type | journal-article |
| primary_location.license_id | https://openalex.org/licenses/cc-by-sa |
| primary_location.is_accepted | True |
| primary_location.is_published | True |
| primary_location.raw_source_name | International Journal For Multidisciplinary Research |
| primary_location.landing_page_url | https://doi.org/10.36948/ijfmr.2025.v07i04.52038 |
| publication_date | 2025-07-25 |
| publication_year | 2025 |
| referenced_works_count | 0 |
| abstract_inverted_index | inverted word-position index of the abstract (the abstract is reproduced in full above) |
| cited_by_percentile_year | |
| countries_distinct_count | 0 |
| institutions_distinct_count | 2 |
| citation_normalized_percentile.value | 0.34681723 |
| citation_normalized_percentile.is_in_top_1_percent | False |
| citation_normalized_percentile.is_in_top_10_percent | True |