Personalized Federated Learning for Egocentric Video Gaze Estimation with Comprehensive Parameter Frezzing
2025 · Open Access
· DOI: https://doi.org/10.48550/arxiv.2502.18123
Egocentric video gaze estimation requires models to capture individual gaze patterns while adapting to diverse user data. Our approach, FedCPF, leverages a transformer-based architecture and integrates it into a personalized federated learning (PFL) framework in which only the most significant parameters, those exhibiting the highest rate of change during training, are selected and frozen for personalization in client models. Through extensive experiments on the EGTEA Gaze+ and Ego4D datasets, we demonstrate that FedCPF significantly outperforms previously reported federated learning methods, achieving superior recall, precision, and F1-score. These results confirm the effectiveness of our comprehensive parameter freezing strategy in enhancing model personalization, making FedCPF a promising approach for tasks requiring both adaptability and accuracy in federated learning settings.
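To make the parameter selection concrete, here is a minimal sketch of the freezing step described in the abstract, assuming a PyTorch client model. The selection ratio, the per-tensor change metric, and the function name are illustrative assumptions, not the authors' exact algorithm.

```python
# Minimal sketch (not the authors' code): rank parameters by how much they
# changed during local training and freeze the fastest-changing fraction so
# those tensors stay personalized on the client.
import torch


def freeze_fastest_changing(model, prev_state, freeze_ratio=0.1):
    """Freeze the `freeze_ratio` fraction of parameter tensors whose values
    changed the most relative to `prev_state` (a copy of model.state_dict()
    taken before the local training round)."""
    deltas = {
        name: (p.detach() - prev_state[name]).abs().mean().item()
        for name, p in model.named_parameters()
    }
    k = max(1, int(len(deltas) * freeze_ratio))
    frozen = sorted(deltas, key=deltas.get, reverse=True)[:k]
    for name, p in model.named_parameters():
        # Frozen tensors are excluded from further gradient updates.
        p.requires_grad = name not in frozen
    return frozen
```

In a PFL round, a client would call this after local training and keep the frozen tensors out of the global aggregation; the exact aggregation rule is left unspecified here.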
- Type: preprint
- Language: en
- Landing Page: http://arxiv.org/abs/2502.18123
- PDF: https://arxiv.org/pdf/2502.18123
- OA Status: green
- OpenAlex ID: https://openalex.org/W4415188183
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4415188183 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.48550/arxiv.2502.18123 (Digital Object Identifier)
- Title: Personalized Federated Learning for Egocentric Video Gaze Estimation with Comprehensive Parameter Frezzing (work title)
- Type: preprint (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2025 (year of publication)
- Publication date: 2025-02-25 (full publication date if available)
- Authors: Yuhu Feng, Keisuke Maeda, Takahiro Ogawa, Miki Haseyama (list of authors in order)
- Landing page: https://arxiv.org/abs/2502.18123 (publisher landing page)
- PDF URL: https://arxiv.org/pdf/2502.18123 (direct link to full text PDF)
- Open access: Yes (whether a free full text is available)
- OA status: green (open access status per OpenAlex)
- OA URL: https://arxiv.org/pdf/2502.18123 (direct OA link when available)
- Cited by: 0 (total citation count in OpenAlex)
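The record above can also be pulled programmatically from the OpenAlex API. Below is a minimal sketch using the public `/works/{id}` endpoint with the `requests` library; the `mailto` value is a placeholder, and the printed fields simply mirror the payload shown on this page.

```python
# Minimal sketch: fetch this work's record from the public OpenAlex API.
# The field names match the payload below; the mailto value is a placeholder.
import requests

OPENALEX_ID = "W4415188183"
url = f"https://api.openalex.org/works/{OPENALEX_ID}"

resp = requests.get(url, params={"mailto": "you@example.com"}, timeout=30)
resp.raise_for_status()
work = resp.json()

print(work["title"])
print(work["publication_date"])
print(work["open_access"]["oa_url"])
print([a["author"]["display_name"] for a in work["authorships"]])
```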
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4415188183 |
| doi | https://doi.org/10.48550/arxiv.2502.18123 |
| ids.doi | https://doi.org/10.48550/arxiv.2502.18123 |
| ids.openalex | https://openalex.org/W4415188183 |
| fwci | 0.0 |
| type | preprint |
| title | Personalized Federated Learning for Egocentric Video Gaze Estimation with Comprehensive Parameter Frezzing |
| biblio.issue | |
| biblio.volume | |
| biblio.last_page | |
| biblio.first_page | |
| topics[0].id | https://openalex.org/T13731 |
| topics[0].field.id | https://openalex.org/fields/33 |
| topics[0].field.display_name | Social Sciences |
| topics[0].score | 0.979200005531311 |
| topics[0].domain.id | https://openalex.org/domains/2 |
| topics[0].domain.display_name | Social Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/3322 |
| topics[0].subfield.display_name | Urban Studies |
| topics[0].display_name | Advanced Computing and Algorithms |
| topics[1].id | https://openalex.org/T11707 |
| topics[1].field.id | https://openalex.org/fields/17 |
| topics[1].field.display_name | Computer Science |
| topics[1].score | 0.970300018787384 |
| topics[1].domain.id | https://openalex.org/domains/3 |
| topics[1].domain.display_name | Physical Sciences |
| topics[1].subfield.id | https://openalex.org/subfields/1709 |
| topics[1].subfield.display_name | Human-Computer Interaction |
| topics[1].display_name | Gaze Tracking and Assistive Technology |
| topics[2].id | https://openalex.org/T12702 |
| topics[2].field.id | https://openalex.org/fields/28 |
| topics[2].field.display_name | Neuroscience |
| topics[2].score | 0.9603999853134155 |
| topics[2].domain.id | https://openalex.org/domains/1 |
| topics[2].domain.display_name | Life Sciences |
| topics[2].subfield.id | https://openalex.org/subfields/2808 |
| topics[2].subfield.display_name | Neurology |
| topics[2].display_name | Brain Tumor Detection and Classification |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| language | en |
| locations[0].id | pmh:oai:arXiv.org:2502.18123 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S4306400194 |
| locations[0].source.issn | |
| locations[0].source.type | repository |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | |
| locations[0].source.is_core | False |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | arXiv (Cornell University) |
| locations[0].source.host_organization | https://openalex.org/I205783295 |
| locations[0].source.host_organization_name | Cornell University |
| locations[0].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[0].license | |
| locations[0].pdf_url | https://arxiv.org/pdf/2502.18123 |
| locations[0].version | submittedVersion |
| locations[0].raw_type | text |
| locations[0].license_id | |
| locations[0].is_accepted | False |
| locations[0].is_published | False |
| locations[0].raw_source_name | |
| locations[0].landing_page_url | http://arxiv.org/abs/2502.18123 |
| locations[1].id | doi:10.48550/arxiv.2502.18123 |
| locations[1].is_oa | True |
| locations[1].source.id | https://openalex.org/S4306400194 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | True |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | arXiv (Cornell University) |
| locations[1].source.host_organization | https://openalex.org/I205783295 |
| locations[1].source.host_organization_name | Cornell University |
| locations[1].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[1].license | |
| locations[1].pdf_url | |
| locations[1].version | |
| locations[1].raw_type | article |
| locations[1].license_id | |
| locations[1].is_accepted | False |
| locations[1].is_published | |
| locations[1].raw_source_name | |
| locations[1].landing_page_url | https://doi.org/10.48550/arxiv.2502.18123 |
| indexed_in | arxiv, datacite |
| authorships[0].author.id | https://openalex.org/A5063121046 |
| authorships[0].author.orcid | https://orcid.org/0009-0006-4819-2066 |
| authorships[0].author.display_name | Yuhu Feng |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Feng, Yuhu |
| authorships[0].is_corresponding | False |
| authorships[1].author.id | https://openalex.org/A5033215072 |
| authorships[1].author.orcid | https://orcid.org/0000-0001-8039-3462 |
| authorships[1].author.display_name | Keisuke Maeda |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Maeda, Keisuke |
| authorships[1].is_corresponding | False |
| authorships[2].author.id | https://openalex.org/A5009032240 |
| authorships[2].author.orcid | https://orcid.org/0000-0001-5332-8112 |
| authorships[2].author.display_name | Takahiro Ogawa |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | Ogawa, Takahiro |
| authorships[2].is_corresponding | False |
| authorships[3].author.id | https://openalex.org/A5063903016 |
| authorships[3].author.orcid | https://orcid.org/0000-0003-1496-1761 |
| authorships[3].author.display_name | Miki Haseyama |
| authorships[3].author_position | last |
| authorships[3].raw_author_name | Haseyama, Miki |
| authorships[3].is_corresponding | False |
| has_content.pdf | False |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://arxiv.org/pdf/2502.18123 |
| open_access.oa_status | green |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-15T00:00:00 |
| display_name | Personalized Federated Learning for Egocentric Video Gaze Estimation with Comprehensive Parameter Frezzing |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-06T06:51:31.235846 |
| primary_topic.id | https://openalex.org/T13731 |
| primary_topic.field.id | https://openalex.org/fields/33 |
| primary_topic.field.display_name | Social Sciences |
| primary_topic.score | 0.979200005531311 |
| primary_topic.domain.id | https://openalex.org/domains/2 |
| primary_topic.domain.display_name | Social Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/3322 |
| primary_topic.subfield.display_name | Urban Studies |
| primary_topic.display_name | Advanced Computing and Algorithms |
| cited_by_count | 0 |
| locations_count | 2 |
| best_oa_location.id | pmh:oai:arXiv.org:2502.18123 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S4306400194 |
| best_oa_location.source.issn | |
| best_oa_location.source.type | repository |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | |
| best_oa_location.source.is_core | False |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | arXiv (Cornell University) |
| best_oa_location.source.host_organization | https://openalex.org/I205783295 |
| best_oa_location.source.host_organization_name | Cornell University |
| best_oa_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| best_oa_location.license | |
| best_oa_location.pdf_url | https://arxiv.org/pdf/2502.18123 |
| best_oa_location.version | submittedVersion |
| best_oa_location.raw_type | text |
| best_oa_location.license_id | |
| best_oa_location.is_accepted | False |
| best_oa_location.is_published | False |
| best_oa_location.raw_source_name | |
| best_oa_location.landing_page_url | http://arxiv.org/abs/2502.18123 |
| primary_location.id | pmh:oai:arXiv.org:2502.18123 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S4306400194 |
| primary_location.source.issn | |
| primary_location.source.type | repository |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | |
| primary_location.source.is_core | False |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | arXiv (Cornell University) |
| primary_location.source.host_organization | https://openalex.org/I205783295 |
| primary_location.source.host_organization_name | Cornell University |
| primary_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| primary_location.license | |
| primary_location.pdf_url | https://arxiv.org/pdf/2502.18123 |
| primary_location.version | submittedVersion |
| primary_location.raw_type | text |
| primary_location.license_id | |
| primary_location.is_accepted | False |
| primary_location.is_published | False |
| primary_location.raw_source_name | |
| primary_location.landing_page_url | http://arxiv.org/abs/2502.18123 |
| publication_date | 2025-02-25 |
| publication_year | 2025 |
| referenced_works_count | 0 |
| abstract_inverted_index | (inverted index of the abstract, mapping each word to its token positions; the readable abstract appears at the top of this page) |
| cited_by_percentile_year | |
| countries_distinct_count | 0 |
| institutions_distinct_count | 4 |
| citation_normalized_percentile |
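OpenAlex stores abstracts as the `abstract_inverted_index` field noted in the payload, a mapping from each word to the positions where it occurs. The sketch below shows one way to flatten that mapping back into plain text; the truncated sample dictionary is taken from this work's index and is for illustration only.

```python
# Minimal sketch: rebuild a plain-text abstract from OpenAlex's
# abstract_inverted_index, which maps each word to its token positions.
def reconstruct_abstract(inverted_index):
    positions = [
        (pos, word)
        for word, pos_list in inverted_index.items()
        for pos in pos_list
    ]
    # Sort by position and join the words back into a single string.
    return " ".join(word for _, word in sorted(positions))


# Truncated sample from this work's index (positions 4-8 omitted here):
sample = {"Egocentric": [0], "video": [1], "gaze": [2, 9], "estimation": [3]}
print(reconstruct_abstract(sample))
# -> "Egocentric video gaze estimation gaze"
```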