Research on Gait Recognition Based on Deep Transfer Features
· 2024
· Open Access
· DOI: https://doi.org/10.54097/aacrjm91
Gait recognition identifies pedestrians by differences in their walking postures and has attracted increasing attention from researchers in recent years. Existing gait recognition methods still suffer from low recognition rates and are difficult to deploy in practice. This paper proposes a gait recognition method based on deep transfer features from DenseNet201, aiming to improve the recognition rate and accelerate the practical application of gait recognition. First, a human body region segmentation method is designed to separate the arm region and the leg region from the whole pedestrian gait image. Then, a pre-trained DenseNet201 model, after parameter fine-tuning, is used to extract deep transfer features from the three segmented regions (whole body, arm, and leg), and the features of each region are fused by sum-averaging. Finally, a discriminant analysis classifier is used to classify the fused deep transfer features. Experiments on the CASIA-B gait database, collected by the Institute of Automation of the Chinese Academy of Sciences, show that segmenting the arm and leg regions clearly improves the gait recognition rate and that the features extracted from pedestrian gait images with the deep transfer network have good representational power.
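As a rough illustration of the pipeline described in the abstract (region segmentation, DenseNet201 transfer features, sum-averaging fusion, discriminant-analysis classification), the Python sketch below extracts DenseNet201 features from a whole-body silhouette and from arm and leg crops, averages each region's features over a gait sequence, and classifies with linear discriminant analysis. The crop bands, tensor format, and feature layer are illustrative assumptions, not details reported in the paper.

```python
# A minimal sketch of the described pipeline, NOT the authors' code.
# Assumptions: silhouettes are 3xHxW float tensors (grayscale replicated
# to 3 channels), the arm/leg crop bands are illustrative guesses, and
# features come from the last DenseNet201 convolutional block.
import numpy as np
import torch
from torchvision.models import densenet201, DenseNet201_Weights
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

weights = DenseNet201_Weights.IMAGENET1K_V1
backbone = densenet201(weights=weights).eval()   # pre-trained backbone
preprocess = weights.transforms()                # resize + normalize

def region_crops(sil):
    """Split a silhouette tensor into whole-body, arm, and leg regions."""
    h = sil.shape[-2]
    arms = sil[..., int(0.40 * h):int(0.60 * h), :]   # assumed arm band
    legs = sil[..., int(0.55 * h):, :]                # assumed leg band
    return sil, arms, legs

@torch.no_grad()
def deep_feature(img):
    """Global-average-pooled DenseNet201 feature (1920-dim vector)."""
    fmap = backbone.features(preprocess(img).unsqueeze(0))
    return torch.nn.functional.adaptive_avg_pool2d(fmap, 1).flatten().numpy()

def sequence_descriptor(frames):
    """Sum-average fusion: average each region's features over the sequence,
    then concatenate the three region descriptors."""
    whole, arms, legs = zip(*(region_crops(f) for f in frames))
    fused = [np.mean([deep_feature(r) for r in region], axis=0)
             for region in (whole, arms, legs)]
    return np.concatenate(fused)

# X_train/X_test: one descriptor per gait sequence, y_*: subject IDs
# clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
# print("rank-1 accuracy:", clf.score(X_test, y_test))
```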
Related Topics
- Type: article
- Language: en
- Landing Page: https://doi.org/10.54097/aacrjm91
- Full-text PDF: https://drpress.org/ojs/index.php/ajst/article/download/27334/26877
- OA Status: diamond
- Related Works: 10
- OpenAlex ID: https://openalex.org/W4405040723
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4405040723 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.54097/aacrjm91 (Digital Object Identifier)
- Title: Research on Gait Recognition Based on Deep Transfer Features (work title)
- Type: article (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2024 (year of publication)
- Publication date: 2024-11-29 (full publication date if available)
- Authors: Xue‐Dong Yu, Lily Peng, Y. Y. Ji, Ziheng Jiang (list of authors in order)
- Landing page: https://doi.org/10.54097/aacrjm91 (publisher landing page)
- PDF URL: https://drpress.org/ojs/index.php/ajst/article/download/27334/26877 (direct link to full-text PDF)
- Open access: Yes (whether a free full text is available)
- OA status: diamond (open access status per OpenAlex)
- OA URL: https://drpress.org/ojs/index.php/ajst/article/download/27334/26877 (direct OA link when available)
- Concepts: Gait, Physical medicine and rehabilitation, Artificial intelligence, Computer science, Medicine (top concepts/fields attached by OpenAlex)
- Cited by: 0 (total citation count in OpenAlex)
- Related works (count): 10 (other works algorithmically related by OpenAlex)
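The record summarized above (and flattened in the payload table below) can be retrieved directly from the OpenAlex API using the work ID from this page. A minimal sketch, assuming the `requests` package is available:

```python
# Fetch this work's raw JSON record from the OpenAlex API.
import requests

WORK_ID = "W4405040723"  # OpenAlex ID from the record above
resp = requests.get(f"https://api.openalex.org/works/{WORK_ID}", timeout=30)
resp.raise_for_status()
work = resp.json()

print(work["title"])                     # "Research on Gait Recognition Based on Deep Transfer Features"
print(work["open_access"]["oa_status"])  # "diamond"
print(work["cited_by_count"])            # 0 at the time this record was captured
```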
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4405040723 |
| doi | https://doi.org/10.54097/aacrjm91 |
| ids.doi | https://doi.org/10.54097/aacrjm91 |
| ids.openalex | https://openalex.org/W4405040723 |
| fwci | 0.0 |
| type | article |
| title | Research on Gait Recognition Based on Deep Transfer Features |
| biblio.issue | 2 |
| biblio.volume | 13 |
| biblio.last_page | 145 |
| biblio.first_page | 139 |
| topics[0].id | https://openalex.org/T12740 |
| topics[0].field.id | https://openalex.org/fields/22 |
| topics[0].field.display_name | Engineering |
| topics[0].score | 0.9980000257492065 |
| topics[0].domain.id | https://openalex.org/domains/3 |
| topics[0].domain.display_name | Physical Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/2204 |
| topics[0].subfield.display_name | Biomedical Engineering |
| topics[0].display_name | Gait Recognition and Analysis |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| concepts[0].id | https://openalex.org/C151800584 |
| concepts[0].level | 2 |
| concepts[0].score | 0.7414003610610962 |
| concepts[0].wikidata | https://www.wikidata.org/wiki/Q2370000 |
| concepts[0].display_name | Gait |
| concepts[1].id | https://openalex.org/C99508421 |
| concepts[1].level | 1 |
| concepts[1].score | 0.5346134901046753 |
| concepts[1].wikidata | https://www.wikidata.org/wiki/Q2678675 |
| concepts[1].display_name | Physical medicine and rehabilitation |
| concepts[2].id | https://openalex.org/C154945302 |
| concepts[2].level | 1 |
| concepts[2].score | 0.46509233117103577 |
| concepts[2].wikidata | https://www.wikidata.org/wiki/Q11660 |
| concepts[2].display_name | Artificial intelligence |
| concepts[3].id | https://openalex.org/C41008148 |
| concepts[3].level | 0 |
| concepts[3].score | 0.46383437514305115 |
| concepts[3].wikidata | https://www.wikidata.org/wiki/Q21198 |
| concepts[3].display_name | Computer science |
| concepts[4].id | https://openalex.org/C71924100 |
| concepts[4].level | 0 |
| concepts[4].score | 0.2565276026725769 |
| concepts[4].wikidata | https://www.wikidata.org/wiki/Q11190 |
| concepts[4].display_name | Medicine |
| keywords[0].id | https://openalex.org/keywords/gait |
| keywords[0].score | 0.7414003610610962 |
| keywords[0].display_name | Gait |
| keywords[1].id | https://openalex.org/keywords/physical-medicine-and-rehabilitation |
| keywords[1].score | 0.5346134901046753 |
| keywords[1].display_name | Physical medicine and rehabilitation |
| keywords[2].id | https://openalex.org/keywords/artificial-intelligence |
| keywords[2].score | 0.46509233117103577 |
| keywords[2].display_name | Artificial intelligence |
| keywords[3].id | https://openalex.org/keywords/computer-science |
| keywords[3].score | 0.46383437514305115 |
| keywords[3].display_name | Computer science |
| keywords[4].id | https://openalex.org/keywords/medicine |
| keywords[4].score | 0.2565276026725769 |
| keywords[4].display_name | Medicine |
| language | en |
| locations[0].id | doi:10.54097/aacrjm91 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S4220651530 |
| locations[0].source.issn | 2771-3032 |
| locations[0].source.type | journal |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | 2771-3032 |
| locations[0].source.is_core | True |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | Academic Journal of Science and Technology |
| locations[0].source.host_organization | |
| locations[0].source.host_organization_name | |
| locations[0].license | cc-by |
| locations[0].pdf_url | https://drpress.org/ojs/index.php/ajst/article/download/27334/26877 |
| locations[0].version | publishedVersion |
| locations[0].raw_type | journal-article |
| locations[0].license_id | https://openalex.org/licenses/cc-by |
| locations[0].is_accepted | True |
| locations[0].is_published | True |
| locations[0].raw_source_name | Academic Journal of Science and Technology |
| locations[0].landing_page_url | https://doi.org/10.54097/aacrjm91 |
| indexed_in | crossref |
| authorships[0].author.id | https://openalex.org/A5008990908 |
| authorships[0].author.orcid | https://orcid.org/0009-0006-6179-634X |
| authorships[0].author.display_name | Xue‐Dong Yu |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Xuedong Yu |
| authorships[0].is_corresponding | False |
| authorships[1].author.id | https://openalex.org/A5108117554 |
| authorships[1].author.orcid | |
| authorships[1].author.display_name | Lily Peng |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Linxing Peng |
| authorships[1].is_corresponding | False |
| authorships[2].author.id | https://openalex.org/A5069542352 |
| authorships[2].author.orcid | https://orcid.org/0009-0003-2123-6311 |
| authorships[2].author.display_name | Y. Y. Ji |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | Yangde Ji |
| authorships[2].is_corresponding | False |
| authorships[3].author.id | https://openalex.org/A5066345089 |
| authorships[3].author.orcid | |
| authorships[3].author.display_name | Ziheng Jiang |
| authorships[3].author_position | last |
| authorships[3].raw_author_name | Ziheng Jiang |
| authorships[3].is_corresponding | False |
| has_content.pdf | True |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://drpress.org/ojs/index.php/ajst/article/download/27334/26877 |
| open_access.oa_status | diamond |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | Research on Gait Recognition Based on Deep Transfer Features |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-06T03:46:38.306776 |
| primary_topic.id | https://openalex.org/T12740 |
| primary_topic.field.id | https://openalex.org/fields/22 |
| primary_topic.field.display_name | Engineering |
| primary_topic.score | 0.9980000257492065 |
| primary_topic.domain.id | https://openalex.org/domains/3 |
| primary_topic.domain.display_name | Physical Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/2204 |
| primary_topic.subfield.display_name | Biomedical Engineering |
| primary_topic.display_name | Gait Recognition and Analysis |
| related_works | https://openalex.org/W4391375266, https://openalex.org/W2899084033, https://openalex.org/W2748952813, https://openalex.org/W2390279801, https://openalex.org/W4391913857, https://openalex.org/W1989734657, https://openalex.org/W4226004263, https://openalex.org/W4210601529, https://openalex.org/W2528228280, https://openalex.org/W2757733761 |
| cited_by_count | 0 |
| locations_count | 1 |
| best_oa_location.id | doi:10.54097/aacrjm91 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S4220651530 |
| best_oa_location.source.issn | 2771-3032 |
| best_oa_location.source.type | journal |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | 2771-3032 |
| best_oa_location.source.is_core | True |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | Academic Journal of Science and Technology |
| best_oa_location.source.host_organization | |
| best_oa_location.source.host_organization_name | |
| best_oa_location.license | cc-by |
| best_oa_location.pdf_url | https://drpress.org/ojs/index.php/ajst/article/download/27334/26877 |
| best_oa_location.version | publishedVersion |
| best_oa_location.raw_type | journal-article |
| best_oa_location.license_id | https://openalex.org/licenses/cc-by |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | True |
| best_oa_location.raw_source_name | Academic Journal of Science and Technology |
| best_oa_location.landing_page_url | https://doi.org/10.54097/aacrjm91 |
| primary_location.id | doi:10.54097/aacrjm91 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S4220651530 |
| primary_location.source.issn | 2771-3032 |
| primary_location.source.type | journal |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | 2771-3032 |
| primary_location.source.is_core | True |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | Academic Journal of Science and Technology |
| primary_location.source.host_organization | |
| primary_location.source.host_organization_name | |
| primary_location.license | cc-by |
| primary_location.pdf_url | https://drpress.org/ojs/index.php/ajst/article/download/27334/26877 |
| primary_location.version | publishedVersion |
| primary_location.raw_type | journal-article |
| primary_location.license_id | https://openalex.org/licenses/cc-by |
| primary_location.is_accepted | True |
| primary_location.is_published | True |
| primary_location.raw_source_name | Academic Journal of Science and Technology |
| primary_location.landing_page_url | https://doi.org/10.54097/aacrjm91 |
| publication_date | 2024-11-29 |
| publication_year | 2024 |
| referenced_works_count | 0 |
| abstract_inverted_index | word-to-position map of the abstract (decodes to the abstract quoted above; see the reconstruction sketch after this table) |
| cited_by_percentile_year | |
| countries_distinct_count | 0 |
| institutions_distinct_count | 4 |
| sustainable_development_goals[0].id | https://metadata.un.org/sdg/10 |
| sustainable_development_goals[0].score | 0.75 |
| sustainable_development_goals[0].display_name | Reduced inequalities |
| citation_normalized_percentile.value | 0.2586772 |
| citation_normalized_percentile.is_in_top_1_percent | False |
| citation_normalized_percentile.is_in_top_10_percent | False |
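For reference, OpenAlex stores abstracts as an inverted index (each word mapped to the positions where it occurs), as in the `abstract_inverted_index` field above. A minimal sketch of turning that map back into plain text; the variable names are illustrative, and the sample dictionary uses the first few entries of this record's index:

```python
# Rebuild a plain-text abstract from an OpenAlex abstract_inverted_index.
def rebuild_abstract(inverted: dict[str, list[int]]) -> str:
    positions = {}
    for word, idxs in inverted.items():
        for i in idxs:
            positions[i] = word          # position -> word
    return " ".join(positions[i] for i in sorted(positions))

# Example with the first few entries of the index above:
sample = {"Gait": [0], "recognition": [1], "is": [2], "to": [3], "determine": [4]}
print(rebuild_abstract(sample))  # -> "Gait recognition is to determine"
```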