Advancing UAV Multi-Object Tracking: Integrating YOLOv8, Nano Instance Segmentation, and Dueling Double Deep Q-Network
2024 · Open Access · DOI: https://doi.org/10.21203/rs.3.rs-4854100/v1
Unmanned Aerial Vehicles (UAVs) have become indispensable for navigating complex terrains, accessing remote or hazardous locations, and capturing high-resolution imagery. This paper presents an innovative approach to object detection tailored for computer vision applications in UAVs. Traditional deep learning models such as R-CNN, Fast R-CNN, and YOLO often face challenges in detecting occluded, blurred, or clustered objects, and they struggle to identify and track multiple objects simultaneously. To overcome these challenges, we propose a framework that integrates YOLOv8x, Nano Instance Segmentation (NIS), and a Dueling Double Deep Q-Network (DDDQN). YOLOv8x delivers strong detection performance, achieving an Average Precision (AP) of 53.9% on the demanding MSCOCO dataset and outperforming previous versions. The DDDQN algorithm significantly enhances tracking by estimating state values and state-dependent action advantages independently. The combination of YOLOv8x and DDDQN enables effective handling of obstacles, varying object sizes, and unpredictable movements. We simulated the proposed framework on the UAVDT and VisDrone datasets and compared its performance against approximately nine contemporary frameworks from the recent literature. The results demonstrate that our framework significantly improves object tracking in densely populated environments, offering a robust solution for real-world applications requiring precise and resilient object detection.
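The dueling architecture referenced in the abstract produces a scalar state value V(s) and per-action advantages A(s, a) from a shared trunk and recombines them into Q-values, while the "double" part decouples action selection (online network) from action evaluation (target network). The snippet below is a minimal PyTorch sketch of both ideas under assumed layer sizes and hyperparameters; it illustrates the general DDDQN mechanics, not the authors' implementation.

```python
import torch
import torch.nn as nn

class DuelingQNet(nn.Module):
    """Minimal dueling Q-network: shared trunk, separate value and advantage heads."""
    def __init__(self, state_dim: int, n_actions: int, hidden: int = 128):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.value = nn.Linear(hidden, 1)               # V(s)
        self.advantage = nn.Linear(hidden, n_actions)   # A(s, a)

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        h = self.trunk(state)
        v = self.value(h)      # shape (batch, 1)
        a = self.advantage(h)  # shape (batch, n_actions)
        # Q(s, a) = V(s) + A(s, a) - mean_a A(s, a): the standard dueling recombination
        return v + a - a.mean(dim=1, keepdim=True)

def double_dqn_target(online: DuelingQNet, target: DuelingQNet,
                      reward: torch.Tensor, next_state: torch.Tensor,
                      done: torch.Tensor, gamma: float = 0.99) -> torch.Tensor:
    """Double-DQN target: the online net selects the action, the target net evaluates it."""
    with torch.no_grad():
        best_action = online(next_state).argmax(dim=1, keepdim=True)
        next_q = target(next_state).gather(1, best_action).squeeze(1)
        return reward + gamma * (1.0 - done) * next_q
```

Subtracting the mean advantage keeps the value/advantage decomposition identifiable, and evaluating the online network's chosen action with the target network reduces the Q-value overestimation that plain DQN suffers from.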
- Type: preprint
- Language: en
- Landing Page: https://doi.org/10.21203/rs.3.rs-4854100/v1
- PDF: https://www.researchsquare.com/article/rs-4854100/latest.pdf
- OA Status: gold
- Cited By: 1
- References: 27
- Related Works: 10
- OpenAlex ID: https://openalex.org/W4401810344
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4401810344 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.21203/rs.3.rs-4854100/v1 (Digital Object Identifier)
- Title: Advancing UAV Multi-Object Tracking: Integrating YOLOv8, Nano Instance Segmentation, and Dueling Double Deep Q-Network
- Type: preprint (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2024
- Publication date: 2024-08-23
- Authors: R. Kiruthiga, B. Nithya, Martin Prabhu S (in order)
- Landing page: https://doi.org/10.21203/rs.3.rs-4854100/v1 (publisher landing page)
- PDF URL: https://www.researchsquare.com/article/rs-4854100/latest.pdf (direct link to the full-text PDF)
- Open access: Yes (free full text available)
- OA status: gold (open access status per OpenAlex)
- OA URL: https://www.researchsquare.com/article/rs-4854100/latest.pdf (direct OA link)
- Concepts: Artificial intelligence, Segmentation, Computer science, Tracking (education), Object (grammar), Nano-, Deep learning, Computer vision, Engineering, Psychology, Chemical engineering, Pedagogy (top concepts attached by OpenAlex)
- Cited by: 1 (total citation count in OpenAlex)
- Citations by year (recent): 2025: 1
- References (count): 27 (works referenced by this work)
- Related works (count): 10 (works algorithmically related by OpenAlex)
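The fields above and the full payload that follows come from the work's OpenAlex record, which can be retrieved directly from the public OpenAlex REST API. A minimal sketch, assuming the requests library and the standard works endpoint; the mailto parameter is an optional courtesy and the email address is a placeholder:

```python
import requests

# Fetch the full OpenAlex work record rendered in the payload table below.
WORK_ID = "W4401810344"
url = f"https://api.openalex.org/works/{WORK_ID}"

resp = requests.get(url, params={"mailto": "you@example.com"}, timeout=30)
resp.raise_for_status()
work = resp.json()

print(work["display_name"])           # work title
print(work["open_access"]["oa_url"])  # direct OA link, when available
print(work["cited_by_count"])         # total citation count
```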
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4401810344 |
| doi | https://doi.org/10.21203/rs.3.rs-4854100/v1 |
| ids.doi | https://doi.org/10.21203/rs.3.rs-4854100/v1 |
| ids.openalex | https://openalex.org/W4401810344 |
| fwci | 0.53015756 |
| type | preprint |
| title | Advancing UAV Multi-Object Tracking: Integrating YOLOv8, Nano Instance Segmentation, and Dueling Double Deep Q-Network |
| biblio.issue | |
| biblio.volume | |
| biblio.last_page | |
| biblio.first_page | |
| topics[0].id | https://openalex.org/T10036 |
| topics[0].field.id | https://openalex.org/fields/17 |
| topics[0].field.display_name | Computer Science |
| topics[0].score | 0.9951000213623047 |
| topics[0].domain.id | https://openalex.org/domains/3 |
| topics[0].domain.display_name | Physical Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/1707 |
| topics[0].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[0].display_name | Advanced Neural Network Applications |
| topics[1].id | https://openalex.org/T10331 |
| topics[1].field.id | https://openalex.org/fields/17 |
| topics[1].field.display_name | Computer Science |
| topics[1].score | 0.9933000206947327 |
| topics[1].domain.id | https://openalex.org/domains/3 |
| topics[1].domain.display_name | Physical Sciences |
| topics[1].subfield.id | https://openalex.org/subfields/1707 |
| topics[1].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[1].display_name | Video Surveillance and Tracking Methods |
| topics[2].id | https://openalex.org/T12389 |
| topics[2].field.id | https://openalex.org/fields/22 |
| topics[2].field.display_name | Engineering |
| topics[2].score | 0.9915000200271606 |
| topics[2].domain.id | https://openalex.org/domains/3 |
| topics[2].domain.display_name | Physical Sciences |
| topics[2].subfield.id | https://openalex.org/subfields/2202 |
| topics[2].subfield.display_name | Aerospace Engineering |
| topics[2].display_name | Infrared Target Detection Methodologies |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| concepts[0].id | https://openalex.org/C154945302 |
| concepts[0].level | 1 |
| concepts[0].score | 0.633871853351593 |
| concepts[0].wikidata | https://www.wikidata.org/wiki/Q11660 |
| concepts[0].display_name | Artificial intelligence |
| concepts[1].id | https://openalex.org/C89600930 |
| concepts[1].level | 2 |
| concepts[1].score | 0.5700111389160156 |
| concepts[1].wikidata | https://www.wikidata.org/wiki/Q1423946 |
| concepts[1].display_name | Segmentation |
| concepts[2].id | https://openalex.org/C41008148 |
| concepts[2].level | 0 |
| concepts[2].score | 0.5596052408218384 |
| concepts[2].wikidata | https://www.wikidata.org/wiki/Q21198 |
| concepts[2].display_name | Computer science |
| concepts[3].id | https://openalex.org/C2775936607 |
| concepts[3].level | 2 |
| concepts[3].score | 0.5314316749572754 |
| concepts[3].wikidata | https://www.wikidata.org/wiki/Q466845 |
| concepts[3].display_name | Tracking (education) |
| concepts[4].id | https://openalex.org/C2781238097 |
| concepts[4].level | 2 |
| concepts[4].score | 0.526342511177063 |
| concepts[4].wikidata | https://www.wikidata.org/wiki/Q175026 |
| concepts[4].display_name | Object (grammar) |
| concepts[5].id | https://openalex.org/C2780357685 |
| concepts[5].level | 2 |
| concepts[5].score | 0.473990797996521 |
| concepts[5].wikidata | https://www.wikidata.org/wiki/Q154357 |
| concepts[5].display_name | Nano- |
| concepts[6].id | https://openalex.org/C108583219 |
| concepts[6].level | 2 |
| concepts[6].score | 0.46436095237731934 |
| concepts[6].wikidata | https://www.wikidata.org/wiki/Q197536 |
| concepts[6].display_name | Deep learning |
| concepts[7].id | https://openalex.org/C31972630 |
| concepts[7].level | 1 |
| concepts[7].score | 0.4017375409603119 |
| concepts[7].wikidata | https://www.wikidata.org/wiki/Q844240 |
| concepts[7].display_name | Computer vision |
| concepts[8].id | https://openalex.org/C127413603 |
| concepts[8].level | 0 |
| concepts[8].score | 0.24843448400497437 |
| concepts[8].wikidata | https://www.wikidata.org/wiki/Q11023 |
| concepts[8].display_name | Engineering |
| concepts[9].id | https://openalex.org/C15744967 |
| concepts[9].level | 0 |
| concepts[9].score | 0.08891633152961731 |
| concepts[9].wikidata | https://www.wikidata.org/wiki/Q9418 |
| concepts[9].display_name | Psychology |
| concepts[10].id | https://openalex.org/C42360764 |
| concepts[10].level | 1 |
| concepts[10].score | 0.0 |
| concepts[10].wikidata | https://www.wikidata.org/wiki/Q83588 |
| concepts[10].display_name | Chemical engineering |
| concepts[11].id | https://openalex.org/C19417346 |
| concepts[11].level | 1 |
| concepts[11].score | 0.0 |
| concepts[11].wikidata | https://www.wikidata.org/wiki/Q7922 |
| concepts[11].display_name | Pedagogy |
| keywords[0].id | https://openalex.org/keywords/artificial-intelligence |
| keywords[0].score | 0.633871853351593 |
| keywords[0].display_name | Artificial intelligence |
| keywords[1].id | https://openalex.org/keywords/segmentation |
| keywords[1].score | 0.5700111389160156 |
| keywords[1].display_name | Segmentation |
| keywords[2].id | https://openalex.org/keywords/computer-science |
| keywords[2].score | 0.5596052408218384 |
| keywords[2].display_name | Computer science |
| keywords[3].id | https://openalex.org/keywords/tracking |
| keywords[3].score | 0.5314316749572754 |
| keywords[3].display_name | Tracking (education) |
| keywords[4].id | https://openalex.org/keywords/object |
| keywords[4].score | 0.526342511177063 |
| keywords[4].display_name | Object (grammar) |
| keywords[5].id | https://openalex.org/keywords/nano |
| keywords[5].score | 0.473990797996521 |
| keywords[5].display_name | Nano- |
| keywords[6].id | https://openalex.org/keywords/deep-learning |
| keywords[6].score | 0.46436095237731934 |
| keywords[6].display_name | Deep learning |
| keywords[7].id | https://openalex.org/keywords/computer-vision |
| keywords[7].score | 0.4017375409603119 |
| keywords[7].display_name | Computer vision |
| keywords[8].id | https://openalex.org/keywords/engineering |
| keywords[8].score | 0.24843448400497437 |
| keywords[8].display_name | Engineering |
| keywords[9].id | https://openalex.org/keywords/psychology |
| keywords[9].score | 0.08891633152961731 |
| keywords[9].display_name | Psychology |
| language | en |
| locations[0].id | doi:10.21203/rs.3.rs-4854100/v1 |
| locations[0].is_oa | True |
| locations[0].source | |
| locations[0].license | cc-by |
| locations[0].pdf_url | https://www.researchsquare.com/article/rs-4854100/latest.pdf |
| locations[0].version | acceptedVersion |
| locations[0].raw_type | posted-content |
| locations[0].license_id | https://openalex.org/licenses/cc-by |
| locations[0].is_accepted | True |
| locations[0].is_published | False |
| locations[0].raw_source_name | |
| locations[0].landing_page_url | https://doi.org/10.21203/rs.3.rs-4854100/v1 |
| indexed_in | crossref |
| authorships[0].author.id | https://openalex.org/A5038469342 |
| authorships[0].author.orcid | |
| authorships[0].author.display_name | R. Kiruthiga |
| authorships[0].countries | IN |
| authorships[0].affiliations[0].institution_ids | https://openalex.org/I122964287 |
| authorships[0].affiliations[0].raw_affiliation_string | Department of Computer Science and Engineering, National Institute of Technology, Tiruchirapalli, Tamilnadu, India. |
| authorships[0].affiliations[1].institution_ids | https://openalex.org/I122964287 |
| authorships[0].affiliations[1].raw_affiliation_string | National Institute of Technology Tiruchirappalli |
| authorships[0].affiliations[2].raw_affiliation_string | College |
| authorships[0].institutions[0].id | https://openalex.org/I122964287 |
| authorships[0].institutions[0].ror | https://ror.org/047x65e68 |
| authorships[0].institutions[0].type | education |
| authorships[0].institutions[0].lineage | https://openalex.org/I122964287 |
| authorships[0].institutions[0].country_code | IN |
| authorships[0].institutions[0].display_name | National Institute of Technology Tiruchirappalli |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | R Kiruthiga |
| authorships[0].is_corresponding | False |
| authorships[0].raw_affiliation_strings | College, Department of Computer Science and Engineering, National Institute of Technology, Tiruchirapalli, Tamilnadu, India., National Institute of Technology Tiruchirappalli |
| authorships[1].author.id | https://openalex.org/A5101806896 |
| authorships[1].author.orcid | |
| authorships[1].author.display_name | B. Nithya |
| authorships[1].countries | IN |
| authorships[1].affiliations[0].raw_affiliation_string | Department of Data Science, St.Joseph's College, Tiruchirapalli, Tamilnadu, India. |
| authorships[1].affiliations[1].institution_ids | https://openalex.org/I122964287 |
| authorships[1].affiliations[1].raw_affiliation_string | Department of Computer Science and Engineering, National Institute of Technology, Tiruchirapalli, Tamilnadu, India. |
| authorships[1].institutions[0].id | https://openalex.org/I122964287 |
| authorships[1].institutions[0].ror | https://ror.org/047x65e68 |
| authorships[1].institutions[0].type | education |
| authorships[1].institutions[0].lineage | https://openalex.org/I122964287 |
| authorships[1].institutions[0].country_code | IN |
| authorships[1].institutions[0].display_name | National Institute of Technology Tiruchirappalli |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Nithya B |
| authorships[1].is_corresponding | True |
| authorships[1].raw_affiliation_strings | Department of Computer Science and Engineering, National Institute of Technology, Tiruchirapalli, Tamilnadu, India., Department of Data Science, St.Joseph's College, Tiruchirapalli, Tamilnadu, India. |
| authorships[2].author.id | https://openalex.org/A5111298805 |
| authorships[2].author.orcid | |
| authorships[2].author.display_name | Martin Prabhu S |
| authorships[2].countries | IN |
| authorships[2].affiliations[0].institution_ids | https://openalex.org/I122964287 |
| authorships[2].affiliations[0].raw_affiliation_string | Department of Computer Science and Engineering, National Institute of Technology, Tiruchirapalli, Tamilnadu, India. |
| authorships[2].affiliations[1].institution_ids | https://openalex.org/I122964287 |
| authorships[2].affiliations[1].raw_affiliation_string | National Institute of Technology Tiruchirappalli |
| authorships[2].affiliations[2].raw_affiliation_string | St.Joseph’s College |
| authorships[2].institutions[0].id | https://openalex.org/I122964287 |
| authorships[2].institutions[0].ror | https://ror.org/047x65e68 |
| authorships[2].institutions[0].type | education |
| authorships[2].institutions[0].lineage | https://openalex.org/I122964287 |
| authorships[2].institutions[0].country_code | IN |
| authorships[2].institutions[0].display_name | National Institute of Technology Tiruchirappalli |
| authorships[2].author_position | last |
| authorships[2].raw_author_name | Martin Prabhu S |
| authorships[2].is_corresponding | False |
| authorships[2].raw_affiliation_strings | Department of Computer Science and Engineering, National Institute of Technology, Tiruchirapalli, Tamilnadu, India., National Institute of Technology Tiruchirappalli, St.Joseph’s College |
| has_content.pdf | True |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://www.researchsquare.com/article/rs-4854100/latest.pdf |
| open_access.oa_status | gold |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | Advancing UAV Multi-Object Tracking: Integrating YOLOv8, Nano Instance Segmentation, and Dueling Double Deep Q-Network |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-06T03:46:38.306776 |
| primary_topic.id | https://openalex.org/T10036 |
| primary_topic.field.id | https://openalex.org/fields/17 |
| primary_topic.field.display_name | Computer Science |
| primary_topic.score | 0.9951000213623047 |
| primary_topic.domain.id | https://openalex.org/domains/3 |
| primary_topic.domain.display_name | Physical Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/1707 |
| primary_topic.subfield.display_name | Computer Vision and Pattern Recognition |
| primary_topic.display_name | Advanced Neural Network Applications |
| related_works | https://openalex.org/W4375867731, https://openalex.org/W2084086966, https://openalex.org/W2611989081, https://openalex.org/W4392633724, https://openalex.org/W2023861399, https://openalex.org/W4230611425, https://openalex.org/W2731899572, https://openalex.org/W2004288825, https://openalex.org/W4294635752, https://openalex.org/W4401380321 |
| cited_by_count | 1 |
| counts_by_year[0].year | 2025 |
| counts_by_year[0].cited_by_count | 1 |
| locations_count | 1 |
| best_oa_location.id | doi:10.21203/rs.3.rs-4854100/v1 |
| best_oa_location.is_oa | True |
| best_oa_location.source | |
| best_oa_location.license | cc-by |
| best_oa_location.pdf_url | https://www.researchsquare.com/article/rs-4854100/latest.pdf |
| best_oa_location.version | acceptedVersion |
| best_oa_location.raw_type | posted-content |
| best_oa_location.license_id | https://openalex.org/licenses/cc-by |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | False |
| best_oa_location.raw_source_name | |
| best_oa_location.landing_page_url | https://doi.org/10.21203/rs.3.rs-4854100/v1 |
| primary_location.id | doi:10.21203/rs.3.rs-4854100/v1 |
| primary_location.is_oa | True |
| primary_location.source | |
| primary_location.license | cc-by |
| primary_location.pdf_url | https://www.researchsquare.com/article/rs-4854100/latest.pdf |
| primary_location.version | acceptedVersion |
| primary_location.raw_type | posted-content |
| primary_location.license_id | https://openalex.org/licenses/cc-by |
| primary_location.is_accepted | True |
| primary_location.is_published | False |
| primary_location.raw_source_name | |
| primary_location.landing_page_url | https://doi.org/10.21203/rs.3.rs-4854100/v1 |
| publication_date | 2024-08-23 |
| publication_year | 2024 |
| referenced_works | https://openalex.org/W4308120827, https://openalex.org/W1964846093, https://openalex.org/W2896811747, https://openalex.org/W2929565236, https://openalex.org/W2901886595, https://openalex.org/W3211449378, https://openalex.org/W2963037989, https://openalex.org/W3129271542, https://openalex.org/W2145938889, https://openalex.org/W4387129273, https://openalex.org/W4384406330, https://openalex.org/W2508384486, https://openalex.org/W4220929648, https://openalex.org/W4367300003, https://openalex.org/W3093105758, https://openalex.org/W4200335308, https://openalex.org/W1981049948, https://openalex.org/W4312127593, https://openalex.org/W2951799221, https://openalex.org/W2977248885, https://openalex.org/W3216389654, https://openalex.org/W3113580286, https://openalex.org/W4312906297, https://openalex.org/W3127561923, https://openalex.org/W4206619550, https://openalex.org/W4362598372, https://openalex.org/W3003127026 |
| referenced_works_count | 27 |
| abstract_inverted_index | word-to-position map of the abstract (full text shown above; see the reconstruction sketch after this table) |
| cited_by_percentile_year.max | 95 |
| cited_by_percentile_year.min | 91 |
| corresponding_author_ids | https://openalex.org/A5101806896 |
| countries_distinct_count | 1 |
| institutions_distinct_count | 3 |
| corresponding_institution_ids | https://openalex.org/I122964287 |
| citation_normalized_percentile.value | 0.59045102 |
| citation_normalized_percentile.is_in_top_1_percent | False |
| citation_normalized_percentile.is_in_top_10_percent | False |
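The abstract_inverted_index field noted in the table stores the abstract as a word-to-position map rather than plain text. Below is a minimal sketch of reconstructing readable text from such an index; the miniature example index is hypothetical and only illustrates the shape of the data.

```python
def rebuild_abstract(inverted_index: dict[str, list[int]]) -> str:
    """Reconstruct plain text from an OpenAlex-style abstract_inverted_index."""
    positions: dict[int, str] = {}
    for word, idxs in inverted_index.items():
        for i in idxs:
            positions[i] = word
    # Emit words in position order; any gaps simply collapse.
    return " ".join(positions[i] for i in sorted(positions))

# Hypothetical miniature index, just to show the data shape.
example = {"Unmanned": [0], "Aerial": [1], "Vehicles": [2], "(UAVs)": [3], "have": [4]}
print(rebuild_abstract(example))  # -> "Unmanned Aerial Vehicles (UAVs) have"
```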