Guest Editorial: Integrating sensor fusion and perception for human–robot interaction
Hang Su, Jing Guo, Wen Qi, Mingchuan Zhou, Yue Chen · 2021 · Open Access
DOI: https://doi.org/10.1049/ccs2.12031
This Special Issue of IET Cognitive Computation and Systems, 'Integrating Sensor Fusion and Perception for Human–Robot Interaction', introduces the latest advances in sensor fusion and perception in the human–robot interaction (HRI) field. In recent years, as intelligent systems have developed, HRI has attracted increasing research interest. In many settings, including factories, rehabilitation robots and operating rooms, HRI technology can be exploited to enhance the safety of human operations through machine intelligence. However, both available practical robotic systems and some ongoing investigations still lack intelligence because of their limited capability to perceive the environment. Nowadays, HRI methods usually rely on a single sensing modality, such as tactile perception or computer vision, without integrating algorithms and hardware. Sensor fusion and perception with artificial intelligence (AI) techniques have been successful in environment perception and activity recognition by fusing information from a multi-modal sensing system and selecting the most appropriate information to perceive the activity or environment. Consequently, combining multi-sensor fusion and perception for HRI is an exciting and promising topic.

This Special Issue aims to track the latest advances and newly emerging technology in integrated sensor fusion and perception for HRI. After careful peer review and revision, four representative papers were accepted for publication. These papers represent four important application areas of multi-sensor fusion and perception technology, and a short summary of each topic is given below. We strongly recommend reading the full papers; they offer new ideas and inspiration.

In the paper 'Deep learning techniques-based perfection of multi-sensor fusion oriented human-robot interaction system for identification of dense organisms', Li et al. present an HRI system based on deep learning and sensor fusion to study the species and density of dense organisms at deep-sea hydrothermal vents. Several deep learning models based on convolutional neural networks (CNNs) are improved and compared, and their outputs are fused with the environmental information provided by position sensors and conductivity–temperature–depth (CTD) sensors, so as to perfect the multi-sensor fusion-oriented HRI system. First, the authors combined different meta-architectures with different feature extractors to obtain five CNN-based object identification algorithms. Then, they compared the computational cost of the feature extractors and weighed the pros and cons of each algorithm in terms of mean detection speed, correlation coefficient and mean class-specific confidence score, confirming that Faster Region-based CNN (R-CNN)_InceptionNet is the algorithm best suited to the hydrothermal vent biological dataset. Finally, to analyse the performance of Faster R-CNN_InceptionNet, they calculated the recognition accuracy for Rimicaris exoculata in dense and sparse areas, which was 88.3% and 95.9%, respectively. Experiments show that the proposed method can automatically detect the species and quantity of dense organisms with high speed and accuracy, and that the improved multi-sensor fusion-oriented HRI system is a feasible and practically valuable tool to help biologists analyse and maintain the ecological balance of deep-sea hydrothermal vents.
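To make this kind of fusion concrete, the sketch below pairs per-frame CNN detections with the nearest-in-time CTD reading to produce per-frame organism counts with environmental context. It is a minimal illustration under assumed data structures (`Detection`, `CTDReading`, the 0.5 confidence threshold), not the authors' pipeline.

```python
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class Detection:
    label: str    # species predicted by the CNN detector, e.g. "rimicaris_exoculata"
    score: float  # class-specific confidence score

@dataclass
class CTDReading:
    timestamp: float  # seconds since the start of the dive video
    temperature: float
    depth: float

def nearest_ctd(readings: list[CTDReading], t: float) -> CTDReading:
    """Return the CTD sample closest in time to a video frame.

    Assumes `readings` is non-empty and sorted by timestamp.
    """
    times = [r.timestamp for r in readings]
    i = bisect_left(times, t)
    candidates = readings[max(0, i - 1):i + 1]
    return min(candidates, key=lambda r: abs(r.timestamp - t))

def fuse_frame(detections: list[Detection], ctd: list[CTDReading],
               frame_time: float, min_score: float = 0.5) -> dict:
    """Count confident detections in one frame and attach environmental context."""
    counts: dict[str, int] = {}
    for d in detections:
        if d.score >= min_score:
            counts[d.label] = counts.get(d.label, 0) + 1
    env = nearest_ctd(ctd, frame_time)
    return {"frame_time": frame_time, "counts": counts,
            "depth_m": env.depth, "temperature_c": env.temperature}
```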
Integrating sensor fusion and perception is not limited to the extraction and processing of physical data; it also plays an important role in multi-system coupling. In the paper 'Research on intelligent service of customer service system', Nie et al. illustrate a new generation of customer service system based on the coupled sensing of an outbound call system, an enterprise internal management system and a knowledge base. The study introduces the principles of these three components, explains the network structure of the intelligent customer service system, and describes how the system is accessed and its overall workflow. Through data sharing and information exchange between the systems, the proposed customer service system achieves the perceptual integration of the outbound call system, the enterprise internal management system and the knowledge base, so as to serve customers intelligently. Building on cloud services and IoT technology, the intelligent customer service system establishes a dynamically updated knowledge base and forms a management model dominated by that knowledge base.
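A toy sketch of such multi-system coupling is given below: a query is answered from the knowledge base when possible, escalated to the internal management system otherwise, and the knowledge base is updated from the escalation so it grows over time. All class and method names here are hypothetical, not taken from the paper.

```python
from typing import Optional

class KnowledgeBase:
    """Dynamically updated store of question -> answer entries."""
    def __init__(self) -> None:
        self._entries: dict[str, str] = {}

    def lookup(self, query: str) -> Optional[str]:
        return self._entries.get(query.strip().lower())

    def learn(self, query: str, answer: str) -> None:
        self._entries[query.strip().lower()] = answer

class ManagementSystem:
    """Stand-in for the enterprise internal management system."""
    def resolve(self, query: str) -> str:
        # In practice this would open a ticket or query business records.
        return f"Escalated to a human agent: {query!r}"

class CustomerService:
    """Couples the call front end, the knowledge base and the management system."""
    def __init__(self, kb: KnowledgeBase, mgmt: ManagementSystem) -> None:
        self.kb, self.mgmt = kb, mgmt

    def handle(self, query: str) -> str:
        answer = self.kb.lookup(query)
        if answer is None:
            answer = self.mgmt.resolve(query)
            self.kb.learn(query, answer)  # the knowledge base grows with each escalation
        return answer
```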
In recent years, wearable sensors have developed rapidly, especially in the medical and health field. Mature commercial wearable sensors have appeared and given rise to a new network formation, the body area network (BAN), in which the wearable devices on a body are networked to share information and data; it is applied in medical and health devices, especially in intelligent clothing. This Special Issue includes a review article on wearable sensors and body area networks by Ren et al. Taking the wearable sensor as the critical factor in wearable device fusion, the paper analyses the classification, technology and current situation of wearable sensors; discusses the problems of wearable sensors for the BAN from the aspects of human–computer interaction experience, data accuracy, multiple interaction modes and battery power supply; and summarises the directions of multi-sensor fusion, compatible biosensor materials, and low power consumption with high sensitivity. Furthermore, sustainable design directions of visibility design, identification of use scenarios, short-term human–computer interaction, interaction process reduction and integration invisibility are introduced.
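One standard way to realise the multi-sensor fusion direction the review points to is inverse-variance weighting of redundant readings within a BAN. The sketch below is a generic textbook example with assumed numbers, not a method taken from the review.

```python
def fuse_estimates(estimates: list[tuple[float, float]]) -> tuple[float, float]:
    """Inverse-variance weighted fusion of redundant sensor estimates.

    Each item is (value, variance), e.g. heart-rate estimates from a wrist
    PPG sensor and a chest-strap ECG in the same body area network.
    Returns the fused value and its (smaller) fused variance.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused_value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return fused_value, 1.0 / total

# Wrist PPG: 72 bpm but noisy; chest ECG: 70 bpm and precise.
value, variance = fuse_estimates([(72.0, 9.0), (70.0, 1.0)])
print(round(value, 1), round(variance, 2))  # 70.2 0.9 -- pulled toward the ECG
```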
Augmented reality is one of the most inspiring technologies of recent years, and it will no doubt lead the trend of immersive applications in the industrial and medical fields. This Special Issue includes a paper by Hao et al. on the augmented reality display of neurosurgical craniotomy lesions based on feature contour matching. The proposed method uses an augmented reality display to provide doctors with accurate lesion information: it can visualise the patient's intracranial information and help doctors plan the path of scalp cutting and craniectomy. The method also performs non-rigid matching for the patient, eliminating additional injury, sparing doctors the extra work of pasting marker points on the patient, and reducing the burden of multiple medical scans. Experiments comparing feature point cloud matching with feature contour matching show that the feature contour matching method gives a better display. In addition, a user interface is designed: the doctor can check the patient's personal information in the text displayed in the upper left corner of the interface, and can zoom in, zoom out and rotate the virtual model on a mobile terminal screen by pressing buttons. This provides a visual basis for the doctor's preoperative preparation. The method effectively improves the efficiency of the doctor's operation as well as patient safety, and the proposed feature contour-based augmented reality matching method also provides basic theoretical support for applying augmented reality to neurosurgery in the future.
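For readers unfamiliar with contour matching, the sketch below shows a generic shape comparison using OpenCV's Hu-moment based `cv2.matchShapes`. The paper's non-rigid feature contour matching pipeline is considerably more involved; the binary masks here are assumed toy inputs.

```python
import cv2
import numpy as np

def largest_contour(mask: np.ndarray) -> np.ndarray:
    """Extract the dominant outer contour from a binary mask (uint8, 0/255); OpenCV 4 API."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea)

def contour_distance(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Hu-moment shape distance: 0 means identical shapes, and the score is
    invariant to translation, scale and rotation."""
    ca, cb = largest_contour(mask_a), largest_contour(mask_b)
    return cv2.matchShapes(ca, cb, cv2.CONTOURS_MATCH_I1, 0.0)

# Toy check: a circle matched against a shifted copy of itself scores ~0,
# while a square scores noticeably higher.
canvas = np.zeros((200, 200), np.uint8)
circle = cv2.circle(canvas.copy(), (100, 100), 40, 255, -1)
shifted = cv2.circle(canvas.copy(), (120, 90), 40, 255, -1)
square = cv2.rectangle(canvas.copy(), (60, 60), (140, 140), 255, -1)
print(contour_distance(circle, shifted))  # close to 0.0
print(contour_distance(circle, square))   # clearly larger
```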
All of the papers selected for this Special Issue show the significant effect and application potential of sensor fusion and perception in HRI applications. Multi-sensor fusion and perception can effectively improve system accuracy, increase stability and improve the human–computer interaction experience. Many challenges in this field still require future research attention, such as fusion methods and the evaluation of fusion results. With further development, the integration of sensor fusion and perception in HRI will find broad application.

Dr. Hang Su received his M.Sc. degree in control theory and control engineering from South China University of Technology, Guangzhou, China, in 2015, and his Ph.D. degree in bioengineering from Politecnico di Milano, Milan, Italy, in 2019. He participated in the EU-funded SMARTsurg project in the field of surgical robotics. Dr. Hang Su is currently working in the Department of Electronics, Information and Bioengineering (DEIB) of Politecnico di Milano, and is also an Assistant Professor at the Institute of Advanced Technology, University of Science and Technology of China. He is currently an Associate Editor for Frontiers in Neuroscience, Cognitive Computation and Systems, and Frontiers in Neurorobotics, and a Program Chair of the IEEE International Conference on Advanced Robotics and Mechatronics (ICARM 2021). He also serves as an Associate Editor for the IEEE International Conference on Robotics and Automation (ICRA), the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), the IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) and the IEEE International Conference on Advanced Robotics and Mechatronics (ICARM), and as a Guest Associate Editor for several journals, including IEEE Robotics and Automation Letters, Complexity, Actuators, Mathematical Problems in Engineering, Sensors, and Frontiers in Robotics and AI. He received the Best Paper Award in Advanced Robotics at the IEEE International Conference on Advanced Robotics and Mechatronics in 2020 and an ICRA Travel Award funded by the IEEE Robotics and Automation Society in 2019. His main research interests include control and instrumentation in medical robotics, human–robot interaction, surgical robotics, deep learning and bilateral teleoperation.

Jing Guo obtained his Ph.D. degree from LIRMM, CNRS–University of Montpellier, France, in 2016, having received his bachelor's and master's degrees from Guangdong University of Technology in 2009 and 2012, respectively. He was a research fellow at the National University of Singapore (NUS) from 2016 to 2018 and is now an Associate Professor at Guangdong University of Technology. His current research interests include robotic control and learning, haptic bilateral teleoperation, and surgical robotics. He has served as a guest editor for IEEE Robotics and Automation Letters (RA-L), Frontiers in Robotics & AI, and other journals.

Dr. Wen Qi received her M.Sc. degree in control engineering from South China University of Technology, Guangzhou, China, in 2015, and her Ph.D. degree in bioengineering from Politecnico di Milano, Milan, Italy, in 2020, where she was a member of the Laboratory of Biomedical Technologies (TBMLab). She has served as a reviewer for over 30 scientific journals, such as IEEE Transactions on Biomedical Engineering, and for the IEEE Engineering in Medicine and Biology Society. She has published more than 20 papers on e-health, deep learning, evolutionary algorithms, human–robot interaction, teleoperation and robotics. She received the Best Paper Award in Advanced Robotics at the IEEE International Conference on Advanced Robotics and Mechatronics in 2020, and her first-authored paper was a finalist for the Best Paper Award on Control Applications at IEEE WCICA 2014. Her main research interests include machine learning, deep learning and signal processing algorithms in wearable medical devices.

Mingchuan Zhou received his Ph.D. degree in computer science from the Technical University of Munich, Munich, Germany, in 2020. He was a visiting scholar at the Laboratory for Computational Sensing and Robotics (LCSR), Johns Hopkins University, USA, in 2019, and a joint postdoc at the Institute of Biological and Medical Imaging (IBMI) of the Helmholtz Centre Munich and the Chair for Computer Aided Medical Procedures and Augmented Reality (CAMP) at the Technical University of Munich from 2019 to 2021. He is currently an assistant professor at Zhejiang University, where he leads the multi-scale robotic manipulation lab for agriculture, and is also a guest senior researcher at the Department of Computer Science, Technical University of Munich. His research interests include autonomous systems, agricultural robotics, medical robotics and image processing. He is an Associate Editor for the IEEE International Conference on Advanced Robotics and Mechatronics (ICARM). He won third place and the Golden Prize at the sixth China International 'Internet+' Competition in 2020, was a member of the champion team in the agricultural robotics competition at the 2020 UK Agri-EPI Centre agri-tech hackathon, and received an IEEE RAS ICRA Travel Grant in 2019. Dr. Zhou was a finalist for the Best Paper Award at IEEE ROBIO 2017 and received the Best Poster Award at the IEEE ICRA 2021 workshop on Task-Informed Grasping: Agri-Food Manipulation (TIG-III).

Yue Chen received a B.S. in vehicle engineering from Hunan University, Hunan, China, in 2010, an M.Phil. in mechanical engineering from Hong Kong Polytechnic University, Hung Hom, Hong Kong, in 2013, and a Ph.D. in mechanical engineering from Vanderbilt University, Nashville, TN, USA, in 2018. He was an Assistant Professor in the Department of Mechanical Engineering, University of Arkansas, Fayetteville, from 2018, and is now an Assistant Professor in the Department of Biomedical Engineering, Georgia Tech/Emory, Atlanta. His current research interests include medical robotics and image-guided therapy.
Related Topics
- Type: editorial
- Language: en
- Landing Page: https://doi.org/10.1049/ccs2.12031
- OA Status: gold
- Related Works: 10
- OpenAlex ID: https://openalex.org/W3197711914
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W3197711914 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.1049/ccs2.12031 (Digital Object Identifier)
- Title: Guest Editorial: Integrating sensor fusion and perception for human–robot interaction (work title)
- Type: editorial (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2021 (year of publication)
- Publication date: 2021-08-28 (full publication date if available)
- Authors: Hang Su, Jing Guo, Wen Qi, Mingchuan Zhou, Yue Chen (list of authors in order)
- Landing page: https://doi.org/10.1049/ccs2.12031 (publisher landing page)
- Open access: Yes (whether a free full text is available)
- OA status: gold (open access status per OpenAlex)
- OA URL: https://doi.org/10.1049/ccs2.12031 (direct OA link when available)
- Concepts: Sensor fusion, Perception, Robot, Human–computer interaction, Computer science, Artificial intelligence, Human–robot interaction, Field (mathematics), Robotics, Active perception, Psychology, Pure mathematics, Mathematics, Neuroscience (top concepts attached by OpenAlex)
- Cited by: 0 (total citation count in OpenAlex)
- Related works (count): 10 (other works algorithmically related by OpenAlex)
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W3197711914 |
| doi | https://doi.org/10.1049/ccs2.12031 |
| ids.doi | https://doi.org/10.1049/ccs2.12031 |
| ids.mag | 3197711914 |
| ids.openalex | https://openalex.org/W3197711914 |
| fwci | 0.0 |
| type | editorial |
| title | Guest Editorial: Integrating sensor fusion and perception for human–robot interaction |
| biblio.issue | 3 |
| biblio.volume | 3 |
| biblio.last_page | 186 |
| biblio.first_page | 183 |
| topics[0].id | https://openalex.org/T11667 |
| topics[0].field.id | https://openalex.org/fields/22 |
| topics[0].field.display_name | Engineering |
| topics[0].score | 0.5483999848365784 |
| topics[0].domain.id | https://openalex.org/domains/3 |
| topics[0].domain.display_name | Physical Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/2204 |
| topics[0].subfield.display_name | Biomedical Engineering |
| topics[0].display_name | Advanced Chemical Sensor Technologies |
| topics[1].id | https://openalex.org/T13734 |
| topics[1].field.id | https://openalex.org/fields/17 |
| topics[1].field.display_name | Computer Science |
| topics[1].score | 0.4878000020980835 |
| topics[1].domain.id | https://openalex.org/domains/3 |
| topics[1].domain.display_name | Physical Sciences |
| topics[1].subfield.id | https://openalex.org/subfields/1702 |
| topics[1].subfield.display_name | Artificial Intelligence |
| topics[1].display_name | Advanced Computational Techniques and Applications |
| topics[2].id | https://openalex.org/T10876 |
| topics[2].field.id | https://openalex.org/fields/22 |
| topics[2].field.display_name | Engineering |
| topics[2].score | 0.47119998931884766 |
| topics[2].domain.id | https://openalex.org/domains/3 |
| topics[2].domain.display_name | Physical Sciences |
| topics[2].subfield.id | https://openalex.org/subfields/2207 |
| topics[2].subfield.display_name | Control and Systems Engineering |
| topics[2].display_name | Fault Detection and Control Systems |
| is_xpac | False |
| apc_list.value | 2000 |
| apc_list.currency | EUR |
| apc_list.value_usd | 2200 |
| apc_paid.value | 2000 |
| apc_paid.currency | EUR |
| apc_paid.value_usd | 2200 |
| concepts[0].id | https://openalex.org/C33954974 |
| concepts[0].level | 2 |
| concepts[0].score | 0.6792406439781189 |
| concepts[0].wikidata | https://www.wikidata.org/wiki/Q486494 |
| concepts[0].display_name | Sensor fusion |
| concepts[1].id | https://openalex.org/C26760741 |
| concepts[1].level | 2 |
| concepts[1].score | 0.6781096458435059 |
| concepts[1].wikidata | https://www.wikidata.org/wiki/Q160402 |
| concepts[1].display_name | Perception |
| concepts[2].id | https://openalex.org/C90509273 |
| concepts[2].level | 2 |
| concepts[2].score | 0.6195027232170105 |
| concepts[2].wikidata | https://www.wikidata.org/wiki/Q11012 |
| concepts[2].display_name | Robot |
| concepts[3].id | https://openalex.org/C107457646 |
| concepts[3].level | 1 |
| concepts[3].score | 0.6185729503631592 |
| concepts[3].wikidata | https://www.wikidata.org/wiki/Q207434 |
| concepts[3].display_name | Human–computer interaction |
| concepts[4].id | https://openalex.org/C41008148 |
| concepts[4].level | 0 |
| concepts[4].score | 0.5962026119232178 |
| concepts[4].wikidata | https://www.wikidata.org/wiki/Q21198 |
| concepts[4].display_name | Computer science |
| concepts[5].id | https://openalex.org/C154945302 |
| concepts[5].level | 1 |
| concepts[5].score | 0.5391784906387329 |
| concepts[5].wikidata | https://www.wikidata.org/wiki/Q11660 |
| concepts[5].display_name | Artificial intelligence |
| concepts[6].id | https://openalex.org/C145460709 |
| concepts[6].level | 3 |
| concepts[6].score | 0.459771990776062 |
| concepts[6].wikidata | https://www.wikidata.org/wiki/Q859951 |
| concepts[6].display_name | Human–robot interaction |
| concepts[7].id | https://openalex.org/C9652623 |
| concepts[7].level | 2 |
| concepts[7].score | 0.42462003231048584 |
| concepts[7].wikidata | https://www.wikidata.org/wiki/Q190109 |
| concepts[7].display_name | Field (mathematics) |
| concepts[8].id | https://openalex.org/C34413123 |
| concepts[8].level | 3 |
| concepts[8].score | 0.4190446734428406 |
| concepts[8].wikidata | https://www.wikidata.org/wiki/Q170978 |
| concepts[8].display_name | Robotics |
| concepts[9].id | https://openalex.org/C2776010242 |
| concepts[9].level | 3 |
| concepts[9].score | 0.41006308794021606 |
| concepts[9].wikidata | https://www.wikidata.org/wiki/Q4677575 |
| concepts[9].display_name | Active perception |
| concepts[10].id | https://openalex.org/C15744967 |
| concepts[10].level | 0 |
| concepts[10].score | 0.08932316303253174 |
| concepts[10].wikidata | https://www.wikidata.org/wiki/Q9418 |
| concepts[10].display_name | Psychology |
| concepts[11].id | https://openalex.org/C202444582 |
| concepts[11].level | 1 |
| concepts[11].score | 0.0 |
| concepts[11].wikidata | https://www.wikidata.org/wiki/Q837863 |
| concepts[11].display_name | Pure mathematics |
| concepts[12].id | https://openalex.org/C33923547 |
| concepts[12].level | 0 |
| concepts[12].score | 0.0 |
| concepts[12].wikidata | https://www.wikidata.org/wiki/Q395 |
| concepts[12].display_name | Mathematics |
| concepts[13].id | https://openalex.org/C169760540 |
| concepts[13].level | 1 |
| concepts[13].score | 0.0 |
| concepts[13].wikidata | https://www.wikidata.org/wiki/Q207011 |
| concepts[13].display_name | Neuroscience |
| keywords[0].id | https://openalex.org/keywords/sensor-fusion |
| keywords[0].score | 0.6792406439781189 |
| keywords[0].display_name | Sensor fusion |
| keywords[1].id | https://openalex.org/keywords/perception |
| keywords[1].score | 0.6781096458435059 |
| keywords[1].display_name | Perception |
| keywords[2].id | https://openalex.org/keywords/robot |
| keywords[2].score | 0.6195027232170105 |
| keywords[2].display_name | Robot |
| keywords[3].id | https://openalex.org/keywords/human–computer-interaction |
| keywords[3].score | 0.6185729503631592 |
| keywords[3].display_name | Human–computer interaction |
| keywords[4].id | https://openalex.org/keywords/computer-science |
| keywords[4].score | 0.5962026119232178 |
| keywords[4].display_name | Computer science |
| keywords[5].id | https://openalex.org/keywords/artificial-intelligence |
| keywords[5].score | 0.5391784906387329 |
| keywords[5].display_name | Artificial intelligence |
| keywords[6].id | https://openalex.org/keywords/human–robot-interaction |
| keywords[6].score | 0.459771990776062 |
| keywords[6].display_name | Human–robot interaction |
| keywords[7].id | https://openalex.org/keywords/field |
| keywords[7].score | 0.42462003231048584 |
| keywords[7].display_name | Field (mathematics) |
| keywords[8].id | https://openalex.org/keywords/robotics |
| keywords[8].score | 0.4190446734428406 |
| keywords[8].display_name | Robotics |
| keywords[9].id | https://openalex.org/keywords/active-perception |
| keywords[9].score | 0.41006308794021606 |
| keywords[9].display_name | Active perception |
| keywords[10].id | https://openalex.org/keywords/psychology |
| keywords[10].score | 0.08932316303253174 |
| keywords[10].display_name | Psychology |
| language | en |
| locations[0].id | doi:10.1049/ccs2.12031 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S4210220299 |
| locations[0].source.issn | 2517-7567 |
| locations[0].source.type | journal |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | 2517-7567 |
| locations[0].source.is_core | True |
| locations[0].source.is_in_doaj | True |
| locations[0].source.display_name | Cognitive Computation and Systems |
| locations[0].source.host_organization | https://openalex.org/P4310311714 |
| locations[0].source.host_organization_name | Institution of Engineering and Technology |
| locations[0].source.host_organization_lineage | https://openalex.org/P4310311714 |
| locations[0].source.host_organization_lineage_names | Institution of Engineering and Technology |
| locations[0].license | cc-by |
| locations[0].pdf_url | |
| locations[0].version | publishedVersion |
| locations[0].raw_type | journal-article |
| locations[0].license_id | https://openalex.org/licenses/cc-by |
| locations[0].is_accepted | True |
| locations[0].is_published | True |
| locations[0].raw_source_name | Cognitive Computation and Systems |
| locations[0].landing_page_url | https://doi.org/10.1049/ccs2.12031 |
| locations[1].id | pmh:oai:doaj.org/article:96dd9bd4a09a4fac91d7ae017905eca0 |
| locations[1].is_oa | True |
| locations[1].source.id | https://openalex.org/S4306401280 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | False |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | DOAJ (DOAJ: Directory of Open Access Journals) |
| locations[1].source.host_organization | |
| locations[1].source.host_organization_name | |
| locations[1].license | cc-by-sa |
| locations[1].pdf_url | |
| locations[1].version | submittedVersion |
| locations[1].raw_type | article |
| locations[1].license_id | https://openalex.org/licenses/cc-by-sa |
| locations[1].is_accepted | False |
| locations[1].is_published | False |
| locations[1].raw_source_name | Cognitive Computation and Systems, Vol 3, Iss 3, Pp 183-186 (2021) |
| locations[1].landing_page_url | https://doaj.org/article/96dd9bd4a09a4fac91d7ae017905eca0 |
| indexed_in | crossref, doaj |
| authorships[0].author.id | https://openalex.org/A5100341891 |
| authorships[0].author.orcid | https://orcid.org/0000-0002-6877-6783 |
| authorships[0].author.display_name | Hang Su |
| authorships[0].countries | IT |
| authorships[0].affiliations[0].institution_ids | https://openalex.org/I93860229 |
| authorships[0].affiliations[0].raw_affiliation_string | Department of Electronics, Information and Bioengineering, Politecnico di Milano, Milan, Italy |
| authorships[0].institutions[0].id | https://openalex.org/I93860229 |
| authorships[0].institutions[0].ror | https://ror.org/01nffqt88 |
| authorships[0].institutions[0].type | education |
| authorships[0].institutions[0].lineage | https://openalex.org/I93860229 |
| authorships[0].institutions[0].country_code | IT |
| authorships[0].institutions[0].display_name | Politecnico di Milano |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Hang Su |
| authorships[0].is_corresponding | True |
| authorships[0].raw_affiliation_strings | Department of Electronics, Information and Bioengineering, Politecnico di Milano, Milan, Italy |
| authorships[1].author.id | https://openalex.org/A5100445472 |
| authorships[1].author.orcid | https://orcid.org/0000-0002-0053-2678 |
| authorships[1].author.display_name | Jing Guo |
| authorships[1].countries | CN |
| authorships[1].affiliations[0].institution_ids | https://openalex.org/I139024713 |
| authorships[1].affiliations[0].raw_affiliation_string | Department of Automation, Guangdong University of Technology, Guangzhou, China |
| authorships[1].institutions[0].id | https://openalex.org/I139024713 |
| authorships[1].institutions[0].ror | https://ror.org/04azbjn80 |
| authorships[1].institutions[0].type | education |
| authorships[1].institutions[0].lineage | https://openalex.org/I139024713 |
| authorships[1].institutions[0].country_code | CN |
| authorships[1].institutions[0].display_name | Guangdong University of Technology |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Jing Guo |
| authorships[1].is_corresponding | False |
| authorships[1].raw_affiliation_strings | Department of Automation, Guangdong University of Technology, Guangzhou, China |
| authorships[2].author.id | https://openalex.org/A5035835836 |
| authorships[2].author.orcid | https://orcid.org/0000-0002-2091-3718 |
| authorships[2].author.display_name | Wen Qi |
| authorships[2].countries | IT |
| authorships[2].affiliations[0].institution_ids | https://openalex.org/I93860229 |
| authorships[2].affiliations[0].raw_affiliation_string | Department of Electronics, Information and Bioengineering, Politecnico di Milano, Milan, Italy |
| authorships[2].institutions[0].id | https://openalex.org/I93860229 |
| authorships[2].institutions[0].ror | https://ror.org/01nffqt88 |
| authorships[2].institutions[0].type | education |
| authorships[2].institutions[0].lineage | https://openalex.org/I93860229 |
| authorships[2].institutions[0].country_code | IT |
| authorships[2].institutions[0].display_name | Politecnico di Milano |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | Wen Qi |
| authorships[2].is_corresponding | False |
| authorships[2].raw_affiliation_strings | Department of Electronics, Information and Bioengineering, Politecnico di Milano, Milan, Italy |
| authorships[3].author.id | https://openalex.org/A5061386768 |
| authorships[3].author.orcid | https://orcid.org/0000-0002-6944-1483 |
| authorships[3].author.display_name | Mingchuan Zhou |
| authorships[3].countries | CN |
| authorships[3].affiliations[0].institution_ids | https://openalex.org/I76130692 |
| authorships[3].affiliations[0].raw_affiliation_string | Department of Computer Science Technische Universität München & College of Biosystems Engineering and Food Science Zhejiang University Hangzhou China |
| authorships[3].institutions[0].id | https://openalex.org/I76130692 |
| authorships[3].institutions[0].ror | https://ror.org/00a2xv884 |
| authorships[3].institutions[0].type | education |
| authorships[3].institutions[0].lineage | https://openalex.org/I76130692 |
| authorships[3].institutions[0].country_code | CN |
| authorships[3].institutions[0].display_name | Zhejiang University |
| authorships[3].author_position | middle |
| authorships[3].raw_author_name | Mingchuan Zhou |
| authorships[3].is_corresponding | False |
| authorships[3].raw_affiliation_strings | Department of Computer Science Technische Universität München & College of Biosystems Engineering and Food Science Zhejiang University Hangzhou China |
| authorships[4].author.id | https://openalex.org/A5100454834 |
| authorships[4].author.orcid | https://orcid.org/0000-0003-1350-4972 |
| authorships[4].author.display_name | Yue Chen |
| authorships[4].countries | US |
| authorships[4].affiliations[0].institution_ids | https://openalex.org/I130701444 |
| authorships[4].affiliations[0].raw_affiliation_string | Department of Computer Science, Department of Biomedical Engineering, Georgia Tech/Emory, Atlanta, Georgia, USA |
| authorships[4].institutions[0].id | https://openalex.org/I130701444 |
| authorships[4].institutions[0].ror | https://ror.org/01zkghx44 |
| authorships[4].institutions[0].type | education |
| authorships[4].institutions[0].lineage | https://openalex.org/I130701444 |
| authorships[4].institutions[0].country_code | US |
| authorships[4].institutions[0].display_name | Georgia Institute of Technology |
| authorships[4].author_position | last |
| authorships[4].raw_author_name | Yue Chen |
| authorships[4].is_corresponding | False |
| authorships[4].raw_affiliation_strings | Department of Computer Science, Department of Biomedical Engineering, Georgia Tech/Emory, Atlanta, Georgia, USA |
| has_content.pdf | False |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://doi.org/10.1049/ccs2.12031 |
| open_access.oa_status | gold |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | Guest Editorial: Integrating sensor fusion and perception for human–robot interaction |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-06T03:46:38.306776 |
| primary_topic.id | https://openalex.org/T11667 |
| primary_topic.field.id | https://openalex.org/fields/22 |
| primary_topic.field.display_name | Engineering |
| primary_topic.score | 0.5483999848365784 |
| primary_topic.domain.id | https://openalex.org/domains/3 |
| primary_topic.domain.display_name | Physical Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/2204 |
| primary_topic.subfield.display_name | Biomedical Engineering |
| primary_topic.display_name | Advanced Chemical Sensor Technologies |
| related_works | https://openalex.org/W1508899372, https://openalex.org/W2039460805, https://openalex.org/W4250956039, https://openalex.org/W3205513966, https://openalex.org/W2052971528, https://openalex.org/W3120459843, https://openalex.org/W4366547574, https://openalex.org/W3200191727, https://openalex.org/W2106688486, https://openalex.org/W2998535223 |
| cited_by_count | 0 |
| locations_count | 2 |
| best_oa_location.id | doi:10.1049/ccs2.12031 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S4210220299 |
| best_oa_location.source.issn | 2517-7567 |
| best_oa_location.source.type | journal |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | 2517-7567 |
| best_oa_location.source.is_core | True |
| best_oa_location.source.is_in_doaj | True |
| best_oa_location.source.display_name | Cognitive Computation and Systems |
| best_oa_location.source.host_organization | https://openalex.org/P4310311714 |
| best_oa_location.source.host_organization_name | Institution of Engineering and Technology |
| best_oa_location.source.host_organization_lineage | https://openalex.org/P4310311714 |
| best_oa_location.source.host_organization_lineage_names | Institution of Engineering and Technology |
| best_oa_location.license | cc-by |
| best_oa_location.pdf_url | |
| best_oa_location.version | publishedVersion |
| best_oa_location.raw_type | journal-article |
| best_oa_location.license_id | https://openalex.org/licenses/cc-by |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | True |
| best_oa_location.raw_source_name | Cognitive Computation and Systems |
| best_oa_location.landing_page_url | https://doi.org/10.1049/ccs2.12031 |
| primary_location.id | doi:10.1049/ccs2.12031 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S4210220299 |
| primary_location.source.issn | 2517-7567 |
| primary_location.source.type | journal |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | 2517-7567 |
| primary_location.source.is_core | True |
| primary_location.source.is_in_doaj | True |
| primary_location.source.display_name | Cognitive Computation and Systems |
| primary_location.source.host_organization | https://openalex.org/P4310311714 |
| primary_location.source.host_organization_name | Institution of Engineering and Technology |
| primary_location.source.host_organization_lineage | https://openalex.org/P4310311714 |
| primary_location.source.host_organization_lineage_names | Institution of Engineering and Technology |
| primary_location.license | cc-by |
| primary_location.pdf_url | |
| primary_location.version | publishedVersion |
| primary_location.raw_type | journal-article |
| primary_location.license_id | https://openalex.org/licenses/cc-by |
| primary_location.is_accepted | True |
| primary_location.is_published | True |
| primary_location.raw_source_name | Cognitive Computation and Systems |
| primary_location.landing_page_url | https://doi.org/10.1049/ccs2.12031 |
| publication_date | 2021-08-28 |
| publication_year | 2021 |
| referenced_works_count | 0 |
| abstract_inverted_index.customer | 539, 550, 597, 607, 627, 667 |
| abstract_inverted_index.dataset. | 425 |
| abstract_inverted_index.deep-sea | 303, 333, 502 |
| abstract_inverted_index.devices, | 742 |
| abstract_inverted_index.devices. | 1722 |
| abstract_inverted_index.doctor's | 1083, 1098 |
| abstract_inverted_index.exchange | 621 |
| abstract_inverted_index.exciting | 168 |
| abstract_inverted_index.explains | 590 |
| abstract_inverted_index.feasible | 478 |
| abstract_inverted_index.finalist | 1693 |
| abstract_inverted_index.focusses | 99 |
| abstract_inverted_index.improved | 320, 485 |
| abstract_inverted_index.improves | 1093 |
| abstract_inverted_index.includes | 750, 893 |
| abstract_inverted_index.increase | 1162 |
| abstract_inverted_index.industry | 886 |
| abstract_inverted_index.internal | 564, 582, 642 |
| abstract_inverted_index.learning | 263, 288, 311, 1714 |
| abstract_inverted_index.maintain | 497 |
| abstract_inverted_index.matching | 908, 929, 969, 1011, 1015, 1024, 1109 |
| abstract_inverted_index.methods, | 1016 |
| abstract_inverted_index.multiple | 623, 810, 997 |
| abstract_inverted_index.obtained | 371, 1488 |
| abstract_inverted_index.oriented | 269 |
| abstract_inverted_index.outbound | 560, 577, 637 |
| abstract_inverted_index.patient, | 972, 978, 991 |
| abstract_inverted_index.patient. | 1002 |
| abstract_inverted_index.perceive | 150 |
| abstract_inverted_index.performs | 967 |
| abstract_inverted_index.personal | 1044 |
| abstract_inverted_index.position | 345 |
| abstract_inverted_index.pressing | 1074 |
| abstract_inverted_index.problems | 793 |
| abstract_inverted_index.proposed | 458, 630, 1106 |
| abstract_inverted_index.provided | 343 |
| abstract_inverted_index.provides | 1077, 1116 |
| abstract_inverted_index.quantity | 466 |
| abstract_inverted_index.received | 1226, 1501, 1574, 1725, 1941 |
| abstract_inverted_index.research | 47, 1181, 1468, 1520, 1543, 1708, 1835, 2015 |
| abstract_inverted_index.reviewer | 1622 |
| abstract_inverted_index.robotics | 1844, 1889, 2019 |
| abstract_inverted_index.selected | 1133 |
| abstract_inverted_index.sensors' | 290 |
| abstract_inverted_index.sensors, | 350 |
| abstract_inverted_index.strongly | 241 |
| abstract_inverted_index.surgical | 1479, 1554 |
| abstract_inverted_index.systems, | 624 |
| abstract_inverted_index.terminal | 1071 |
| abstract_inverted_index.therapy. | 2022 |
| abstract_inverted_index.visiting | 1745 |
| abstract_inverted_index.wearable | 689, 704, 724, 755, 768, 775, 789, 796, 1720 |
| abstract_inverted_index.workflow | 613 |
| abstract_inverted_index.workshop | 1932 |
| abstract_inverted_index.Agri-Food | 1936 |
| abstract_inverted_index.Arkansas, | 2009 |
| abstract_inverted_index.Assistant | 1298, 1984, 1999 |
| abstract_inverted_index.Associate | 1317, 1353, 1401, 1533, 1851 |
| abstract_inverted_index.Augmented | 857, 1787 |
| abstract_inverted_index.Cognitive | 15, 1323 |
| abstract_inverted_index.EU-funded | 1264 |
| abstract_inverted_index.Frontiers | 1320, 1328, 1421, 1565 |
| abstract_inverted_index.Grasping: | 1935 |
| abstract_inverted_index.Guangdong | 1508, 1537 |
| abstract_inverted_index.Helmholtz | 1777 |
| abstract_inverted_index.IEEE-ICRA | 1899 |
| abstract_inverted_index.Institute | 1302, 1768 |
| abstract_inverted_index.Mingchuan | 1723 |
| abstract_inverted_index.Nowadays, | 94 |
| abstract_inverted_index.Professor | 1299, 1534, 1985, 2000 |
| abstract_inverted_index.Robotics. | 1272 |
| abstract_inverted_index.Singapore | 1526 |
| abstract_inverted_index.Technical | 1734, 1792, 1830 |
| abstract_inverted_index.accessing | 604 |
| abstract_inverted_index.accuracy, | 809, 1161 |
| abstract_inverted_index.accuracy. | 474 |
| abstract_inverted_index.addition, | 1032, 1686 |
| abstract_inverted_index.agri-tech | 1896 |
| abstract_inverted_index.algorithm | 396, 418 |
| abstract_inverted_index.assistant | 1804 |
| abstract_inverted_index.attracted | 45 |
| abstract_inverted_index.augmented | 897, 917, 935, 1107, 1122 |
| abstract_inverted_index.available | 75 |
| abstract_inverted_index.bilateral | 1483, 1551 |
| abstract_inverted_index.biosensor | 825 |
| abstract_inverted_index.clothing. | 746 |
| abstract_inverted_index.cognitive | 430 |
| abstract_inverted_index.combining | 156 |
| abstract_inverted_index.coupling. | 530 |
| abstract_inverted_index.currently | 1277, 1316, 1333, 1802 |
| abstract_inverted_index.customers | 653 |
| abstract_inverted_index.described | 1088 |
| abstract_inverted_index.designed. | 1037 |
| abstract_inverted_index.detection | 399 |
| abstract_inverted_index.determine | 1041 |
| abstract_inverted_index.developed | 692 |
| abstract_inverted_index.different | 364, 367 |
| abstract_inverted_index.direction | 820, 838 |
| abstract_inverted_index.discusses | 791 |
| abstract_inverted_index.displayed | 1049 |
| abstract_inverted_index.dominated | 681 |
| abstract_inverted_index.e-health, | 1651 |
| abstract_inverted_index.exoculata | 434 |
| abstract_inverted_index.exploited | 63 |
| abstract_inverted_index.generated | 709 |
| abstract_inverted_index.hardware, | 109 |
| abstract_inverted_index.important | 215, 526 |
| abstract_inverted_index.including | 52 |
| abstract_inverted_index.inspiring | 864 |
| abstract_inverted_index.interest. | 48 |
| abstract_inverted_index.interests | 1469, 1544, 1709, 1836, 2016 |
| abstract_inverted_index.interface | 1035, 1057 |
| abstract_inverted_index.journals, | 1407, 1627 |
| abstract_inverted_index.knowledge | 568, 587, 647, 674, 684 |
| abstract_inverted_index.learning, | 1482, 1549, 1653, 1712 |
| abstract_inverted_index.non-rigid | 968 |
| abstract_inverted_index.operating | 57 |
| abstract_inverted_index.operation | 1099 |
| abstract_inverted_index.organisms | 300, 331, 469 |
| abstract_inverted_index.patient's | 950, 1043 |
| abstract_inverted_index.potential | 1144 |
| abstract_inverted_index.practical | 76 |
| abstract_inverted_index.principle | 574 |
| abstract_inverted_index.professor | 1805 |
| abstract_inverted_index.promising | 170 |
| abstract_inverted_index.proposed, | 931 |
| abstract_inverted_index.published | 1211, 1645 |
| abstract_inverted_index.realistic | 481 |
| abstract_inverted_index.recipient | 1430, 1664, 1912 |
| abstract_inverted_index.recommend | 242 |
| abstract_inverted_index.reduction | 851 |
| abstract_inverted_index.represent | 213 |
| abstract_inverted_index.revision, | 199 |
| abstract_inverted_index.rimicaris | 433 |
| abstract_inverted_index.robotics, | 1476, 1480, 1842 |
| abstract_inverted_index.robotics. | 1555, 1660 |
| abstract_inverted_index.selecting | 144 |
| abstract_inverted_index.situation | 786 |
| abstract_inverted_index.stability | 1163 |
| abstract_inverted_index.structure | 593 |
| abstract_inverted_index.technique | 158 |
| abstract_inverted_index.visualise | 948 |
| abstract_inverted_index.(TIG-III). | 1938 |
| abstract_inverted_index.Actuators, | 1415 |
| abstract_inverted_index.Automation | 1363, 1412, 1462 |
| abstract_inverted_index.Biological | 1770 |
| abstract_inverted_index.Biomedical | 1604, 1633, 1990 |
| abstract_inverted_index.Conference | 1340, 1359, 1369, 1379, 1391, 1443, 1677, 1856 |
| abstract_inverted_index.Department | 1281, 1826, 1988, 2003 |
| abstract_inverted_index.Guangzhou, | 1242, 1587 |
| abstract_inverted_index.Laboratory | 1602, 1749 |
| abstract_inverted_index.Mechanical | 1972, 2005 |
| abstract_inverted_index.Nashville, | 1976 |
| abstract_inverted_index.Perception | 9 |
| abstract_inverted_index.Procedures | 1786 |
| abstract_inverted_index.Technology | 1310, 1511 |
| abstract_inverted_index.University | 1239, 1306, 1509, 1524, 1538, 1584, 1735, 1793, 1831, 2007 |
| abstract_inverted_index.Vanderbilt | 1974 |
| abstract_inverted_index.additional | 974 |
| abstract_inverted_index.affiliated | 1535 |
| abstract_inverted_index.algorithm, | 1655 |
| abstract_inverted_index.algorithms | 107, 375, 1718 |
| abstract_inverted_index.applicable | 419 |
| abstract_inverted_index.artificial | 122 |
| abstract_inverted_index.attention, | 1182 |
| abstract_inverted_index.autonomous | 1839 |
| abstract_inverted_index.biological | 424 |
| abstract_inverted_index.biologists | 494 |
| abstract_inverted_index.calculated | 428 |
| abstract_inverted_index.challenges | 1174 |
| abstract_inverted_index.commercial | 703 |
| abstract_inverted_index.compatible | 824 |
| abstract_inverted_index.confidence | 406 |
| abstract_inverted_index.craniotomy | 902, 923 |
| abstract_inverted_index.described. | 615 |
| abstract_inverted_index.developed, | 42 |
| abstract_inverted_index.ecological | 499 |
| abstract_inverted_index.editorials | 1213 |
| abstract_inverted_index.efficiency | 1095 |
| abstract_inverted_index.eliminates | 973 |
| abstract_inverted_index.enterprise | 563, 581, 641 |
| abstract_inverted_index.especially | 694, 743 |
| abstract_inverted_index.extraction | 515 |
| abstract_inverted_index.extractors | 369, 387 |
| abstract_inverted_index.factories, | 53 |
| abstract_inverted_index.generation | 548 |
| abstract_inverted_index.hackathon, | 1897 |
| abstract_inverted_index.illustrate | 545 |
| abstract_inverted_index.increasing | 46 |
| abstract_inverted_index.integrated | 187 |
| abstract_inverted_index.introduces | 20, 572 |
| abstract_inverted_index.management | 565, 583, 643, 679 |
| abstract_inverted_index.materials, | 826 |
| abstract_inverted_index.mechanical | 1956 |
| abstract_inverted_index.perceiving | 91 |
| abstract_inverted_index.perception | 28, 113, 120, 131, 163, 191, 222, 509, 1149, 1156, 1202 |
| abstract_inverted_index.perceptual | 633 |
| abstract_inverted_index.perfection | 265 |
| abstract_inverted_index.processing | 517, 1717 |
| abstract_inverted_index.reference: | 1219 |
| abstract_inverted_index.researcher | 1823 |
| abstract_inverted_index.scenarios, | 845 |
| abstract_inverted_index.scientific | 1626 |
| abstract_inverted_index.short-term | 846 |
| abstract_inverted_index.successful | 128 |
| abstract_inverted_index.summarises | 818 |
| abstract_inverted_index.system’, | 541 |
| abstract_inverted_index.techniques | 125 |
| abstract_inverted_index.technology | 60, 184, 223, 865 |
| abstract_inverted_index.visibility | 840 |
| abstract_inverted_index.(SMARTsurg) | 1266 |
| abstract_inverted_index.Competition | 1879 |
| abstract_inverted_index.Complexity, | 1414 |
| abstract_inverted_index.Computation | 16, 1324 |
| abstract_inverted_index.Engineering | 1634, 1637 |
| abstract_inverted_index.Experiments | 454 |
| abstract_inverted_index.Information | 1284 |
| abstract_inverted_index.Integrating | 505 |
| abstract_inverted_index.Intelligent | 1371 |
| abstract_inverted_index.Interactive | 1384 |
| abstract_inverted_index.Montpelier, | 1496 |
| abstract_inverted_index.Politecnico | 1253, 1289, 1608 |
| abstract_inverted_index.Polytechnic | 1961 |
| abstract_inverted_index.Tech/Emory, | 1993 |
| abstract_inverted_index.Technology, | 1241, 1305, 1586 |
| abstract_inverted_index.Technology. | 1540 |
| abstract_inverted_index.Transaction | 1631 |
| abstract_inverted_index.University, | 1758, 1949, 1962, 1975 |
| abstract_inverted_index.University. | 1816 |
| abstract_inverted_index.agriculture | 1813 |
| abstract_inverted_index.application | 216, 658, 883, 1143 |
| abstract_inverted_index.appropriate | 147 |
| abstract_inverted_index.coefficient | 402 |
| abstract_inverted_index.competition | 1890 |
| abstract_inverted_index.consumption | 830 |
| abstract_inverted_index.correlation | 401 |
| abstract_inverted_index.dynamically | 672 |
| abstract_inverted_index.effectively | 1092, 1158 |
| abstract_inverted_index.engineering | 1235, 1580, 1946, 1957 |
| abstract_inverted_index.environment | 130 |
| abstract_inverted_index.establishes | 670 |
| abstract_inverted_index.evaluation. | 1192 |
| abstract_inverted_index.experience, | 807 |
| abstract_inverted_index.experience. | 1169 |
| abstract_inverted_index.experiments | 1004 |
| abstract_inverted_index.human-robot | 270 |
| abstract_inverted_index.information | 137, 148, 342, 620, 732, 952, 1045 |
| abstract_inverted_index.integrating | 106 |
| abstract_inverted_index.integration | 634, 853, 1197 |
| abstract_inverted_index.intelligent | 39, 536, 596, 606, 666, 745 |
| abstract_inverted_index.interaction | 32, 271, 806, 811, 849, 1168 |
| abstract_inverted_index.interested. | 248 |
| abstract_inverted_index.introduced. | 856 |
| abstract_inverted_index.multi-modal | 140 |
| abstract_inverted_index.multi-scale | 1808 |
| abstract_inverted_index.operations. | 72 |
| abstract_inverted_index.performance | 449 |
| abstract_inverted_index.processing. | 1847 |
| abstract_inverted_index.publication | 206 |
| abstract_inverted_index.recognition | 134 |
| abstract_inverted_index.significant | 1140 |
| abstract_inverted_index.sustainable | 836 |
| abstract_inverted_index.technology, | 664, 783 |
| abstract_inverted_index.theoretical | 1118 |
| abstract_inverted_index.‘Research | 534 |
| abstract_inverted_index.2016–2018. | 1529 |
| abstract_inverted_index.Applications | 1701 |
| abstract_inverted_index.Electronics, | 1283 |
| abstract_inverted_index.Engineering, | 1419, 1973, 1991, 2006 |
| abstract_inverted_index.Furthermore, | 834 |
| abstract_inverted_index.Mathematical | 1416 |
| abstract_inverted_index.Mechatronics | 1345, 1396, 1448, 1682, 1861 |
| abstract_inverted_index.Multi-sensor | 1153 |
| abstract_inverted_index.Region-based | 412 |
| abstract_inverted_index.Technologies | 1605 |
| abstract_inverted_index.agricultural | 1841, 1888 |
| abstract_inverted_index.application. | 1152, 1208 |
| abstract_inverted_index.capabilities | 89 |
| abstract_inverted_index.craniectomy. | 963 |
| abstract_inverted_index.development, | 1195 |
| abstract_inverted_index.environment. | 93, 154 |
| abstract_inverted_index.evolutionary | 1654 |
| abstract_inverted_index.hydrothermal | 304, 334, 422, 503 |
| abstract_inverted_index.image-guided | 2021 |
| abstract_inverted_index.information. | 945 |
| abstract_inverted_index.intelligence | 69, 84, 123 |
| abstract_inverted_index.interaction, | 848, 1478, 1657 |
| abstract_inverted_index.intracranial | 951 |
| abstract_inverted_index.invisibility | 854 |
| abstract_inverted_index.manipulation | 1810, 1937 |
| abstract_inverted_index.multi-sensor | 160, 219, 267, 356, 486, 822 |
| abstract_inverted_index.multi-system | 529 |
| abstract_inverted_index.neurosurgery | 901, 1125 |
| abstract_inverted_index.participated | 1261 |
| abstract_inverted_index.preoperative | 1084 |
| abstract_inverted_index.preparation. | 1085 |
| abstract_inverted_index.sensitivity. | 833 |
| abstract_inverted_index.Communication | 1385 |
| abstract_inverted_index.Computational | 1751 |
| abstract_inverted_index.Consequently, | 155 |
| abstract_inverted_index.Fayetteville, | 2010 |
| abstract_inverted_index.Human–Robot | 11 |
| abstract_inverted_index.International | 1339, 1358, 1368, 1378, 1390, 1442, 1676, 1855, 1877 |
| abstract_inverted_index.Neuroscience, | 1322 |
| abstract_inverted_index.Task-Informed | 1934 |
| abstract_inverted_index.automatically | 461 |
| abstract_inverted_index.computational | 383 |
| abstract_inverted_index.convolutional | 315 |
| abstract_inverted_index.environmental | 341 |
| abstract_inverted_index.human–robot | 31, 1477, 1656 |
| abstract_inverted_index.neurosurgical | 922 |
| abstract_inverted_index.organisms’, | 277 |
| abstract_inverted_index.respectively, | 445 |
| abstract_inverted_index.respectively. | 1516 |
| abstract_inverted_index.teleoperation | 1484, 1658 |
| abstract_inverted_index.Bioengineering | 1251, 1286, 1596 |
| abstract_inverted_index.Interaction’ | 12 |
| abstract_inverted_index.Neurorobotics. | 1330 |
| abstract_inverted_index.class-specific | 405 |
| abstract_inverted_index.first-authored | 1688 |
| abstract_inverted_index.identification | 274, 374, 842 |
| abstract_inverted_index.intelligently. | 654 |
| abstract_inverted_index.investigations | 82 |
| abstract_inverted_index.rehabilitation | 54 |
| abstract_inverted_index.representative | 201 |
| abstract_inverted_index.teleoperation, | 1552 |
| abstract_inverted_index.‘Integrating | 5 |
| abstract_inverted_index.CNRS-University | 1494 |
| abstract_inverted_index.classification, | 782 |
| abstract_inverted_index.formation—the | 713 |
| abstract_inverted_index.fusion-oriented | 357, 487 |
| abstract_inverted_index.instrumentation | 1473 |
| abstract_inverted_index.‘Internet+’ | 1878 |
| abstract_inverted_index.human–computer | 805, 847, 1167 |
| abstract_inverted_index.techniques-based | 264 |
| abstract_inverted_index.meta-architectures | 365 |
| abstract_inverted_index.R-CNN_InceptionNet. | 453 |
| abstract_inverted_index.(R-CNN)_InceptionNet | 414 |
| abstract_inverted_index.conductivity–temperature–depth | 348 |
| abstract_inverted_index.https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/iet-gtd.2020.1493 | 1220 |
| abstract_inverted_index.https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/iet-pel.2020.0051 | 1221 |
| abstract_inverted_index.https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/iet-rsn.2020.0089 | 1222 |