An Algorithm for Wrist Position and Attitude Recognition Based on Visual and Inertial Sensors in Human-robot Collaboration Scenarios

Zhenhua Zhao, Jiangbo Zhao*, Junzheng Wang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

With the development of technology, industrial robots are increasingly used across industries. In some scenarios, robots and humans must work in the same area, which has given rise to the concepts of collaborative robots and human-robot collaboration. Among human-robot collaboration technologies, sensing the position of the human body is the most fundamental and important part. Common solutions can broadly be divided into wearable devices and vision-based approaches. The typical wearable device is an inertial measurement unit (IMU), which is inexpensive and unaffected by occlusion but accumulates a large error over long-term use. Vision-based human pose estimation processes image information and applies corresponding algorithms to identify the positions of each joint of the human body, thereby outputting the human pose; however, when the device's computing performance is insufficient, the recognition latency becomes high. To address the problems of accumulated error and recognition delay, this paper designs a fusion localization algorithm based on visual and inertial measurement units. The algorithm uses the attitude and angular-velocity data of the inertial sensor together with the pose data of the human's right arm, and computes the orientation of the right wrist with a fusion algorithm based on the Kalman filter. It then fuses the acceleration data from the inertial sensor with the wrist position identified by the visual solution using an error-state Kalman filter, overcoming the high latency of the visual solution and the accumulated error of the inertial sensor. Experimental results show that the algorithm significantly reduces the recognition delay and almost eliminates the accumulated error.
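The abstract does not give implementation details, so the following Python sketch only illustrates one plausible form of the error-state Kalman filter fusion it describes: high-rate IMU acceleration propagates a nominal wrist position and velocity, and low-rate visual wrist positions correct the accumulated drift. The class and method names, the world-frame acceleration input, and the noise parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class WristPositionESKF:
    """Hypothetical error-state Kalman filter for wrist position,
    fusing IMU acceleration (prediction) with visual position (correction)."""

    def __init__(self, accel_noise=0.5, vision_noise=0.02):
        self.p = np.zeros(3)                     # nominal wrist position (m)
        self.v = np.zeros(3)                     # nominal wrist velocity (m/s)
        self.P = np.eye(6) * 1e-3                # error-state covariance, state [dp, dv]
        self.q_a = accel_noise ** 2              # assumed accelerometer noise level
        self.R = np.eye(3) * vision_noise ** 2   # assumed visual position noise

    def predict(self, a_world, dt):
        """Propagate the nominal state with world-frame acceleration from the IMU."""
        self.p += self.v * dt + 0.5 * a_world * dt ** 2
        self.v += a_world * dt
        # Error-state transition: dp' = dp + dv*dt, dv' = dv
        F = np.eye(6)
        F[0:3, 3:6] = np.eye(3) * dt
        # Process noise enters through the velocity error
        Q = np.zeros((6, 6))
        Q[3:6, 3:6] = np.eye(3) * self.q_a * dt
        self.P = F @ self.P @ F.T + Q

    def correct(self, p_vision):
        """Fuse a visual wrist position measurement to remove accumulated drift."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])  # vision observes position error only
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        dx = K @ (p_vision - self.p)                  # estimated error state
        self.p += dx[0:3]                             # inject error into nominal state
        self.v += dx[3:6]
        self.P = (np.eye(6) - K @ H) @ self.P         # reset error covariance

# Illustrative usage: fast IMU prediction steps, slower vision corrections.
eskf = WristPositionESKF()
eskf.predict(a_world=np.array([0.1, 0.0, 0.0]), dt=0.005)   # e.g. 200 Hz IMU step
eskf.correct(p_vision=np.array([0.31, 0.02, 0.45]))         # e.g. 30 Hz vision fix
```

In this sketch the IMU carries the state between vision frames, which is how the paper's approach can mask visual latency while the periodic corrections bound the inertial drift.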

Original language: English
Title of host publication: Proceedings of 2024 IEEE International Conference on Unmanned Systems, ICUS 2024
Editors: Rong Song
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1209-1214
Number of pages: 6
ISBN (Electronic): 9798350384185
DOIs
Publication status: Published - 2024
Event: 2024 IEEE International Conference on Unmanned Systems, ICUS 2024 - Nanjing, China
Duration: 18 Oct 2024 - 20 Oct 2024

Publication series

Name: Proceedings of 2024 IEEE International Conference on Unmanned Systems, ICUS 2024

Conference

Conference: 2024 IEEE International Conference on Unmanned Systems, ICUS 2024
Country/Territory: China
City: Nanjing
Period: 18/10/24 - 20/10/24

Keywords

  • error-state Kalman filter
  • human-robot collaboration technologies
  • inertial measurement unit
  • Kalman filter
