TY - GEN
T1 - An Algorithm for Wrist Position and Attitude Recognition Based on Visual and Inertial Sensors in Human-robot Collaboration Scenarios
AU - Zhao, Zhenhua
AU - Zhao, Jiangbo
AU - Wang, Junzheng
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - With the development of technology, industrial robots are increasingly widely used across industries. In some scenarios, robots and humans must work in the same area, which has given rise to the concepts of collaborative robots and human-robot collaboration. Among human-robot collaboration technologies, sensing the position of the human body is the most fundamental and important component. Commonly used solutions can generally be divided into wearable devices and vision-based approaches. The most common wearable device is the inertial measurement unit (IMU), which is inexpensive and unaffected by occlusion but accumulates large errors during long-term use. Vision-based human pose estimation processes image information and applies recognition algorithms to locate each joint of the human body, thereby outputting the body pose; however, when computing performance is insufficient, recognition latency becomes high. To address the problems of cumulative error and recognition delay, this paper designs a fusion localization algorithm based on visual and inertial measurement units. The algorithm fuses the attitude and angular-velocity data from the inertial sensor with the pose data of the human's right arm, computing the orientation of the right wrist with a Kalman-filter-based fusion algorithm. It then fuses the acceleration data from the inertial sensor with the wrist position identified by the visual solution using an error-state Kalman filter, overcoming the high latency of the visual solution and the accumulated error of the inertial sensor. Experimental results show that the algorithm significantly reduces recognition delay and nearly eliminates accumulated error.
KW - error-state Kalman filter
KW - human-robot collaboration technologies
KW - inertial measurement unit
KW - Kalman filter
UR - http://www.scopus.com/pages/publications/85218074567
U2 - 10.1109/ICUS61736.2024.10839929
DO - 10.1109/ICUS61736.2024.10839929
M3 - Conference contribution
AN - SCOPUS:85218074567
T3 - Proceedings of 2024 IEEE International Conference on Unmanned Systems, ICUS 2024
SP - 1209
EP - 1214
BT - Proceedings of 2024 IEEE International Conference on Unmanned Systems, ICUS 2024
A2 - Song, Rong
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 IEEE International Conference on Unmanned Systems, ICUS 2024
Y2 - 18 October 2024 through 20 October 2024
ER -