The ability to track human operators' hand usage in production plants and factories is critically important for developing realistic digital factory simulators as well as for manufacturing process control. We propose an instrumented glove with only a few strain gauge sensors and a microcontroller that continuously tracks and records the hand configuration during actual use. At the heart of our approach is a trainable system that predicts the fourteen joint angles of the hand from only a small set of strain sensors. First, ten strain gauges are placed at the various joints of the hand, and the sensor layout is optimized using the letters of the American Sign Language alphabet as a benchmark for assessment. Next, the best sensor configurations for three through ten strain gauges are computed using a support vector machine classifier. Following the layout optimization, our approach learns a mapping from the sensor readouts to the actual joint angles optically captured with a Leap Motion system. Three regression methods, namely linear, quadratic, and neural network regression, are used to train the mapping between the strain gauge data and the corresponding joint angles. The final proposed model maps four strain gauges to the fourteen joint angles using a two-layer feed-forward neural network.
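The final regression stage described above, a two-layer feed-forward network mapping a handful of strain readings to fourteen joint angles, can be sketched in plain NumPy. This is a minimal illustration on synthetic data, not the paper's trained model: the hidden-layer width, learning rate, iteration count, and the fabricated tanh ground-truth relationship are all assumptions standing in for the real strain gauge / Leap Motion training set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 4 strain readings -> 14 joint angles.
# The paper trains against Leap Motion ground truth; here we invent
# a smooth nonlinear relationship just to exercise the network.
n_samples, n_in, n_out = 200, 4, 14
X = rng.uniform(-1.0, 1.0, (n_samples, n_in))
W_true = rng.normal(size=(n_in, n_out))
Y = np.tanh(X @ W_true)

# Two-layer feed-forward network: input -> tanh hidden -> linear output.
n_hidden = 16
W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))
b2 = np.zeros(n_out)

lr = 0.2
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)        # hidden activations
    P = H @ W2 + b2                 # predicted joint angles
    err = P - Y                     # residuals
    # Full-batch gradient descent on mean-squared error.
    gW2 = H.T @ err / n_samples
    gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1.0 - H ** 2)   # backprop through tanh
    gW1 = X.T @ dH / n_samples
    gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2))
print(f"final training MSE: {mse:.4f}")
```

The same structure generalizes to any small sensor count: only `n_in` changes, which is why the layout-optimization stage can sweep configurations from three to ten gauges before the regression is fit.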
Wentai Zhang, Jonelle Z. Yu, Fangcheng Zhu, Yifang Zhu, Nurcan Gecer Ulu, Batuhan Arisoy, Levent Burak Kara (2018). High Degree of Freedom Hand Pose Tracking Using Limited Strain Sensing and Optical Training. ASME International Design Engineering Technical Conferences and Computers and Information in Engineering Conference (IDETC/CIE), Quebec City, Canada.