We present a novel, non-intrusive approach for estimating contact forces during hand-object interactions, relying solely on visual input from a single RGB-D camera. We consider a manipulated object with known geometrical and physical properties. First, we use model-based visual tracking to estimate the object’s pose together with that of the hand manipulating it throughout the motion. We then compute the object’s first- and second-order kinematics using a new class of numerical differentiation operators. The estimated kinematics are fed directly into a second-order cone program that returns a minimal force distribution explaining the observed motion. However, humans typically apply more force than is mechanically required when manipulating objects. We therefore complete our estimation method by learning these excessive forces and their distribution among the fingers in contact. We provide a full validity analysis of the proposed method by evaluating it against ground-truth data from additional sensors such as accelerometers, gyroscopes and pressure sensors. Experimental results show that force sensing from vision (FSV) is indeed feasible.
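
The minimal-force step can be understood as a small convex program: given the object’s mass, inertia, tracked contact points and the kinematics estimated from vision, the Newton-Euler equations together with Coulomb friction cones define a second-order cone program whose solution is the smallest force distribution explaining the observed motion. The sketch below, written with the generic cvxpy modelling library, illustrates this idea only; the function name, the sum-of-norms cost and all variable names are illustrative assumptions and not the paper’s exact formulation.

# Minimal sketch of a minimal-force SOCP, assuming cvxpy is available.
# Mass, inertia, contact points, normals, friction coefficient and the
# vision-based kinematics are all assumed known; names and cost are
# illustrative, not the paper's exact formulation.
import numpy as np
import cvxpy as cp

def skew(v):
    """Skew-symmetric matrix such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def minimal_contact_forces(m, I, a, omega, alpha, contacts, normals, mu,
                           g=np.array([0.0, 0.0, -9.81])):
    """Smallest contact forces consistent with the observed object motion.

    m        -- object mass [kg]
    I        -- 3x3 inertia matrix in the world frame [kg m^2]
    a        -- linear acceleration of the centre of mass (from vision) [m/s^2]
    omega    -- angular velocity (from vision) [rad/s]
    alpha    -- angular acceleration (from vision) [rad/s^2]
    contacts -- contact positions relative to the centre of mass [m]
    normals  -- unit contact normals pointing into the object
    mu       -- Coulomb friction coefficient
    """
    forces = [cp.Variable(3) for _ in contacts]

    constraints = [
        # Newton: contact forces plus gravity drive the translation.
        sum(forces) + m * g == m * a,
        # Euler: contact torques about the centre of mass drive the rotation.
        sum(skew(p) @ f for p, f in zip(contacts, forces))
            == I @ alpha + np.cross(omega, I @ omega),
    ]
    # Coulomb friction cones: tangential force bounded by mu times normal force.
    for n, f in zip(normals, forces):
        tangent_proj = np.eye(3) - np.outer(n, n)
        constraints.append(cp.norm(tangent_proj @ f) <= mu * (n @ f))

    # Minimise the total force magnitude; sums of norms keep the program a SOCP.
    problem = cp.Problem(cp.Minimize(sum(cp.norm(f) for f in forces)),
                         constraints)
    problem.solve()
    return [f.value for f in forces]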

Video

YouTube

Reference

[pdf], [video]

@inproceedings{cvpr:pham:2015,
    Author       = {Pham, Tu-Hoa and Kheddar, Abderrahmane and Qammaz, Ammar and Argyros, Antonis A.},
    Title        = {Towards Force Sensing from Vision: Observing Hand-Object Interactions to Infer Manipulation Forces},
    Booktitle    = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition ({CVPR})},
    Year         = {2015},
    Organization = {IEEE}
}