Segment Tracking

Motion capture is not a new technology; much has been made of the pioneering work of Eadweard Muybridge and Étienne-Jules Marey, for example. The relative position and orientation of body segments is of great interest to biomechanists, and the current ‘gold standards’ are marker- or sensor-based systems. However, the requirement to attach markers or sensors to the body can detract from the ecological validity of data obtained using these methods. ‘Markerless’ motion capture systems are therefore of great interest to the research community; for example, a method has been developed which can calculate joint and segment positions from the visual hull of a person [1], although this still requires complex calibration and a multi-camera system.

The ability of depth cameras to ‘see’ in three dimensions is a major advantage for markerless motion capture: segment position and orientation can be estimated with only a single depth camera and without any specific calibration.

A major motivation for segment tracking with low-cost depth cameras (such as the Kinect) is gesture control.

We recognised the potential of segment tracking after seeing early videos of PrimeSense’s NITE software.

Our early investigations suggested that accuracy was limited, and we were keen to quantify it fully. We conducted a study using NITE and a more sophisticated software package called iPi Soft. We know that the Kinect will never compete with high-end motion capture systems in terms of accuracy, but it doesn’t need to: its size, cost and lack of markers still allow for some incredible applications which we’re eager to explore. We hope the information in these pages will allow others interested in this area to conduct some of their own research.

We started looking at the Kinect shortly after its release and posted a blog article describing our investigations into Kinect segment tracking on EngineeringSport. Please use the drop-down menu to see related articles on segment tracking (for example, a brief description of how to obtain BVH files with the Kinect).

We have released software and data related to segment tracking; please look in the programs and data sections:

Software:

A MATLAB-based Biovision Hierarchy (BVH) file viewer for observing motions recorded with the Kinect.

A Windows program which can extract skeleton data as comma-separated value (CSV) files, in addition to RGB and depth images (a sketch of reading this output follows below).
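For readers who want to work with the CSV export directly, the following is a minimal Python sketch rather than the released program itself; it assumes (hypothetically) that each row holds a frame timestamp followed by x, y, z triples, one triple per tracked joint, so adjust the indexing to match the actual column layout of the files.

```python
import csv

def read_skeleton_csv(path, has_header=True):
    """Read per-frame joint positions from a skeleton CSV export.

    Assumes (hypothetically) that each row is one frame: a timestamp
    followed by x, y, z triples, one triple per tracked joint.
    """
    frames = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        if has_header:
            next(reader)                 # skip the column-name row
        for row in reader:
            timestamp = float(row[0])
            values = [float(v) for v in row[1:]]
            # Group the remaining values into (x, y, z) tuples, one per joint.
            joints = [tuple(values[i:i + 3]) for i in range(0, len(values), 3)]
            frames.append((timestamp, joints))
    return frames

# Hypothetical usage:
# frames = read_skeleton_csv("kinect_skeleton.csv")
# print(len(frames), "frames,", len(frames[0][1]), "joints in the first frame")
```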

Data:

Sample motion capture data files obtained during our Kinect accuracy study, in our own CSV format and also as BVH files (see the sketch after this list for one way to read the BVH motion data).
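The BVH format itself is a plain-text standard (a HIERARCHY block describing the joint tree, followed by a MOTION block of per-frame channel values), so the sample files can be inspected without the MATLAB viewer. Below is a minimal Python sketch of reading the MOTION block of a standard BVH file; the file name in the usage comment is hypothetical.

```python
def read_bvh_motion(path):
    """Read the MOTION block of a standard Biovision Hierarchy (BVH) file.

    The HIERARCHY block (joint tree, offsets, channel lists) is skipped;
    only the frame count, frame time and raw channel values are returned.
    """
    with open(path) as f:
        lines = [line.strip() for line in f if line.strip()]

    start = lines.index("MOTION")
    n_frames = int(lines[start + 1].split()[-1])      # "Frames: N"
    frame_time = float(lines[start + 2].split()[-1])  # "Frame Time: t"

    frames = [[float(v) for v in line.split()]
              for line in lines[start + 3:start + 3 + n_frames]]
    return n_frames, frame_time, frames

# Hypothetical usage with one of the sample files:
# n, dt, frames = read_bvh_motion("sample_capture.bvh")
# print(n, "frames at", round(1.0 / dt), "Hz;", len(frames[0]), "channels per frame")
```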

References

[1] Corazza, S., Mündermann, L., Chaudhari, A. M., Demattio, T., Cobelli, C., & Andriacchi, T. P. (2006). A markerless motion capture system to study musculoskeletal biomechanics: visual hull and simulated annealing approach. Annals of Biomedical Engineering, 34(6), 1019–1029.