A Kinect NUI for 3D Medical Visualization

Luigi Gallo, Alessio P. Placitelli, Giuseppe De Pietro

Second Place Award at ChaLearn Gesture Demonstration Competition

at ICPR 2012 - 21st International Conference on Pattern Recognition
November 11, 2012, Tsukuba, Japan

This proposal introduces a natural user interface (NUI) that allows users to rotate (with 3 degrees of freedom), point to, and crop (with 6 degrees of freedom) 3D reconstructions of anatomical parts, using a Kinect as the only input device.
The NUI is built upon a view-independent hand pose recognition module, which recognizes a limited set of pre-defined postures from single, low-resolution depth images.
A set of velocity-based filters is used to enhance pointing precision, to remove hand tremor, and to increase the accuracy of the hand pose recognition module.
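The page does not detail the filters, but one common way to realize velocity-based smoothing is a low-pass filter whose cutoff rises with hand speed: slow motion gets heavy smoothing (suppressing tremor), fast motion gets light smoothing (keeping pointing latency low). A minimal sketch of that idea, with illustrative tuning constants:

```python
import math

def velocity_adaptive_filter(positions, dt, f_min=1.0, beta=0.5):
    """Smooth a stream of 1-D hand coordinates with a speed-dependent
    cutoff frequency. `positions` are raw samples, `dt` is the sampling
    interval in seconds; f_min and beta are illustrative constants,
    not values from the paper."""
    def alpha(cutoff):
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau / dt)

    smoothed = [positions[0]]
    prev_raw = positions[0]
    for x in positions[1:]:
        speed = abs(x - prev_raw) / dt       # finite-difference velocity
        cutoff = f_min + beta * speed        # faster hand -> higher cutoff
        a = alpha(cutoff)
        smoothed.append(a * x + (1.0 - a) * smoothed[-1])
        prev_raw = x
    return smoothed
```

Applied per coordinate of the tracked hand point, this trades smoothness against responsiveness automatically.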

The challenge is organized by ChaLearn and is sponsored in part by Microsoft (Kinect for Xbox 360). Other sponsors include Texas Instruments. This effort was initiated by the DARPA Deep Learning program and is supported by the US National Science Foundation (NSF) under grants ECCS 1128436 and ECCS 1128296, the EU Pascal2 network of excellence and the Challenges in Machine Learning foundation.

ChaLearn Demonstration Competition website

Videos:

Download Video: MP4

The NUI in action


The Kinect NUI running on a Windows 7 x64 notebook equipped with an Intel i7 processor and HD Graphics 3000

Download Video: MP4

View-independent Hand Shape Recognition 


We use principal component analysis to estimate the hand orientation in space, Flusser moment invariants as image features for visual recognition, and a Support Vector Machine to classify the extracted features into a limited set of static hand postures.
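The PCA step can be illustrated on a binary hand mask: the first principal axis of the segmented hand pixels approximates the hand's main direction. This is only a 2-D sketch of the idea, assuming a segmented mask as input; the actual system works on depth images and combines this with Flusser moments and an SVM.

```python
import numpy as np

def hand_orientation(hand_mask):
    """Estimate the in-plane hand orientation (degrees) from a binary
    hand mask via PCA: center the hand pixels, take the eigenvector of
    the covariance matrix with the largest eigenvalue as the main axis.
    Illustrative sketch, not the paper's exact implementation."""
    ys, xs = np.nonzero(hand_mask)
    pts = np.column_stack((xs, ys)).astype(float)
    pts -= pts.mean(axis=0)                  # center the pixel cloud
    cov = np.cov(pts, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues ascending
    major = eigvecs[:, -1]                   # first principal axis
    return np.degrees(np.arctan2(major[1], major[0]))
```

The same decomposition in 3-D (on the depth point cloud) yields a view-independent orientation estimate that the recognizer can normalize against.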

Download Video: MP4

Pointing


The index finger controls a depth-enhanced mouse pointer. To derive the pointer's depth within the volume rendering, the transfer function is analyzed to check the opacity of the voxels collected at the positions defined by the virtual beam.
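The depth pick described above can be sketched as a walk along the virtual beam that stops at the first voxel the transfer function maps to a non-negligible opacity. A minimal sketch, with assumed function and parameter names (the page does not give them):

```python
def pick_depth_along_beam(volume, transfer_opacity, origin, direction,
                          step=1.0, threshold=0.1, max_steps=512):
    """Walk a beam through a 3-D voxel grid (nested lists) and return
    the first position whose voxel value maps to an opacity at or above
    `threshold` under the transfer function, or None if the beam exits
    the volume. Illustrative sketch of the technique, not the paper's
    actual code."""
    x, y, z = origin
    dx, dy, dz = direction
    for _ in range(max_steps):
        xi, yi, zi = int(round(x)), int(round(y)), int(round(z))
        if not (0 <= xi < len(volume) and 0 <= yi < len(volume[0])
                and 0 <= zi < len(volume[0][0])):
            return None                      # beam left the volume
        if transfer_opacity(volume[xi][yi][zi]) >= threshold:
            return (x, y, z)                 # first sufficiently opaque voxel
        x, y, z = x + step * dx, y + step * dy, z + step * dz
    return None
```

The returned position gives the pointer a meaningful depth on the visible anatomy rather than on empty, fully transparent voxels.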

Download Video: MP4

Rotation


Single-handed volume rotation with 3 degrees of freedom
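The page does not specify the rotation mapping; a common choice for single-handed 3-DOF rotation is an arcball, where the tracked hand position is projected onto a virtual sphere and the rotation between successive sphere points drives the volume. A sketch under that assumption:

```python
import math

def arcball_vector(x, y):
    """Project a normalized 2-D hand position (x, y in [-1, 1]) onto a
    virtual unit sphere. Arcball mapping is an assumed technique here,
    not confirmed by the page."""
    d = x * x + y * y
    if d <= 1.0:
        return (x, y, math.sqrt(1.0 - d))    # point lies on the sphere
    s = 1.0 / math.sqrt(d)
    return (x * s, y * s, 0.0)               # outside: clamp to the rim

def rotation_axis_angle(p, q):
    """Axis (cross product) and angle (arccos of the dot product) of
    the rotation taking sphere vector p onto q."""
    axis = (p[1] * q[2] - p[2] * q[1],
            p[2] * q[0] - p[0] * q[2],
            p[0] * q[1] - p[1] * q[0])
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(p, q))))
    return axis, math.acos(dot)
```

Feeding consecutive filtered hand positions through this mapping yields a continuous 3-DOF rotation from a single hand.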

Download Video: MP4

Cropping


A 6 degrees of freedom task. The hand position controls the position of the cropping box, whereas the palm orientation controls its orientation.
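The split described above maps naturally onto a rigid-body pose: three translational degrees of freedom from the hand position, three rotational ones from the palm orientation. A minimal sketch, assuming the palm orientation is available as a 3x3 rotation matrix (the representation is an assumption):

```python
import numpy as np

def update_crop_box(hand_pos, palm_rotation):
    """Build the 4x4 homogeneous pose of the cropping box: the tracked
    hand position sets the translation and the palm orientation sets
    the rotation, giving 3 + 3 = 6 degrees of freedom. Sketch only;
    names and representations are illustrative."""
    pose = np.eye(4)
    pose[:3, :3] = palm_rotation     # 3 rotational DOF from the palm
    pose[:3, 3] = hand_pos           # 3 translational DOF from the hand
    return pose
```

Each frame, the renderer would apply this pose to the cropping box before clipping the volume against it.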