Smart Interactions

This topic deals with the design of advanced user interfaces for 2D/3D medical data exploration that rely on well-known, off-the-shelf input devices. The goal is to provide techniques for practical, seamless interaction in both desktop and semi-immersive virtual environments.

A Study on the Degrees of Freedom in Touchless Interaction

During the last few years, we have been witnessing a widespread adoption of touchless technologies in the context of surgical procedures. Touchless interfaces are advantageous in that they preserve sterility around the patient, allowing surgeons to visualize medical images without having to physically touch any control or to rely on a proxy. Such interfaces have been tailored to interact with 2D medical images but not with 3D reconstructions of anatomical data, since that kind of interaction requires at least three degrees of freedom.


A Kinect NUI for 3D Medical Visualization

Luigi Gallo, Alessio P. Placitelli, Giuseppe De Pietro

Second Place Award at ChaLearn Gesture Demonstration Competition

at ICPR 2012 - 21st International Conference on Pattern Recognition
November 11, 2012, Tsukuba, Japan

This proposal introduces a natural user interface (NUI) that allows users to rotate (with 3 degrees of freedom), point at and crop (with 6 degrees of freedom) 3D reconstructions of anatomical parts, using a Kinect as the only input device.
The NUI is built upon a view-independent hand pose recognition module, which recognizes a limited set of pre-defined postures from single, low-resolution depth images.
A set of velocity-based filters is used to enhance pointing precision, to remove hand tremor and to increase the accuracy of the hand pose recognition module.
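To illustrate the idea behind a velocity-based filter, the sketch below shows adaptive exponential smoothing for a 2D cursor: slow movements (typically tremor) are damped heavily, while fast, intentional movements pass through with little lag. The class name, thresholds and bounds are illustrative assumptions, not values from the published work.

```python
import math

class VelocityBasedFilter:
    """Adaptive exponential smoothing for 2D pointing (illustrative).

    The smoothing factor alpha grows with hand speed, so low-speed
    jitter is suppressed while fast gestures remain responsive.
    """

    def __init__(self, v_low=5.0, v_high=200.0, a_min=0.1, a_max=0.9):
        self.v_low, self.v_high = v_low, v_high    # speed thresholds (px/s)
        self.a_min, self.a_max = a_min, a_max      # smoothing-factor bounds
        self.prev = None                           # last filtered position

    def update(self, x, y, dt):
        """Feed a raw sample (x, y) taken dt seconds after the previous one."""
        if self.prev is None:
            self.prev = (x, y)
            return x, y
        px, py = self.prev
        speed = math.hypot(x - px, y - py) / dt
        # Interpolate alpha between a_min (heavy smoothing) and a_max (light).
        t = min(max((speed - self.v_low) / (self.v_high - self.v_low), 0.0), 1.0)
        alpha = self.a_min + t * (self.a_max - self.a_min)
        fx, fy = px + alpha * (x - px), py + alpha * (y - py)
        self.prev = (fx, fy)
        return fx, fy
```

A 1-pixel jump between frames at 30 fps is treated as tremor and mostly absorbed, while a 100-pixel jump is tracked almost immediately.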

The challenge is organized by ChaLearn and is sponsored in part by Microsoft (Kinect for Xbox 360). Other sponsors include Texas Instruments. This effort was initiated by the DARPA Deep Learning program and is supported by the US National Science Foundation (NSF) under grants ECCS 1128436 and ECCS 1128296, the EU Pascal2 network of excellence and the Challenges in Machine Learning foundation.

ChaLearn Demonstration Competition website


Smoothed Pointing

Smoothed Pointing is a velocity-based precision-enhancing technique for remote pointing via absolute input devices. Unlike other similar pointing-enhancement techniques, Smoothed Pointing relies on a model based on visual acuity theory to configure some parameters of the filter automatically, so as to better fit the characteristics of the user and the device. The user is asked to execute a 10-second dwelling task, namely to try to keep the cursor steady on a target visualized on the screen. The logged data are then used to configure the filter so as to decrease the jitter due to noise in the device signal or to hand tremor.
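A minimal sketch of how such a dwell-task calibration could work: the cursor positions logged during the 10-second dwelling are reduced to a per-axis jitter amplitude, from which a dead-zone radius for the filter can be derived. The function names and the 3-sigma rule are assumptions for illustration; the actual Smoothed Pointing model also incorporates visual acuity and viewing distance.

```python
import statistics

def calibrate_jitter(samples):
    """Estimate per-axis jitter from a dwelling task.

    `samples` is a list of (x, y) cursor positions logged while the
    user tries to keep the cursor still. The population standard
    deviation of each axis gives a jitter amplitude.
    """
    xs, ys = zip(*samples)
    return statistics.pstdev(xs), statistics.pstdev(ys)

def dead_zone_radius(samples, k=3.0):
    """Derive a filter dead zone covering ~all tremor (k-sigma rule)."""
    sx, sy = calibrate_jitter(samples)
    return k * max(sx, sy)
```

Cursor displacements smaller than this radius can then be smoothed aggressively, while larger ones are treated as intentional motion.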


Kinect-based medical image exploration


An open-source system for a controller-free, highly interactive exploration of medical images. By using a Microsoft Xbox Kinect™ as the only input device, the system’s user interface allows users to interact at a distance through hand and arm gestures. The system features interaction techniques specifically designed for the deviceless exploration of medical imaging data. Since the user interface is touch-free and does not require complex calibration steps, it is suitable for use in operating rooms, where non-sterilizable devices cannot be used.

In the system's controller-free interface, all the interaction commands are mapped to gestures that can be executed at a distance from the display, without touching it. Moreover, filters have been implemented to reduce the noise in the device signal, to increase the accuracy of the remote pointing and to remove hand tremor during all the interaction tasks.


A glove-based interface for 3D medical image visualization

In this work, we describe a 3D user interface that uses a Wiimote-enhanced wireless data glove as the input device and provides interaction techniques specifically developed for exploring medical data in semi-immersive virtual environments.

In greater detail, the interface allows users to rotate and move 3D reconstructions of anatomical parts, to dolly the camera and to control the position of a 3D cursor over the object surfaces. Different sources of data are considered: positional data provided by the Wiimote, which tracks the infrared (IR) light-emitting diodes (LEDs) placed on the glove; orientation data provided by the accelerometer integrated into the glove; and finger joint movement data provided by the finger bend sensors of the glove. Besides the inexpensiveness of the whole system, a main advantage of the proposed interface is its portability: to manipulate 3D data, all that is required is to wear the glove and place the Wiimote in front of it.
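A sketch of how the three data sources could be combined into a single hand state is given below. The field names, value ranges and the closed-hand threshold are hypothetical, not the actual device protocol; orientation is estimated from gravity, which is only valid when the hand is roughly still.

```python
import math
from dataclasses import dataclass

@dataclass
class GloveState:
    """Illustrative fusion of the glove's three data sources."""
    ir_x: float     # hand position from the Wiimote IR camera (pixels)
    ir_y: float
    accel: tuple    # (ax, ay, az) in g, from the glove accelerometer
    bend: tuple     # normalized finger-bend values in [0, 1]

    def roll_pitch(self):
        """Estimate hand orientation from the gravity vector."""
        ax, ay, az = self.accel
        roll = math.atan2(ay, az)
        pitch = math.atan2(-ax, math.hypot(ay, az))
        return roll, pitch

    def is_fist(self, threshold=0.7):
        """A simple closed-hand test on the bend sensors."""
        return all(b > threshold for b in self.bend)
```

Positional data would drive the 3D cursor, roll/pitch the object rotation, and the bend-sensor test could trigger a grab-and-move gesture.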


Rapid Prototyping of Touchless User Interfaces

Recent advances in depth-sensing technologies are fostering the design of Natural User Interfaces (NUIs) for use in several application domains. However, due to the complexity of existing software components and to compatibility issues, the design process remains challenging.

We propose a framework aimed at facilitating the development of natural, touchless user interfaces. The proposed framework, which is based on the publish-subscribe paradigm, allows product and interaction designers to rapidly prototype and test their systems by building upon a set of standard modules. The framework also provides the building blocks to extend the basic set of modules, easing code reuse.
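The core of a publish-subscribe design can be sketched in a few lines: modules publish typed events without knowing who consumes them, and new modules plug in by subscribing to the topics they need. This is an illustrative minimal hub, not the project's actual API; the topic names are made up.

```python
from collections import defaultdict

class EventBus:
    """A minimal publish-subscribe hub (illustrative sketch)."""

    def __init__(self):
        self._subs = defaultdict(list)   # topic -> list of handlers

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subs[topic]:
            handler(payload)

# Wiring a hypothetical pose-recognition module to a rendering module:
bus = EventBus()
events = []
bus.subscribe("hand.pose", lambda p: events.append(p))
bus.publish("hand.pose", {"pose": "point", "x": 0.4, "y": 0.6})
```

Because publishers and subscribers are decoupled, a designer can swap the input module (e.g. replace a Kinect tracker with a mouse simulator) without touching the rest of the prototype.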


Wiimote-based 3D user interfaces

The goal is to design a semi-immersive medical imaging environment which uses a Wiimote, the primary controller for the Nintendo Wii™ console, as the only input device. The Wiimote is an input device that, by virtue of its features, stands out among 3D user interface devices. The presence of both an infrared camera and a three-axis accelerometer makes it possible to use it in many ways, following different interaction metaphors.