
A Touchless Interface for Interventional Radiology Procedures

K.R.Sivaramakrishnan

Gattamaneni Kumar Raja

Chegu Girish Kumar

Magnetic Resonance Imaging, Philips India Limited, Bangalore, India

Abstract— Interventional radiology is a medical sub-specialty of radiology that uses minimally invasive, image-guided procedures to diagnose and treat diseases in nearly every organ system. The entire process pivots on imaging and on the way the surgeon or doctor interacts with the images that guide him or her through the procedure. Interacting with and manipulating objects in an image remains a challenge in the sterile medical environment, where keyboards, mice and touch screens are potential sources of infection. This paper investigates a potential solution: a low-cost, touch-free motion-tracking (hand gesture) device for interacting with an image visualization environment. The solution comprises a device that tracks a person's hand gestures and motion, and custom software that translates the motion information obtained from the device into commands to review and manipulate images on a workstation. The result is a touchless sterile interface for effortless image interaction in interventional radiology procedures.

Index Terms— hand gestures, motion-tracking, Interventional Radiology.

I. INTRODUCTION

Computer information technology is increasingly penetrating the hospital domain. One area where information technology has become indispensable is Interventional Radiology. Interventional Radiology (IR) is one of the most rapidly growing areas of medicine, providing solutions to common problems affecting men and women of all ages. IR offers minimally invasive treatment for vascular and non-vascular disease, using small catheters and catheter-based instruments guided by radiological imaging techniques such as x-rays, fluoroscopy, ultrasound, MRI and CT. These non-surgical techniques are advancing medicine and improving outcomes for a range of patients with life-threatening conditions.

Interventional radiologists are physicians who specialize in minimally invasive, targeted treatments performed under imaging guidance. The interventional radiologist or surgeon performing a procedure needs to interact frequently with a growing number of computerized medical systems before and during surgery in order to review medical images and records. However, computers and their peripherals are difficult to sterilize, so during surgery an assistant or nurse usually operates the mouse and keyboard for such interactions.

This indirect interaction suffers from communication problems and misunderstandings. This is one of the main reasons why, in recent years, touchless interaction has been considered for use in operating theatres.

II. DIFFERENT TOUCHLESS INTERFACES – COMPARATIVE STUDY

In general, touchless interaction has been implemented using vocal commands and body gestures.

a) Voice-Based Interface: A limitation of voice-based methods is that they usually cannot distinguish between different people speaking in the same room, in addition to being sensitive to environmental noise. Voice input is realized by recording the voice with a microphone and processing it with dedicated algorithms. To ensure good functionality, training of the system is usually required, and special voice commands have to be remembered.

As mentioned before, the main problem with this approach is that the system will react to everyone speaking in the room, especially if they have similar voices. This problem has been tackled by installing a microphone array instead of a single microphone. The advantage of interacting through voice is that it is independent of occlusions and the user does not need to wear any additional devices.

b) Body-worn Sensor Interface: Body-worn motion sensors are influenced by user activity only. The problem with activity recognition using such sensors lies less in the extraction of relevant features than in the fact that the information is often ambiguous and incomplete. Body-worn sensors such as accelerometers react to a combination of earth gravity and changes in arm speed, while gyroscopes describe rotational motions of the arm. However, neither provides exact trajectory information.

2015 International Conference on Automation, Cognitive Science, Optics, Micro Electro-Mechanical System, and Information Technology (ICACOMIT), Bandung, Indonesia, October 29–30, 2015


With body-worn sensors it becomes a challenge to differentiate between gestures performed as intentional input to the interactive system and gestures that occur as part of other activities (and should be ignored by the system). Body-worn sensors can also be uncomfortable as an additional attached device in a surgical environment.

c) Vision-Based Interface: In a vision-based interface, as with a voice-based interface, the user does not need to wear any additional devices. However, a direct line of sight is needed for interaction. Gesture detection can be done, for instance, with regular webcams, a stereo camera, a time-of-flight camera or the Microsoft Kinect. The latter is becoming increasingly popular thanks to its low-cost implementation. However, users typically have to hold their hands in unnatural positions for the system to detect the gesture.

III. PROPOSED TOUCHLESS SOLUTION

The proposed touchless interface solution is a variation of the vision-based interface, built on the premise that an interventional radiologist or surgeon in the operating room should have an effortless way of interacting with medical images. The solution we propose uses small, simple hand gestures to interact with images, achieving an effortless workflow that would ideally increase patient safety during a real-time interventional procedure.

The solution uses the Leap Motion controller, an advanced motion-sensing device for human-computer interaction, for gesture recognition. The user interface for medical image visualization is modeled using the Visualization Toolkit (VTK), an open-source software system for 3D computer graphics, image processing and visualization.

I. Gesture detection with Leap motion device:

The Leap Motion controller works with infrared optics and cameras rather than depth sensing. It performs motion sensing at a fidelity unmatched by any depth camera currently available: it can track all ten of a user's fingers simultaneously to within a hundredth of a millimeter, with latency lower than the refresh rate of the monitor. Of course, that tracking ability is not just about the hardware; the capabilities of the Leap are only realized by the software built to work with it.

Fig 1.1: Hands being detected on Leap visualizer.

The Leap Motion controller tracks the hand like a general object, allowing simple actions such as pinching, crossing fingers, moving one hand over another, and hand-to-hand interactions like brushing and tapping fingers on one another. The precision and accuracy of Leap Motion tracking open up an entirely new level of depth in hand movement that developers can utilize for new inputs or increased functionality.

Desktop control relies on dividing 3D space into two separate zones: one closer to your body, which is for “hovering,” and one closer to the display, which is for “touching.”
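The two-zone split described above can be sketched as a simple threshold test. The boundary distance and the coordinate convention below are assumptions for illustration only, not values taken from the Leap Motion SDK.

```python
# Hypothetical sketch: classify a fingertip's distance from the display
# into the two interaction zones ("hovering" vs. "touching"). The 120 mm
# boundary and the convention that larger z means closer to the body are
# assumptions, not actual Leap Motion SDK values.

HOVER_TOUCH_BOUNDARY_MM = 120.0  # assumed position of the zone boundary

def interaction_zone(fingertip_z_mm):
    """Return 'touching' when the fingertip is closer to the display
    than the boundary, 'hovering' otherwise."""
    if fingertip_z_mm < HOVER_TOUCH_BOUNDARY_MM:
        return "touching"
    return "hovering"
```

A real implementation would read the fingertip position from the tracking frame each update and feed the resulting zone into the cursor logic.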

Fig 1.2: Leap motion controller interaction with desktop.

Fig 1.3: Leap motion controller device.


II. Visual simulator with Leap motion device:

The proposed system attempts to solve the challenge of interacting with images in a sterile surgical environment. The prototype is divided into two components: 1) the visual simulation system and 2) the Leap Motion device setup for image interaction.

A. Visual simulation system:

In the visual simulation system we tried to depict the real-world scenario of interacting with images in the operating room. The system has the following views and interaction mechanisms:

a) CT image volume: slice through images, pan, zoom.

b) Three-dimensional rendered image (skin and bone): segmentation, zoom and pan.

c) Contouring mode: marking and contouring.

To switch between these three views there is a dock pane on top of the view screen, which is also controlled by gestures.

B. Leap motion setup:

In a real-world scenario, the Leap Motion device can be affixed to the operating switch panel on the operating table. This setup lets the surgeon interact easily, without much effort.

DETAILED DESCRIPTION

i. A dock pane that allows the user to switch between the visualization modes: two-dimensional viewer mode (2D), three-dimensional viewer mode (3D) and contouring mode.

Fig. 1.4 VTK Viewer showing different visualization modes pane (on top).

Fig. 1.5: Viewer showing a CT image with 2D options.

End Result (Action): Gesture

Dock or undock the visualization mode pane from top: circle in space.
Switch between the visualization modes: swipe.

Table 1.1: Gestures for pane and viewer transition.
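As a minimal sketch, the pane and mode gestures of Table 1.1 could drive a small state machine like the one below. The gesture names and the mode cycle order are illustrative assumptions, not the prototype's actual identifiers.

```python
# Hypothetical dispatch sketch: a recognized gesture name toggles the
# dock pane or cycles the visualization mode. Names are illustrative.

MODES = ["2D", "3D", "contour"]

class ViewerState:
    def __init__(self):
        self.pane_visible = False  # dock pane starts hidden
        self.mode_index = 0        # start in the 2D viewer

    def handle_gesture(self, gesture):
        """Apply one gesture and return the current visualization mode."""
        if gesture == "circle":
            # circle in space docks or undocks the mode pane
            self.pane_visible = not self.pane_visible
        elif gesture == "swipe" and self.pane_visible:
            # swipe cycles to the next mode while the pane is shown
            self.mode_index = (self.mode_index + 1) % len(MODES)
        return MODES[self.mode_index]
```

In the prototype this dispatch would sit in the frame callback of the gesture recognizer, with VTK switching the active renderer when the mode changes.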

ii. The 2D viewer, which has options to slice through the CT image, pan and zoom.

End Result (Action): Gesture

Slice through the CT image slices: move two fingers up and down; join two fingers to freeze the image at a particular slice.
Panning the image: move the finger in the direction to pan.
Zooming the image: pinch (close two fingers together to zoom out; spread the two fingers apart to zoom in).

Table 1.2: Gestures for slicing through the CT image, panning and zooming an image. [2]
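The pinch gesture of Table 1.2 maps naturally to a zoom factor: the ratio of the current two-finger separation to the separation at the start of the pinch. This is a hedged sketch of that mapping; the function names are illustrative and not taken from the prototype.

```python
import math

# Hypothetical sketch of pinch-to-zoom: spreading the fingers yields a
# factor > 1 (zoom in), closing them yields a factor < 1 (zoom out).

def finger_distance(p1, p2):
    """Euclidean distance between two fingertip positions (x, y, z)."""
    return math.dist(p1, p2)

def zoom_factor(start_dist, current_dist):
    """Ratio of current to initial finger separation; 1.0 means no zoom."""
    if start_dist <= 0:
        return 1.0  # guard against a degenerate start pose
    return current_dist / start_dist
```

The factor would then scale the viewer's camera parallel scale (2D) or dolly distance (3D) each frame.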

The gestures mentioned above are standard gestures for interacting with a 2D image and need no special training, as they mimic the gestures of a touch-screen interface for image interaction. The interactions are tailored in such a way that if, after a particular gesture, the user wants the image to remain in a certain state for diagnosis, he or she simply has to unclench a clenched fist and fade out of the gesture recognition zone.
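The freeze behavior described above can be sketched as follows: while the open hand remains outside the recognition zone, gesture updates stop being applied and the image stays in its last state. The state fields and zone test are illustrative assumptions, not the prototype's actual code.

```python
# Hypothetical sketch of the "freeze" behavior: an open hand leaving the
# gesture recognition zone locks the current slice until the hand returns.

class SliceViewer:
    def __init__(self):
        self.slice_index = 0
        self.frozen = False

    def on_frame(self, hand_in_zone, fist_clenched, slice_delta):
        """Process one tracking frame and return the displayed slice."""
        if not hand_in_zone and not fist_clenched:
            # unclenched hand has faded out of the zone: hold the state
            self.frozen = True
            return self.slice_index
        if hand_in_zone:
            # hand re-entered the zone: resume applying slice gestures
            self.frozen = False
            self.slice_index += slice_delta
        return self.slice_index
```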

iii. The 3D viewer has options for wrapping the skin surface over the bone-rendered image, and for rotating and zooming the rendered image.

Fig. 1.6: 3D viewer with bone-rendered image.

End Result (Action): Gesture

Wrapping and unwrapping the skin surface over the bone-rendered image: move the finger in the direction to wrap or unwrap.
Rotate the 3D rendered image in any direction: sweep in the direction you wish to orbit.
Zooming the image: pinch (close two fingers together to zoom out; spread the two fingers apart to zoom in).

Table 1.3: Gestures for wrapping the skin over the bone-rendered image, rotating and zooming the rendered image.

The gestures for 3D viewer interactions are designed for different clinical requirements, such as visualizing the image with different threshold regions; in this case the skin and bone regions are visualized and interacted with.

iv. The contouring mode involves drawing a region of interest or restructuring the region already drawn.

Fig. 1.7: Contouring mode with a region of interest drawn on the image using gestures.

The contouring is also augmented with a gradient-based approach, in which the contour latches on to the nearest gradient region in the image along the user-driven path. This compensates for hand tremors during the contouring process.
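The gradient-latching step can be sketched as a local search: for each point on the user-drawn path, examine a small window around it and snap to the pixel with the strongest intensity gradient. The window radius and the plain central-difference gradient below are assumptions for illustration; the prototype's actual snapping criterion is not given in the text.

```python
# Hypothetical sketch of gradient latching: snap a user-drawn contour
# point to the strongest nearby edge, damping small hand tremors.
# The image is a plain 2D list of intensities; all names are illustrative.

def gradient_magnitude(img, y, x):
    """Central-difference gradient magnitude at interior pixel (y, x)."""
    gy = img[y + 1][x] - img[y - 1][x]
    gx = img[y][x + 1] - img[y][x - 1]
    return (gx * gx + gy * gy) ** 0.5

def snap_to_gradient(img, y, x, radius=2):
    """Return the interior pixel within `radius` of (y, x) that has the
    largest gradient magnitude (the contour latches onto it)."""
    h, w = len(img), len(img[0])
    best, best_mag = (y, x), -1.0
    for yy in range(max(1, y - radius), min(h - 1, y + radius + 1)):
        for xx in range(max(1, x - radius), min(w - 1, x + radius + 1)):
            mag = gradient_magnitude(img, yy, xx)
            if mag > best_mag:
                best, best_mag = (yy, xx), mag
    return best
```

On a slice with a sharp skin-air boundary, points drawn slightly off the boundary would be pulled onto the high-gradient edge column, smoothing out tremor along the path.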


Fig. 1.8: A snapshot of the user interacting with the system (slicing through the CT image with the gesture referred to in Table 1.2).

IV. DISCUSSION AND CONCLUSION

This study and prototype made it possible to explore and arrive at a more effortless approach to interacting with the viewer in a surgical environment during interventional radiology procedures. The accuracy of the Leap Motion controller in recognizing gestures paved the way for this exploration, supported by the effective rendering capabilities of the Visualization Toolkit. The prototype still needs to evolve before it can be fully integrated into a real hospital environment, but its scope extends to clinical-simulation-based applications. Clinical simulations such as enhancing medical education by digitally dissecting human cadavers, biopsy simulation on a 3D anatomy model, and simulating isotope seed placement guidance in brachytherapy planning are future possibilities for this device and interface, enabled by its detection accuracy.

ACKNOWLEDGEMENT

[1] Leap Motion (Gesture recognition system) website: https://www.leapmotion.com/

[2] Standard Gestures – Turbo Viewer (standard gesture images): https://turbocaddoc.atlassian.net/wiki/display/TView/Standard+Gestures

REFERENCES

[1] Shahram Jalaliniya, Jeremiah Smith, Miguel Sousa, Lars Buthe and Thomas Pederson “Touch-less interaction with Medical images using Hand & Foot Gestures” Workshop on Human factors and activity recognition in healthcare, wellness and assisted living, Ubicomp’13, September 8-12, 2013, Zurich, Switzerland.

[2] Holger Junker, Oliver Amft, Paul Lukowicz, Gerhard Tröster “Gesture spotting with body-worn inertial sensors to detect user activities” Pattern Recognition, The Journal of the Pattern Recognition Society, Pattern Recognition 41 (2008) 2010-2024. Elsevier Ltd.

[3] Juan Wachs, Helman Stern, Yael Edan, Michael Gillam, Craig Feied, Mark Smith, Jon Handler “Real-Time Hand Gesture Interface for Browsing Medical Images” IC-MED Vol.1, No. 3, Issue 1, Page 175 of 185.
