INFORMATIK 2011: 41st Annual Conference of the Gesellschaft für Informatik e.V. (GI), Technische Universität Berlin

An Approach for Projector-based Surgeon-Computer Interaction Using Tracked Instruments

Bojan Kocev, Darko Ojdanic, Heinz-Otto Peitgen

Abstract: Purpose: Providing an intuitive, easy-to-operate interaction mechanism is one of the crucial prerequisites for the acceptance of medical augmented reality applications in the operating room. Currently, liver navigation information is displayed on a monitor, requiring the operating surgeon to shift focus between the monitor and the surgical site during navigation. Projector-based augmented reality has the potential to solve this problem. The aim of this work was to use projection not only for visualization, but also for interaction.

Methods: As a development platform, we used a soft-tissue navigation system that contains an optical (infrared) tracking system. A consumer-grade projector was added to visualize preoperatively defined surgical planning data. Moreover, the projection was used as a virtual touch screen on which the surgeon could not only view information, but also interact with it. We registered the projection in the camera coordinate system using either three or four points: the former case assumed a rectangular screen projection, while the latter allowed an arbitrary quadrilateral shape. By tracking the surgical tool, the system could thus determine whether the surgeon was attempting to interact with the projected virtual information. To interact with the virtual information, the surgeon could use a predefined set of surgical tool gestures that emulated a computer mouse (left_click_up, left_click_down, mouse_drag). These were successfully recognized by the system and mapped to application-specific commands that modify the displayed virtual data.

Results: The system was tested in a lab environment at Fraunhofer MEVIS in Bremen and showed that the newly developed interaction mechanism works in real time and without false positives in the gesture recognition process.
Although this novel medical augmented reality application is at a preliminary stage of development, users gave positive feedback. The setup is based exclusively on commercially available equipment. In summary, the application provides the functionality of a touch screen, but at a location chosen by the surgeon.

Conclusion: The developed system simplifies intraoperative surgeon-computer interaction and reduces errors caused by loss of concentration when the surgeon shifts focus from the computer to the surgical site, or by the limited field of view of head-mounted displays. Nevertheless, the technique still requires evaluation in the operating environment. Furthermore, additional simplified interaction and gesture paradigms have to be investigated. Another meaningful extension would be to support projection and interaction on curved surfaces, e.g. registered to the target organ, which may offer additional flexibility.
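The abstract does not spell out the registration math. Under the assumption that the four registration points define a planar quadrilateral on the projection surface, the "virtual touch screen" test can be sketched as a homography that maps the quadrilateral onto the unit square, followed by a bounds check on the tracked tool-tip position. All function names below are illustrative, not from the paper:

```python
import numpy as np

def homography_from_quad(quad):
    """Estimate the 3x3 homography mapping the four quadrilateral corners
    (2D coordinates in the projection plane, ordered consistently with the
    unit-square corners below) onto the unit square, via the standard
    direct linear transform (DLT)."""
    dst = [(0, 0), (1, 0), (1, 1), (0, 1)]
    rows = []
    for (x, y), (u, v) in zip(quad, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of this 8x9 system.
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]

def to_screen(h, point):
    """Map a point on the projection plane to normalized screen coordinates."""
    x, y, w = h @ np.array([point[0], point[1], 1.0])
    return (x / w, y / w)

def on_projection(h, point, eps=1e-9):
    """True if the tracked tool tip lies inside the projected quadrilateral."""
    u, v = to_screen(h, point)
    return -eps <= u <= 1 + eps and -eps <= v <= 1 + eps
```

In the three-point (rectangular) case described in the abstract, the fourth corner of the rectangle can be inferred from the other three before applying the same mapping.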
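The gesture names (left_click_up, left_click_down, mouse_drag) are taken from the abstract; how they are dispatched to application-specific commands is not described, so the following small state machine is only a plausible sketch of such a mapping:

```python
class VirtualMouse:
    """Translate recognized surgical-tool gestures into mouse-like events.
    Gesture names follow the abstract; the dispatch logic is an assumption."""

    def __init__(self):
        self.button_down = False
        self.events = []  # recorded (event, position) pairs for the application

    def handle(self, gesture, pos):
        if gesture == "left_click_down":
            self.button_down = True
            self.events.append(("press", pos))
        elif gesture == "mouse_drag" and self.button_down:
            # Drags are only meaningful while the virtual button is held.
            self.events.append(("drag", pos))
        elif gesture == "left_click_up" and self.button_down:
            self.button_down = False
            self.events.append(("release", pos))
```

A press/drag/release sequence at normalized screen positions then yields the event stream an application would map to commands such as panning or selecting planning data.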