Zoltán Tomori, Peter Keša, Matej Nikorovič, Jan Kaňka, Petr Jákl, Mojmír Šerý, Silvie Bernatová, Eva Valušová, Marián Antalík and Pavel Zemánek
Holographic optical tweezers provide a contactless way to trap and manipulate several microobjects independently in space using focused laser beams. Although methods for the fast and efficient generation of optical traps are well developed, their user-friendly control still lags behind. Even though several attempts have appeared recently to exploit touch tablets, 2D cameras, or Kinect game consoles, they have not yet reached the level of a natural human interface. Here we demonstrate a multimodal 'natural user interface' approach that combines finger and gaze tracking with gesture and speech recognition. This allows us to select objects with the operator's gaze and voice, to trap the objects and control their positions by tracking finger movement in space, and to run semi-automatic procedures such as the acquisition of Raman spectra from preselected objects. This approach exploits the power of human visual processing and the fine motor control of human fingertips, downscaling these skills to remotely control the motion of microobjects at the microscale in a way that is natural for the human operator.
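The core idea of controlling trap positions via finger tracking can be illustrated with a minimal sketch: a tracked fingertip displacement (millimetres in air) is downscaled by a calibration factor to trap motion (micrometres in the sample plane). All names and the scale factor below are illustrative assumptions, not details from the paper.

```python
from dataclasses import dataclass

# Assumed calibration: micrometres of trap motion per millimetre of
# fingertip motion (hypothetical value for illustration only).
SCALE_UM_PER_MM = 0.1

@dataclass
class Trap:
    """Position of one optical trap in the sample plane, in micrometres."""
    x_um: float
    y_um: float

def update_trap(trap: Trap, dx_mm: float, dy_mm: float) -> Trap:
    """Downscale a fingertip displacement (mm) to trap motion (um)."""
    return Trap(trap.x_um + dx_mm * SCALE_UM_PER_MM,
                trap.y_um + dy_mm * SCALE_UM_PER_MM)

# A 5 mm fingertip move shifts the trap by 0.5 um.
trap = update_trap(Trap(0.0, 0.0), 5.0, 0.0)
print(trap.x_um)
```

In a real system the updated trap coordinates would be fed to the hologram-generation step that steers the focused laser beams; the sketch only shows the coordinate downscaling.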
DOI