Vision-Based Surgical Instrument Tracking

Retinal microsurgery is one of the most demanding types of surgery. The difficulty stems from the microscopic dimensions of tissue planes and blood vessels in the eye, the delicate nature of the neurosensory retina, and the poor recovery of retinal function after injury. Many micron-scale maneuvers are physically impossible for retinal surgeons due to an inability to visualize the tissue planes, hand tremor, or insufficient dexterity. To perform these maneuvers safely, a microscope is required to view the retina. A central issue for the surgeon is the trade-off between adequate illumination of retinal structures and the risk of iatrogenic phototoxicity, whether from the operating microscope or from endoilluminators, the fiber-optic light sources placed into the vitreous cavity to illuminate the retina during delicate maneuvers.

Given the aforementioned reasons, and the prevalence of eye diseases for which such surgeries are the only form of treatment (Diabetic Retinopathy, Glaucoma, Age-Related Macular Degeneration, Retinal Detachment, etc.), we are interested in providing a road map for how a vision system for computer-assisted retinal surgery may be established. That is, we would ultimately like a system that can take images from a microscope, infer which part of the retina is being observed, track surgical tools, and guide them to locations predefined by a clinician through pre-operative data.
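As a rough illustration of how such a per-frame pipeline might be organized, the sketch below wires together placeholder stages for retinal region inference, instrument tracking, and guidance toward a clinician-defined target. All class and function names here are hypothetical, and each stage would be backed by a real model in practice; this is a sketch of the overall structure, not the published system.

# Hypothetical per-frame pipeline sketch; every stage is a placeholder.
from dataclasses import dataclass
from typing import Optional, Tuple

import numpy as np


@dataclass
class FrameAnalysis:
    retinal_region: str                      # which part of the retina is in view
    tool_tip: Optional[Tuple[int, int]]      # tracked instrument tip (row, col), if visible
    target: Optional[Tuple[int, int]]        # clinician-defined target from pre-operative data


def localize_retinal_region(frame: np.ndarray) -> str:
    """Placeholder: map the microscope frame to a retinal region label."""
    return "macula"  # a real system would register the frame to pre-operative imagery


def track_instrument(frame: np.ndarray) -> Optional[Tuple[int, int]]:
    """Placeholder: return the instrument tip location, or None if not detected."""
    return (frame.shape[0] // 2, frame.shape[1] // 2)


def guidance_vector(analysis: FrameAnalysis) -> Optional[Tuple[int, int]]:
    """Offset from the tracked tool tip to the predefined target, if both are known."""
    if analysis.tool_tip is None or analysis.target is None:
        return None
    return (analysis.target[0] - analysis.tool_tip[0],
            analysis.target[1] - analysis.tool_tip[1])


def process_frame(frame: np.ndarray, target: Tuple[int, int]) -> FrameAnalysis:
    return FrameAnalysis(
        retinal_region=localize_retinal_region(frame),
        tool_tip=track_instrument(frame),
        target=target,
    )


if __name__ == "__main__":
    frame = np.zeros((480, 640), dtype=np.uint8)         # stand-in for a microscope image
    analysis = process_frame(frame, target=(200, 300))   # target chosen pre-operatively
    print(analysis.retinal_region, guidance_vector(analysis))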

In this context, visual tracking of instruments is a key component of robotic assistance. The difficulty of the task, and the main reason why most existing strategies fail on in-vivo image sequences, is that complex and severe changes in instrument appearance are challenging to model. Below are some results of our strategies.

Video for IPCAI '11 - This video shows the performance of a method for tracking surgical tools in retinal surgery and detecting proximity between the tools and the retinal surface. An image similarity function based on weighted mutual information is specially tailored for tracking under severe illumination variations, lens distortions, and rapid motions (a sketch of such a similarity score is given after these captions).
Results of our pipeline on retinal microsurgery sequences. For more details please see: R. Sznitman, K. Ali, R. Richa, R. Taylor, G. Hager and P. Fua. Data-Driven Visual Tracking in Retinal Microsurgery. In MICCAI, 2012.
Fast Part-Based Classification for Instrument Detection in Minimally Invasive Surgery. In MICCAI, 2014.
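To make the weighted mutual information idea from the IPCAI '11 video concrete, below is a minimal sketch of one way such a similarity score between a template and a candidate image patch can be computed. The per-pixel weight map, bin count, and all function names are assumptions chosen for illustration; this is not the exact formulation used in that work.

# Sketch of a weighted mutual information similarity between two grayscale patches.
import numpy as np


def weighted_mutual_information(patch_a: np.ndarray,
                                patch_b: np.ndarray,
                                weights: np.ndarray,
                                bins: int = 32) -> float:
    """Mutual information of intensity co-occurrences, with weighted pixel counts."""
    a = patch_a.ravel().astype(np.float64)
    b = patch_b.ravel().astype(np.float64)
    w = weights.ravel().astype(np.float64)

    # Weighted joint histogram of quantized intensities, normalized to a distribution.
    joint, _, _ = np.histogram2d(a, b, bins=bins, weights=w)
    p_ab = joint / max(joint.sum(), 1e-12)
    p_a = p_ab.sum(axis=1, keepdims=True)   # marginal over patch_a intensities
    p_b = p_ab.sum(axis=0, keepdims=True)   # marginal over patch_b intensities

    # MI = sum over non-zero bins of p(a,b) * log( p(a,b) / (p(a) p(b)) ).
    nz = p_ab > 0
    return float(np.sum(p_ab[nz] * np.log(p_ab[nz] / (p_a @ p_b)[nz])))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    template = rng.integers(0, 256, size=(64, 64))
    candidate = np.clip(template + rng.normal(0, 10, size=(64, 64)), 0, 255)
    # Example weight map (an assumption): down-weight near-saturated pixels,
    # e.g. glare from the endoilluminator, so they contribute less to the score.
    weights = np.where(template > 240, 0.1, 1.0)
    print(weighted_mutual_information(template, candidate, weights))

A tracker built on this kind of score would evaluate it over candidate tool locations in each frame and keep the best-scoring one; because mutual information depends on the statistical relationship between intensities rather than their absolute values, the score is comparatively robust to the illumination changes described above.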