Surgical Simulation Research Laboratory

Bin Zheng

For details of projects and the team, visit the Surgical Simulation Research Laboratory website.

The world-leading Surgical Simulation Research Laboratory (SSRL), located in the Li Ka Shing Centre for Health Research Innovation, is combining IT with healthcare to develop training technology for surgeons of the future.

Tracking Surgeons' Eye Movements
SSRL director, Dr. Bin Zheng, who holds the Endowed Research Chair in Surgical Simulation, is applying body-movement training technology developed for athletes to surgeons' eyes. He first recorded the eye-tracking signals of surgeons in 2010 during live laparoscopic surgery in the operating room. Eye-tracking reveals many hidden behaviors of a surgeon, such as on-the-spot decisions driven by visual-search strategy. Combined with the laboratory's hand-motion tracking data, eye-tracking data can give a clear picture of a surgeon's hand-eye coordination pattern.
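One way combined gaze and hand-motion data can describe hand-eye coordination is the "eye leads hand" interval: how long after the eye fixates a target the tool tip arrives there. The sketch below is purely illustrative; the function, the coordinate convention, and the 10 mm arrival radius are assumptions, not the SSRL's actual analysis pipeline.

```python
# Hypothetical sketch: estimating eye-hand latency from synchronized
# gaze fixations and tool-tip samples. All names and thresholds here
# are illustrative assumptions, not the lab's published method.

def eye_hand_latency(gaze_events, tool_events, radius=10.0):
    """For each gaze fixation, measure how long the tool tip takes to
    arrive within `radius` (mm) of the fixated point.

    gaze_events: list of (time_s, x, y) fixation onsets
    tool_events: time-ordered list of (time_s, x, y) tool-tip samples
    Returns a list of latencies in seconds, one per matched fixation.
    """
    latencies = []
    for t_fix, gx, gy in gaze_events:
        for t_tool, tx, ty in tool_events:
            if t_tool < t_fix:
                continue  # tool sample precedes this fixation
            dist = ((tx - gx) ** 2 + (ty - gy) ** 2) ** 0.5
            if dist <= radius:
                latencies.append(t_tool - t_fix)
                break  # first arrival only
    return latencies
```

In a metric like this, shorter latencies with the eye consistently arriving first would be one quantitative signature of expert hand-eye coordination.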

This research includes virtual reality (VR), which is being extended into augmented reality. The researchers believe it will play an important role in creating a simulation model to train surgeons. But first they need to know that the human eye inside VR goggles behaves exactly as it does during an operation. For example, Dr. Zheng's team has built tiny eye-trackers into the eyepiece of the VR goggles to verify that there is no motion-sickness effect. In the SSRL, trainees dive deep inside a human brain, weaving in and out of neurons and dendrites and firing a laser-like tool at damaged cells, while Dr. Zheng's team monitors hand and eye motion. The VR scenario can be changed to suit the educational purpose, such as presenting an abdominal cavity for general surgery.

The team is also working on producing the world's most accurate surgery training videos. The best videos available today fail to capture the precise moment when a surgeon cuts, because they do not show it through the surgeon's eyes. So Dr. Zheng's team is capturing eye-tracking signals and overlaying them onto surgical videos.
The videos replicate exactly the surgeon's gaze-searching strategy: a few seconds before cutting, the surgeon has searched the surrounding tissue, decided where to cut, and gathered the information that the eye pinpoints. This creates a new training protocol. Instead of asking young surgeons simply to mimic the hand and tool motion of an expert, Dr. Zheng asks them to understand how an expert surgeon took in visual input before the first hand action. Gaze training shortens the learning phase and builds surgical skills that hold up in tough surgical situations.
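Overlaying gaze onto video requires aligning gaze samples, which arrive at their own rate, with the video's frame times. The helper below is a minimal sketch of that alignment step under assumed data shapes; the lab's actual overlay tooling is not described in the article.

```python
# Illustrative sketch: assign each video frame the most recent gaze
# sample so a gaze marker can be drawn on it. Sample format and frame
# rate handling are assumptions for this example.

def gaze_per_frame(gaze_samples, fps, n_frames):
    """Map time-ordered gaze samples onto video frames.

    gaze_samples: time-ordered list of (time_s, x, y)
    Returns a list of length n_frames holding (x, y) per frame,
    or None for frames before the first gaze sample.
    """
    out = []
    i = 0
    last = None
    for f in range(n_frames):
        t_frame = f / fps  # timestamp of this frame
        # advance to the latest gaze sample at or before this frame
        while i < len(gaze_samples) and gaze_samples[i][0] <= t_frame:
            last = gaze_samples[i][1:]
            i += 1
        out.append(last)
    return out
```

With this mapping in hand, a drawing library can render the gaze point on each frame, reproducing where the expert was looking before the cut.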

Robotic "Hands" Teach Motion
There is information that cannot be presented in any surgical training video, such as how much pressure is being applied. Surgeons perform many intriguing, delicate motions with their hands, and most are not apparent in a video and are hard to describe verbally. A robotic system developed at the SSRL transfers hand motion from an expert surgeon to a novice. A recent grant to Dr. Zheng from the Royal College of Physicians and Surgeons of Canada supports research to speed up the learning process using this system. A surgeon placing their hands on the robot's "hands" transfers all of their tiny movements to the robot hands that students are holding. Also, when learners operate the hands, their motion is recorded and measured against the teacher's. Simulation allows the trainee to do a task many times before moving into the operating room.
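Measuring a learner's motion against the teacher's implies some distance metric between the two recorded paths. As a hedged sketch, one simple choice is pointwise root-mean-square error after resampling both paths to the same length; the SSRL system's actual scoring method is not specified in the article, and all names here are illustrative.

```python
# Assumed-for-illustration comparison of two recorded hand paths:
# linearly resample both to n points, then take pointwise RMSE.

def resample(path, n):
    """Linearly resample a polyline of (x, y) points (len >= 2) to n points."""
    out = []
    m = len(path)
    for k in range(n):
        t = k * (m - 1) / (n - 1)   # fractional index along the path
        i = min(int(t), m - 2)
        frac = t - i
        x = path[i][0] + frac * (path[i + 1][0] - path[i][0])
        y = path[i][1] + frac * (path[i + 1][1] - path[i][1])
        out.append((x, y))
    return out

def path_rmse(expert, trainee, n=100):
    """Root-mean-square distance between two resampled paths."""
    a, b = resample(expert, n), resample(trainee, n)
    sq = [(p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 for p, q in zip(a, b)]
    return (sum(sq) / n) ** 0.5
```

A lower score would mean the trainee's path stayed closer to the expert's; a real system would likely also compare velocity and force profiles, which video alone cannot show.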

Virtual Organs Projected
The SSRL work uses a novel technology developed by University of Alberta computer scientists to superimpose digital 3D models onto the patient's body via a projector. Motion sensors detect the patient's movement, such as breathing, and adjust the organ positions and sizes in real time. This enhances pre-surgery planning and rehearsal and increases accuracy by providing patient-specific data: the exact size and location of organs.
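The real-time adjustment described above amounts to updating a projected outline as the sensor reports chest displacement. The sketch below assumes a toy sensor model with made-up gain constants, purely to illustrate the shift-and-scale update; it is not the University of Alberta system.

```python
# Hypothetical sketch: translate and scale a projected organ outline
# as a breathing sensor reports chest displacement. The gain constants
# (shift_per_mm, scale_per_mm) are invented for this example.

def adjust_organ(points, chest_rise_mm, shift_per_mm=0.6, scale_per_mm=0.01):
    """Update organ outline points for the current breathing phase.

    points: list of (x, y) in projector coordinates
    chest_rise_mm: current chest displacement from the motion sensor
    Returns the adjusted outline points.
    """
    dy = chest_rise_mm * shift_per_mm        # outline rides up with the chest
    s = 1.0 + chest_rise_mm * scale_per_mm   # slight apparent size change
    # scale about the outline's centroid, then translate
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    return [(cx + (x - cx) * s, cy + (y - cy) * s + dy) for x, y in points]
```

Run per sensor update, a transform like this keeps the projected organ registered to the moving body, which is what makes the rehearsal patient-specific rather than generic.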