A multi-million-dollar project under way in the United States aims to improve the back-and-forth between surgeons and their robotic instruments.

Surgeons may soon be able to work with the machines, rather than through them, at the completion of the five-year, $3.6 million research project titled ‘Complementary Situational Awareness for Human-Robot Partnerships’.

The collaborative project brings together separate teams directed by Nabil Simaan, associate professor of mechanical engineering at Vanderbilt University; Howie Choset, professor of robotics at Carnegie Mellon University; and Russell Taylor, the John C. Malone Professor of Computer Science at Johns Hopkins University.

The project covers everything from new surgical hardware, including tiny probes, sensors and robotic equipment, through to software and programming specific to the task.

“Our goal is to establish a new concept called complementary situational awareness,” said Simaan.

“Complementary situational awareness refers to the robot's ability to gather sensory information as it works and to use this information to guide its actions.”

There have been plenty of benefits for patients with the advent of minimally invasive surgery, but it has deprived surgeons of the level of awareness inherent in more open procedures. The new project aims to restore some of the awareness lost when working remotely and through tiny incisions.

“In the past we have used robots to augment specific manipulative skills,” said Simaan.

“This project will be a major change because the robots will become partners not only in manipulation but in sensory information gathering and interpretation, creation of a sense of robot awareness and in using this robot awareness to complement the user's own awareness of the task and the environment.”

The researchers intend to create a system that takes data from a variety of sensors as an operation is underway, integrating them with pre-operative information to produce dynamic physical maps in real time. The position of the robot probes can be precisely tracked to show how the tissue in their vicinity responds to movements and the presence of equipment.
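To make that idea concrete, the sketch below shows, under invented assumptions, the kind of fusion step such a system might perform: a pre-operative tissue map held on a coarse grid is updated in real time as (simulated) probe readings arrive. The grid size, sensor values and blending weight are illustrative only, not details from the project.

```python
import numpy as np

# Hypothetical illustration: blend intraoperative probe readings into a
# pre-operative tissue map held on a coarse 2-D grid. All values invented.

GRID_SHAPE = (64, 64)                    # map resolution (cells)
pre_op_map = np.full(GRID_SHAPE, 0.5)    # prior tissue property per cell

def fuse_reading(tissue_map, cell, measurement, weight=0.3):
    """Blend a fresh measurement into the running map for one cell.

    `weight` controls how strongly new sensor data overrides the
    pre-operative prior at the touched location.
    """
    row, col = cell
    tissue_map[row, col] = (1 - weight) * tissue_map[row, col] + weight * measurement
    return tissue_map

# As the simulated probe moves, each contact updates the map in place.
probe_track = [((10, 12), 0.80), ((10, 13), 0.82), ((11, 13), 0.40)]
for cell, reading in probe_track:
    fuse_reading(pre_op_map, cell, reading)

print(pre_op_map[9:13, 11:15])
```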

It is about much more than better sensors, though. The probes will gather information on the shape and stiffness variations of internal organs and tissues. Hidden anatomical features such as arteries and tumours can be identified and fed into adaptive telemanipulation techniques that assist surgeons in carrying out their procedures.
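As a rough illustration of how a stiffness map might flag hidden features, the snippet below labels grid cells as candidate tumours or arteries using simple thresholds. The thresholds, the pulsatility signal and the labelling scheme are assumptions made for the example, not the project's actual method.

```python
import numpy as np

# Hypothetical illustration: label cells of a stiffness map as candidate
# hidden features. Thresholds and data are made up for the example.

STIFF_THRESHOLD = 0.75    # unusually stiff tissue -> possible tumour
PULSE_THRESHOLD = 0.05    # strong periodic variation -> possible artery

def label_features(stiffness, pulsatility):
    """Return 0 = plain tissue, 1 = possible tumour, 2 = possible artery."""
    labels = np.zeros_like(stiffness, dtype=int)
    labels[stiffness > STIFF_THRESHOLD] = 1
    labels[pulsatility > PULSE_THRESHOLD] = 2   # artery label takes priority
    return labels

rng = np.random.default_rng(0)
stiffness = rng.random((8, 8))          # simulated stiffness readings
pulsatility = rng.random((8, 8)) * 0.1  # simulated pulse amplitude
print(label_features(stiffness, pulsatility))
```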

One team has been assigned to partially automating various surgical sub-tasks, such as tying off a suture, resecting a tumour or ablating tissue. For example, the resection task would let a surgeon instruct the robot to resect tissue from point “A” to “B” to “C” to “D” at a depth of five millimetres; the robot would then cut out the specified tissue.
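The waypoint-style command described above could, in principle, be expanded into a dense tool path by simple interpolation. The sketch below shows one such expansion; the coordinates, step size and planner interface are hypothetical, chosen only to mirror the “A to B to C to D at five millimetres” example.

```python
# Hypothetical sketch: expand a surgeon's waypoint command into a dense
# cutting path by linear interpolation. All geometry is invented.

def plan_resection(waypoints, depth_mm, step_mm=0.5):
    """Interpolate a cutting path through the given waypoints at a fixed depth."""
    path = []
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        length = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        steps = max(int(length / step_mm), 1)
        for i in range(steps):
            t = i / steps
            path.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0), depth_mm))
    path.append((*waypoints[-1], depth_mm))   # finish exactly on the last point
    return path

# "Resect from A to B to C to D at a depth of five millimetres."
corners = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
cut_path = plan_resection(corners, depth_mm=5.0)
print(len(cut_path), "path points, e.g.", cut_path[:2])
```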

Autonomous protections will be programmed in as well: the software will not allow an instrument to cut in an area where a major blood vessel has been identified, preventing accidents caused by either the operator or the autonomous tasks.
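A minimal sketch of that kind of safety interlock is shown below, assuming the identified vessels are registered as keep-out zones that every cut is checked against before it is executed. The zone geometry, margin and function names are assumptions made for the example.

```python
# Hypothetical sketch of a cutting interlock: reject any cut that falls
# inside a keep-out zone registered around an identified blood vessel.

VESSEL_ZONES = [            # (centre_x, centre_y, radius_mm) keep-out discs
    (12.0, 8.0, 3.0),
    (25.0, 14.0, 2.5),
]

def cut_allowed(x, y, margin_mm=1.0):
    """Return False if the point lies within a vessel zone plus safety margin."""
    for cx, cy, radius in VESSEL_ZONES:
        if ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 < radius + margin_mm:
            return False
    return True

def execute_cut(x, y):
    """Refuse the motion outright if it would enter a protected region."""
    if not cut_allowed(x, y):
        raise RuntimeError(f"Cut at ({x}, {y}) blocked: too close to a vessel")
    print(f"Cutting at ({x}, {y})")

execute_cut(30.0, 30.0)    # safely away from both registered zones
# execute_cut(12.5, 8.5)   # would be refused by the interlock
```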

“We will design the robot to be aware of what it is touching and then use this information to assist the surgeon in carrying out surgical tasks safely,” Simaan said.

The software will be based on the open-source “Surgical Assistant Workstation” toolkit, allowing researchers within and outside the team to access the results of the research and adapt them for other projects. It is possible that some of the techniques developed will assist almost completely unrelated activities, such as robotic bomb-disarming or excavation.