The system, which is being developed by a team led by Prof. Sanja Dogramadzi at Bristol Robotics Laboratory and the University of the West of England (UWE Bristol), is based on a sensor-equipped exoskeleton that picks up the movement of the surgeon's hands and fingers.
This information is then transmitted to flexible surgical instruments capable of mimicking this movement, while virtual reality glasses give the surgeon a three-dimensional view inside the patient.
Minimally invasive procedures offer a range of benefits to patients including reduced blood loss, fewer infections and faster recovery. Robot-assisted surgery has increased the number of procedures that can be performed minimally invasively, but until now the technology’s use has been limited by the rigid instruments associated with it, according to Dogramadzi.
The €4m research project, funded by the European Commission under the HORIZON 2020 scheme, will develop more flexible instruments, with greater levels of articulation, said Dogramadzi.
“We are developing instrumentation that will give surgeons better access,” she said. “They will have more degrees of freedom, and more flexibility to move around inside the body, so that they can get to areas that they can’t get to with rigid instruments.”
The project, which also includes the North Bristol NHS Trust, the Bristol Urological Institute and the Translational Biomedical Research Centre (TBRC) at Bristol University, is designed to expand the potential for the technology to be used in more complex procedures in urological, cardiovascular and orthopaedic surgery.
The instruments will be based on a new surgical gripper, which mimics the thumb and two fingers of the hand. Sensors on the exoskeleton detect the motion of the surgeon’s fingers, and the thumb and fingers on the instrument move in the same way, said Dogramadzi.
“The instruments inside the body have the same degrees of freedom as the surgeon’s hands, and at the moment their movement is being mapped one to one, although this may change, depending on the application,” she said.
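To illustrate the idea, the sketch below shows what a one-to-one mapping from exoskeleton finger-joint angles to instrument joint commands might look like in code. This is an assumed, simplified example, not the project's actual software, and the names (JOINTS, map_hand_to_instrument, the scale parameter) are hypothetical.

```python
# Minimal sketch of one-to-one joint mapping from a sensed hand pose to a
# surgical instrument that mimics the thumb and two fingers. All names are
# hypothetical; angles are assumed to be in radians.

JOINTS = ["thumb_flex", "index_flex", "middle_flex"]  # thumb + two fingers

def map_hand_to_instrument(exo_angles: dict, scale: float = 1.0) -> dict:
    """Map each sensed finger joint angle directly to the matching
    instrument joint. scale=1.0 gives the one-to-one mapping described;
    a different value could rescale motion for a specific application."""
    return {joint: scale * exo_angles[joint] for joint in JOINTS}

# Example: angles read from the exoskeleton sensors
commands = map_hand_to_instrument(
    {"thumb_flex": 0.42, "index_flex": 0.30, "middle_flex": 0.28}
)
print(commands)
```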
Because the procedure is driven by the surgeon’s own hand movements, rather than a console or joystick, the system reduces the cognitive and training demands placed on them, she said.
The instrument will also provide haptic feedback to the surgeon, allowing them to ‘feel’ the tissues and organs inside the body, much as they would in conventional surgery.
“At the moment the instruments do not have any force feedback, so surgeons lack a sense of the force they are using when they are manipulating the soft tissue, and obviously we are talking about sensitive tissue,” said Dogramadzi.
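As a rough illustration of force feedback of this kind, the sketch below reflects a measured instrument-tip force back to the surgeon's hand, scaled and clamped to a safe level. It is an assumption about how such a loop could work, not the project's implementation; read_tip_force and set_fingertip_force are hypothetical device calls, and the scale and limit values are placeholders.

```python
# Minimal sketch of a force-reflection step for haptic feedback.
# All functions and constants are hypothetical placeholders.

FORCE_SCALE = 0.5   # attenuate forces reflected to the hand (assumed value)
FORCE_LIMIT = 5.0   # clamp feedback to a safe level, in newtons (assumed value)

def haptic_feedback_step(read_tip_force, set_fingertip_force):
    """One cycle of a simple force-reflection loop: measure the force at the
    instrument tip per finger, then command a scaled, clamped force on the
    corresponding exoskeleton fingertip."""
    tip_force = read_tip_force()  # e.g. {"thumb": 1.2, "index": 0.8, "middle": 0.6}
    reflected = {
        finger: max(-FORCE_LIMIT, min(FORCE_LIMIT, FORCE_SCALE * f))
        for finger, f in tip_force.items()
    }
    set_fingertip_force(reflected)
    return reflected
```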
The researchers are planning to develop different versions of the instrument, depending on the requirements of the various types of procedure.
They will also develop smart glasses that will enable surgeons to view live images of what is happening inside the body while they move around the operating theatre. This offers the surgeon greater freedom, as they are no longer tethered to a screen, said Dogramadzi.
“The virtual reality glasses allow the surgeon to position themselves anywhere within the operating theatre,” she said. “The glasses can either be transparent, to allow the surgeon to talk to other members of the team, or they can project images from inside the body, as well as any other information the team need.”
The researchers will use expertise and feedback from senior surgeons to assist them in developing and testing the robotic tools.