At Balgrist University Hospital in Zurich, researchers are developing a robot-supported assistance system with the aim of making spinal surgery both quicker and more precise. Sensors, ultrasound, acoustic analysis and artificial intelligence help the robot respond to changes in the body in real time – and avoid danger.
Prof. Dr. Philipp Fürnstahl
Professor of Orthopedic Research Specializing in the Application of Computer Technologies
University of Zurich
+41 44 510 73 60
Balgrist University Hospital
When surgeons insert screws into the spine, every millimetre counts. Pedicle screws, which are used to stabilise vertebrae, need to be fed through a bone canal just six to seven millimetres wide. Even minor deviations can have serious consequences, such as damage to the spinal cord or the surrounding nerves.
This delicate problem is the focus of the research project FAROS. The EU-funded project combines robotics, sensor technology and artificial intelligence with the aim of making spinal interventions safer and more precise. The team developed a system that supports both high-precision drilling and real-time analysis and response: if the drill deviates from the planned path, the system stops automatically – before any damage occurs.
The project started in 2021 and officially concluded in June 2024. A final assessment by the EU Commission confirmed that all of the project's goals had been achieved. At the heart of the project is a prototype that, for the first time, combined the technologies developed into a single system. In April 2024, the system was assessed under realistic conditions on both human specimens and live animals. The evaluation took place at OR-X, a brand-new, ultra-modern centre for translational surgery. The results showed that the robot could insert screws with precision – and do so autonomously. 'Of course, there are still lots of challenges – we are talking about top-level research, after all – but the progress is significant,' says Philipp Fürnstahl, a professor at the University of Zurich who leads the Swiss part of the international FAROS research project. He also points out that the research initiated by FAROS is set to continue.
As part of the FAROS project, the sensor technology on the robot's arm was developed further. The drill bit contains sensors that record changes in the tissue in real time, and additional sensors placed around the operating field provide continuous data. To ensure that the robot behaves as it should with this set-up, an intelligent control system was developed that analyses movements and intervenes if danger looms. In simulated test scenarios, the robot responded reliably, drilling undisturbed where the paths had been calculated correctly; whenever an incorrect route was taken – on purpose – the system stopped automatically. 'This is clear evidence that sensor-supported navigation works,' says Fürnstahl.
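Conceptually, this control logic can be reduced to a simple guard loop: compare the tracked position of the drill tip against the planned trajectory and halt the moment the deviation exceeds a safety margin. The Python sketch below illustrates the idea; the robot interface, the way-point format and the 1 mm threshold are assumptions for illustration, not details from FAROS.

```python
import numpy as np

# Assumed safety threshold for illustration only – not a value from the project.
MAX_DEVIATION_MM = 1.0

def distance_to_path(path: np.ndarray, tip: np.ndarray) -> float:
    """Distance from the tracked drill tip to the nearest planned way-point.

    path: (N, 3) array of planned way-points in millimetres
    tip:  (3,) current tip position from the tracking system
    """
    return float(np.min(np.linalg.norm(path - tip, axis=1)))

def guard_loop(robot, path: np.ndarray) -> None:
    """Halt the robot as soon as the drill leaves the planned corridor."""
    while robot.is_drilling():                 # hypothetical robot interface
        tip = robot.read_tool_position()       # (3,) e.g. from optical tracking
        deviation = distance_to_path(path, tip)
        if deviation > MAX_DEVIATION_MM:
            robot.stop()                       # stop before tissue is damaged
            raise RuntimeError(
                f"deviation {deviation:.2f} mm exceeds safety threshold")
```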
Another tool in the FAROS project relied on the familiar technology of ultrasound. But while ultrasound is well established in diagnostics, it has so far played little role in the operating theatre. This is now changing – with the help of robotics, new sensors and artificial intelligence. 'FAROS achieved something of a first, with a three-dimensional reconstruction of the spine, using ultrasound only, for real-time surgery planning ahead of spinal operations,' explains Fürnstahl. This represents significant progress: the process is not only radiation-free – and thus much gentler than, say, CT scans or X-rays – but also cost-effective and capable of running in real time. So far, most ultrasound transducers have been two-dimensional, with the robot creating a 3D image from them through precise movements. The next stage of development is genuine 3D ultrasound transducers capable of capturing three-dimensional volumes instantaneously – say, five by five centimetres in high resolution.
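The principle behind this kind of tracked 3D ultrasound is easy to sketch: every pixel of each 2D image is mapped into a common 3D frame using the probe pose reported by the robot, and the samples are accumulated into a voxel grid. Below is a toy version in Python – the pose format, pixel spacing and volume resolution are illustrative assumptions, not the FAROS implementation.

```python
import numpy as np

def compound_volume(slices, poses, px_mm=0.2, voxel_mm=0.5,
                    vol_shape=(100, 100, 100)):
    """Accumulate tracked 2D ultrasound slices into a 3D voxel volume.

    slices: list of (H, W) intensity images
    poses:  list of (4, 4) image-to-world transforms from the robot/tracker
    """
    acc = np.zeros(vol_shape)                 # summed intensities per voxel
    cnt = np.zeros(vol_shape)                 # number of samples per voxel
    for img, T in zip(slices, poses):
        h, w = img.shape
        ys, xs = np.mgrid[0:h, 0:w]           # pixel grid in the image plane
        pts = np.stack([xs.ravel() * px_mm,   # homogeneous coordinates, z = 0
                        ys.ravel() * px_mm,
                        np.zeros(h * w),
                        np.ones(h * w)])
        world = (T @ pts)[:3]                 # map pixels into the world frame
        idx = np.round(world / voxel_mm).astype(int)
        ok = np.all((idx >= 0) & (idx.T < vol_shape).T, axis=0)
        i, j, k = idx[:, ok]
        np.add.at(acc, (i, j, k), img.ravel()[ok])
        np.add.at(cnt, (i, j, k), 1)
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
```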
Why is this important? Because patients move during surgery: they breathe, tissue shifts and structures change position. One of the next big challenges is to detect and compensate for these movements in real time – something the research team is currently working on.
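One common building block for this is to estimate the displacement between two successive scans and shift the surgical plan accordingly; phase correlation is a standard way to recover such a rigid shift. Whether FAROS will use this particular method is not stated – the sketch below only illustrates the principle.

```python
import numpy as np

def estimate_shift(ref: np.ndarray, cur: np.ndarray) -> np.ndarray:
    """Integer-voxel translation of cur relative to ref via phase correlation."""
    F_ref, F_cur = np.fft.fftn(ref), np.fft.fftn(cur)
    cross = F_cur * np.conj(F_ref)
    # normalising the cross-power spectrum sharpens the correlation peak
    corr = np.fft.ifftn(cross / (np.abs(cross) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # indices past the midpoint correspond to negative shifts (FFT wrap-around)
    return np.array([p if p <= s // 2 else p - s
                     for p, s in zip(peak, ref.shape)])

# The recovered shift could then be applied to the planned trajectory
# before the next drilling step.
```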
The FAROS team in front of the operating theatre.
Besides visual and tactile information, the system also draws on a source of information that had largely been overlooked: acoustic signals. Experienced surgeons have an intuitive ear for these – the characteristic grinding of a screw as it rotates neatly into bone, or the dull thud of a hammer striking healthy tissue – and technology now allows them to be measured.
To help robots 'learn' these skills, researchers are using various microphones – contact microphones, for instance, which capture structure-borne sound, and spatial microphones. The latter not only pick up every sound but also calculate exactly where it is coming from. A chisel strike on bone can thus be localised with accuracy – something that even high-resolution cameras cannot manage when instruments block the view. Here, the research team is concentrating on the critical moments where every vibration can make the difference between success and complications. The centuries-old art of surgeons using their ears is thus being digitised and turned into another form of sensory feedback, to be integrated into the robot's control system.
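The classic signal-processing ingredient for this kind of localisation is the time difference of arrival (TDOA) between pairs of microphones, often estimated with GCC-PHAT; combining several pairs lets the position of, say, a chisel strike be triangulated. A minimal sketch follows – the sample rate is a made-up value, and this is the generic technique rather than the project's own pipeline.

```python
import numpy as np

FS = 48_000               # assumed sample rate in Hz
SPEED_OF_SOUND = 343.0    # m/s in air

def gcc_phat(sig: np.ndarray, ref: np.ndarray) -> float:
    """Delay of sig relative to ref, in seconds, via GCC-PHAT."""
    n = len(sig) + len(ref)
    S = np.fft.rfft(sig, n=n) * np.conj(np.fft.rfft(ref, n=n))
    cc = np.fft.irfft(S / (np.abs(S) + 1e-12), n=n)          # phase transform
    cc = np.concatenate((cc[-(n // 2):], cc[:n // 2 + 1]))   # centre zero lag
    return (np.argmax(np.abs(cc)) - n // 2) / FS

# Each measured delay constrains the source to a hyperboloid between the two
# microphones; intersecting the surfaces from three or more pairs yields the
# 3D position of the strike.
```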
The acoustic data is analysed using modern signal processing. While interpretation used to be based on fixed rules, the trend today is towards data-driven methods – particularly machine learning. The system learns to automatically detect and assess certain acoustic patterns and to draw conclusions from them about how the surgery is progressing. This makes the system both smarter and more flexible: it can adapt to different tools, anatomical features or surgical techniques – and may in future even make its own suggestions for improvement.
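In its data-driven form, such a detector can be as simple as a classifier over short-time spectral features of the drilling sound. The schematic example below uses scikit-learn; the feature choice and the labels are invented for illustration, and training data would have to come from recorded, annotated procedures.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def spectral_features(frame: np.ndarray, n_bands: int = 16) -> np.ndarray:
    """Summarise one short audio frame by coarse log band energies."""
    mag = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    bands = np.array_split(mag, n_bands)        # coarse frequency bands
    return np.log1p(np.array([b.sum() for b in bands]))

# X: one feature vector per labelled audio frame; y: labels such as
# 'cortical bone', 'cancellous bone', 'imminent breakthrough' (invented here).
clf = RandomForestClassifier(n_estimators=200)
# clf.fit(X_train, y_train)
# state = clf.predict([spectral_features(latest_frame)])[0]
```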
Despite all this progress, the real breakthrough has yet to be made: a system that analyses the various sensor data simultaneously, links them intelligently – and makes surgical decisions on that basis in real time. This is precisely what researchers at Operating Room X (OR-X) in Zurich are working on. The platform was created both to test the individual technologies – from 3D ultrasound to acoustic analysis – and to combine them into a connected, adaptive operating theatre. 'The aim is to create a system that not only registers changes in the body, but also understands and responds to them,' says Fürnstahl.
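A first approximation of such fusion is a conservative veto rule: every sensor channel reports a confidence that the current step is safe, and the robot proceeds only if all channels agree. The sketch below is deliberately simple; the channel names and threshold are assumptions, and a real system would weigh and cross-check the modalities far more carefully.

```python
from dataclasses import dataclass

@dataclass
class SafetyEstimate:
    source: str        # e.g. 'force', 'ultrasound', 'audio'
    p_safe: float      # estimated probability that continuing is safe

def may_continue(estimates: list[SafetyEstimate],
                 threshold: float = 0.9) -> bool:
    """Conservative fusion: proceed only if every channel is confident."""
    return all(e.p_safe >= threshold for e in estimates)

# may_continue([SafetyEstimate('force', 0.97),
#               SafetyEstimate('ultrasound', 0.95),
#               SafetyEstimate('audio', 0.99)])   # -> True
```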
FAROS was funded for three years from 2021 by the European Union's Horizon 2020 research and innovation programme, with the aim of developing surgical robots capable of performing complex, high-precision surgical procedures on their own.
FAROS was an international project with the following research priorities:
Balgrist University Hospital at the University of Zurich: clinical, experimental and interdisciplinary tasks, and acting as an interface between the areas of robotics, informatics and clinical research.
Within the FAROS consortium, Switzerland is represented by Philipp Fürnstahl, professor of Orthopaedic Research at the University of Zurich; Mazda Farshad, Clinical Director of Balgrist University Hospital and full professor of Orthopaedics at the University of Zurich; and Reto Sutter, professor at the University of Zurich and senior physician for Radiology at Balgrist University Hospital.
Endoscope:
A medical device that is used to examine and visualise the inside of the body. It consists of a flexible or rigid tube fitted with a light source and a camera. Endoscopes provide views of the inside of the body without any need for major surgical interventions.
Haptic Sensor:
A technological device that captures information through touch, pressure, and other forms of physical contact, enabling interaction with the environment.
Hyperspectral:
Refers to the ability of cameras to pick up light in many different spectral ranges (colors) that are not normally visible to the human eye.
EU (Horizon 2020)
The project funding ran
from 2021 to 2024
Text: Marita Fuchs
Pictures: Daniel Hager, Frank Brüderli
University of Zurich / Balgrist University Hospital: Mazda Farshad, Philipp Fürnstahl