
Robotic surgeons learn like residents and have just performed their first autonomous surgery

Three surgical robot systems trained on video demonstrations have achieved what many thought was still out of reach: performing complex procedures with the accuracy of an expert surgeon while adapting to the unexpected complications that define real medical emergencies. The work marks a transition from simple surgical assistance toward truly autonomous decision-making in the operating room.

The most eye-catching demonstration came from Johns Hopkins University, where a robot completed an entire phase of a gallbladder removal without human intervention. And it’s not just mechanical precision: the robot responds to voice commands, learns from its errors in real time, and adapts when researchers deliberately change conditions mid-procedure.

Beyond Programming: Robots that actually understand surgery

The difference between these systems and earlier surgical robots is not just capability but understanding. Johns Hopkins medical roboticist Axel Krieger explains that traditional surgical robots follow rigid, predetermined paths, like “teaching a robot to drive along a carefully mapped route.” His new system “is like teaching a robot to navigate any road, in any condition, responding intelligently to whatever it encounters.”

The robot, known as the Surgical Robot Transformer (SRT-H), learned by watching videos of Johns Hopkins surgeons performing gallbladder surgery on pig cadavers. After analyzing 17 hours of surgical footage covering 34 different specimens, the system performed a complex 17-step procedure with 100% accuracy on eight different gallbladders, each with unique anatomical characteristics.

SRT-H is built on the same machine learning architecture that powers ChatGPT and can respond to spoken commands like “grab the gallbladder head” or “move the left arm to the left.” More notably, it learns from this feedback during surgery.
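To make that division of labor concrete, here is a minimal, hypothetical sketch of a hierarchical, language-conditioned control loop in the same spirit: a high-level policy names the next sub-task in plain language, and a low-level policy converts the current camera image plus that instruction into arm motions. The step names (beyond the quoted command above), the function names, and the stub models are illustrative assumptions, not the actual SRT-H implementation.

```python
# Hypothetical sketch of a hierarchical, language-conditioned control loop.
# The high-level policy picks the next sub-task as text; the low-level policy
# turns (image, instruction) into a small arm motion. Both are stand-ins.
import numpy as np

HIGH_LEVEL_STEPS = [            # illustrative sub-tasks, not the real 17-step list
    "grab the gallbladder head",
    "clip the cystic duct",
    "cut between the clips",
]

def high_level_policy(image, history):
    """Pick the next sub-task (stand-in for a transformer planner)."""
    done = len(history)
    return HIGH_LEVEL_STEPS[done] if done < len(HIGH_LEVEL_STEPS) else None

def low_level_policy(image, instruction):
    """Map the camera image plus the text instruction to a 6-DoF motion (stub)."""
    rng = np.random.default_rng(abs(hash(instruction)) % 2**32)
    return 0.001 * rng.standard_normal(6)

def run_procedure(get_image, apply_action, steps_per_task=50):
    """Alternate between choosing a sub-task and executing it with small motions."""
    history = []
    while (instruction := high_level_policy(get_image(), history)) is not None:
        for _ in range(steps_per_task):
            apply_action(low_level_policy(get_image(), instruction))
        history.append(instruction)
    return history

if __name__ == "__main__":
    fake_image = lambda: np.zeros((224, 224, 3), dtype=np.uint8)
    print(run_procedure(fake_image, apply_action=lambda action: None))
```

In a setup like this, a spoken correction such as “move the left arm to the left” could simply be handed to the low-level policy as a new instruction, which is what makes the architecture a natural fit for real-time feedback.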

The needle navigation challenge

Meanwhile, researchers at the University of North Carolina at Chapel Hill are pioneering what they call “AI guidance” for medical needle procedures, in which physicians must steer instruments through mazes of blood vessels and airways to reach targets as small as a pea.

Since X-rays were discovered in the late 1800s, traditional image guidance has helped doctors visualize anatomy. AI guidance goes further: it automatically analyzes the images, identifies targets and obstacles, calculates safe trajectories, and can even autonomously steer a robotic needle around sensitive tissue.
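As one illustration of the “calculate safe trajectories” step, the sketch below runs a breadth-first search over a toy 2D occupancy grid in which vessels and airways are marked as obstacles. The real UNC system plans paths for steerable needles through 3D anatomy reconstructed from imaging, so this is only a schematic of the idea; the grid, names, and numbers are invented.

```python
# Toy safe-trajectory planner: breadth-first search on a 2D occupancy grid.
# 0 = traversable tissue, 1 = structure to avoid (vessel/airway).
from collections import deque

def plan_path(grid, start, goal):
    """Return a list of grid cells from start to goal that avoids obstacles, or None."""
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # reconstruct the path by walking back
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in parents:
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None                               # no obstacle-free route exists

anatomy = [                                   # invented toy map
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]
print(plan_path(anatomy, start=(0, 0), goal=(3, 3)))
```

A clinical planner would additionally respect the curvature limits of a steerable needle and keep a safety margin around critical structures, but the search-around-obstacles core is the same.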

The team demonstrated their system steering needles to clinically relevant targets in living lung tissue more accurately than physicians using traditional tools. Their framework defines four levels of AI participation:

  • Eyes on/hands on: the AI assists the physician, who performs the task
  • Eyes on/hands off: the AI performs the task while the physician monitors
  • Eyes off/hands off: the AI performs the task while the physician attends to other work
  • Full AI guidance: fully autonomous operation

Advantages of the humanoid form

Michael Yip of the University of California, San Diego believes the future belongs to humanoid surgical robots: mobile platforms with arms and multi-fingered hands, much like the humanoids being developed for industrial work. His reasoning challenges conventional thinking about specialized surgical equipment.

Current surgical robots are expensive, specialized machines that require extensive training to operate. That model, Yip wrote in Science Robotics, “does not scale.” The solution? Give surgical robots human-like appendages so they can tap the large manipulation datasets on which industrial and general-purpose robots are already trained.

Humanoid robots could hold ultrasound probes, fetch tools, or stand in as scrub-nurse assistants. These roles, Yip notes, are critical but are currently performed by other surgeons or nurses, which keeps them from helping other patients and can be physically exhausting.

Learning from billions of examples

The key insight driving all three approaches is data leverage. Because of privacy laws and the difficulty of collecting medical data, surgical systems cannot learn from the breadth of data available to industrial robots. By taking a humanoid form, however, surgical robots could tap into foundation models trained on billions of manipulation examples.
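In code, that argument amounts to ordinary transfer learning. The sketch below is an illustrative assumption rather than any group’s actual pipeline: a frozen backbone stands in for a model pretrained on large-scale, non-surgical manipulation data, and only a small action head is fitted on scarce surgical demonstrations (every tensor here is a random placeholder).

```python
# Hypothetical transfer-learning sketch: reuse a pretrained backbone,
# train only a small action head on a handful of surgical demonstrations.
import torch
import torch.nn as nn

backbone = nn.Sequential(                    # stand-in for a pretrained manipulation model
    nn.Flatten(), nn.Linear(3 * 64 * 64, 256), nn.ReLU(),
)
for p in backbone.parameters():              # keep the "foundation" weights frozen
    p.requires_grad = False

action_head = nn.Linear(256, 7)              # 7-DoF arm command, learned from scratch
optimizer = torch.optim.Adam(action_head.parameters(), lr=1e-4)

images = torch.rand(32, 3, 64, 64)           # placeholder surgical frames
expert_actions = torch.rand(32, 7)           # placeholder expert demonstrations

for _ in range(10):                          # tiny behaviour-cloning loop
    pred = action_head(backbone(images))
    loss = nn.functional.mse_loss(pred, expert_actions)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The point of the humanoid argument is that the frozen part, the hardest and most data-hungry piece, would come for free from outside the operating room.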

Johns Hopkins’ SRT-H demonstrates this principle at a smaller scale. The system’s hierarchical design mirrors how surgical residents learn: mastering individual components before tackling the complete procedure. In testing, the robot took longer than a human surgeon but produced smoother, more precise trajectories without unnecessary movement.

When researchers introduced unexpected challenges (changing the robot’s starting position or adding blood-like dyes that alter the appearance of the tissue), the system adapted successfully. That robustness comes from training on varied examples rather than memorizing a single scenario.
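A common way to obtain that kind of robustness is to randomize the training conditions themselves. The sketch below is a guess at the flavor of such randomization, not the published training recipe: perturb the starting pose and tint camera frames red so the policy never overfits to one appearance or one initial configuration.

```python
# Hypothetical training-time randomization: jitter the start pose and
# tint frames toward red to mimic dye or blood on the tissue.
import numpy as np

rng = np.random.default_rng(0)

def randomize_start_pose(pose, pos_noise=0.01, rot_noise=0.05):
    """Perturb a 6-DoF start pose (xyz in metres, roll/pitch/yaw in radians)."""
    noise = np.concatenate([
        rng.normal(0.0, pos_noise, 3),
        rng.normal(0.0, rot_noise, 3),
    ])
    return pose + noise

def tint_like_blood(image, max_strength=0.4):
    """Blend a red tint into an RGB uint8 frame with a random strength."""
    strength = rng.uniform(0.0, max_strength)
    red = np.zeros_like(image)
    red[..., 0] = 255
    return ((1 - strength) * image + strength * red).astype(np.uint8)

# Each training episode sees a slightly different start and appearance.
start = randomize_start_pose(np.zeros(6))
frame = tint_like_blood(np.full((224, 224, 3), 128, dtype=np.uint8))
```

Trained this way, a policy has to rely on cues that survive the randomization, which is what lets it cope with a new gallbladder or a stained surgical field.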

The path to clinical reality

Despite these advances, major obstacles remain before AI surgeons enter the operating room. Moving from laboratory pig organs to living patients introduces bleeding, tissue movement, and the added complication of instruments small enough to fit through laparoscopic ports.

Safety concerns lead the discussion. UNC’s Ron Alterovitz notes that such systems must operate safely and reliably inside the body and offer physicians an intuitive interface. When the AI encounters an unfamiliar situation, a conservative approach could use uncertainty estimates to trigger human supervision.
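One common way to implement such a conservative trigger, offered here purely as an assumption about how it could look rather than as either lab’s method, is to run an ensemble of policies and hand control back to the surgeon whenever their outputs disagree beyond a threshold.

```python
# Hypothetical uncertainty gate: average an ensemble's actions and flag
# for human takeover when the members disagree too much.
import numpy as np

def ensemble_action(policies, observation, max_std=0.02):
    """Return (mean action, needs_human) based on ensemble disagreement."""
    actions = np.stack([policy(observation) for policy in policies])
    disagreement = actions.std(axis=0).max()
    return actions.mean(axis=0), disagreement > max_std

# Toy ensemble whose members deliberately disagree on this observation.
policies = [lambda obs, gain=g: gain * obs for g in (0.9, 1.0, 1.1)]
action, needs_human = ensemble_action(policies, observation=np.ones(6))
if needs_human:
    print("uncertainty too high, pausing for the surgeon to take over")
```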

Privacy and data challenges persist. Most surgical AI studies rely on the same few datasets, and hospitals have little incentive to share proprietary surgical recordings. The field needs what Yip calls a “foundation model,” a large-scale AI system trained on diverse surgical data, but assembling such a dataset is nearly impossible today.

A solution to the surgical labor shortage

The ultimate motivation goes beyond technical achievement. Healthcare faces a shortage of skilled surgical staff that leaves patients waiting longer for operations while surgeon burnout keeps growing. Autonomous surgical systems could take over routine, time-consuming tasks, freeing human expertise for the cases that demand clinical judgment and complex decision-making.

As Alterovitz puts it: “AI and robotics can give doctors new tools to make challenging procedures safer and more efficient.” The robots demonstrated this week suggest that future may arrive sooner than expected, not through sudden leaps but through a steady accumulation of capabilities that mirrors how human surgeons learn their own craft.

The question is no longer whether robots will perform surgery autonomously, but how quickly medical institutions can adapt to these increasingly capable robotic colleagues and incorporate them into patient care.

