
James Gaines

Handing the Surgeon’s Scalpel to a Robot

After decades of merely assisting doctors, are sophisticated machines ready to take charge?

Published: Tuesday, October 25, 2022 - 11:03

In 2004, the United States’ Defense Advanced Research Projects Agency (DARPA) dangled a $1 million prize for any group that could design an autonomous car capable of driving itself through 142 miles of rough terrain from Barstow, California, to Primm, Nevada. Thirteen years later, the U.S. Department of Defense announced another award—not for a robot car this time, but for autonomous robotic doctors.

Robots have been found in the operating suite since the 1980s for tasks like holding a patient’s limbs in place, and later for laparoscopic surgery, during which surgeons can use remote-controlled robot arms to operate on the human body through tiny holes instead of huge cuts. But for the most part, these robots have been, in essence, just very fancy versions of the scalpels and forceps surgeons have used for centuries—incredibly sophisticated, granted, and capable of operating with remarkable precision, but still tools in the surgeon’s hands.

Despite many challenges, that is changing.

Today, five years after that award announcement, engineers are taking steps toward building independent machines that not only can cut or suture but also plan those cuts, improvise, and adapt. Researchers are improving the machines’ ability to navigate the complexities of the human body and coordinate with human doctors. But the truly autonomous robotic surgeon that the military may envision—just like truly driverless cars—may still be a long way off. The biggest challenge may not be technological, but convincing people it’s OK to use them.

Navigating unpredictability

Like drivers, surgeons must learn to navigate their specific environments, something that sounds easy in principle but is endlessly complicated in the real world. Real-life roads have traffic, construction equipment, pedestrians—all things that don’t necessarily show up on Google Maps, and which the car must learn to avoid.

Similarly, although one human body is generally like another, children’s movies are right: We’re all special on the inside. The precise size and shape of organs, the presence of scar tissue, and the placement of nerves or blood vessels often differ from person to person.

“There’s so much variation in the individual patients,” says Barbara Goff, a gynecologic oncologist and surgeon-in-chief at the University of Washington Medical Center in Seattle. “I think that that could be challenging.” She’s been using laparoscopic surgical robots—the kind that don’t move on their own but do translate the surgeon’s movements—for more than a decade.

The fact that bodies move poses a further complexity. A few robots already display some amount of autonomy, with one of the classic examples being a device with the (accurate but uninspired) name ROBODOC, which can be used in hip surgery to shave down bone around the hip socket. But bone is relatively easy to work with and, once locked into place, doesn’t move around much. “Bones don’t bend,” says Aleks Attanasio, a research specialist now at Konica Minolta, who wrote about robots in surgery for the 2021 Annual Review of Control, Robotics, and Autonomous Systems. “And if they do, there’s a bigger problem.”

The da Vinci surgical robot, shown here on a U.S. Navy hospital ship, is one of the most widely used devices to assist doctors in laparoscopic surgery. The procedure—in which tools are inserted through tiny holes in the abdomen instead of cutting a long incision—allows patients to recover more quickly. Credit: Kelsey L. Adams, U.S. Navy.

Unfortunately, the rest of the body isn’t as easy to lock in place. Muscles contract, stomachs gurgle, brains jiggle, and lungs expand and contract, for instance—even before a surgeon gets in there and starts moving things around. And while a human surgeon can obviously see and feel what they’re doing, how could a robot know if its scalpel is in the right place or if tissues have shifted?

One of the most promising options for such dynamic situations couples cameras with sophisticated tracking software. In early 2022, for example, researchers at Johns Hopkins University used a device called the Smart Tissue Autonomous Robot (STAR for short) to sew two ends of severed intestine back together in an anesthetized pig—a potentially very jiggly task—thanks to this visual system.

A human operator tags the ends of the intestine with drops of fluorescent glue, creating markers the robot can track (a bit like an actor wearing a motion-capture suit in a Hollywood movie). At the same time, a camera system creates a 3D model of the tissue using a grid of light points projected onto the area. Together, these technologies allow the robot to see what is in front of it.

“What’s really special about our vision system is that it allows us to not only reconstruct what that tissue looks like, but it also does so fast enough that you can do it in real time,” says STAR system co-designer Justin Opfermann, an engineering Ph.D. student at Hopkins. “If something does move during the surgery, you can detect and follow it.”

The robot can then use this visual information to predict the best course of action, presenting the human operator with different plans to choose from or checking in with them in between sutures. In tests, STAR worked well on its own—though not perfectly. In total, 83 percent of the sutures could be done autonomously. But the human still had to step in the other 17 percent of the time to correct things.

“The 83 percent can definitely be overcome,” says Opfermann. Most of the problem was that the robot had a little trouble finding the right angle at certain corners and needed a human to nudge it into the right spot, he says. Newer, yet-to-be-published trials now have success rates in the high 90s. In the future, the human may only need to approve the plan, then watch it go—no intervention needed.

Since the early days of NASA designs during the 1970s, surgical robots have gradually become more capable. Eventually, they may be able to make and carry out decisions on their own without intervention or supervision by human surgeons.

Passing the safety test

For now, though, there still must be someone in the driver’s seat, so to speak. And it might be that way for a while for many different autonomous robots. Although we could theoretically hand over complete decision-making to the robot, this does raise a question, one that has also plagued driverless cars: “What happens if some of these activities go wrong?” says Attanasio. “What if the car has an accident?”

The general view, for now, is that keeping the humans ultimately in control is best—at least in a supervisory role, reviewing and signing off on procedures, and standing by in case of emergency.

Even so, proving to hospitals and regulators that autonomous robots are both safe and effective may be the single biggest roadblock to truly human-free robots entering the surgical suite. Experts have a few takes on how to get around this.

For instance, designers will likely need to be able to explain to regulators exactly how the robots think and decide what to do next, says Attanasio, especially if they progress to the point where they’re not just assisting a human surgeon but arguably practicing medicine themselves. That explanation may be easier said than done, though, since current artificial intelligence systems may leave observers few hints of how they make decisions. As a result, engineers may want to design with “explainability” in mind from the beginning.

Pietro Valdastri, a biomedical engineer at the University of Leeds in England and one of Attanasio’s co-authors, thinks it’s possible that no manufacturer will be able to easily solve the regulatory question, though he does have a work-around. “The solution here is to make a system that, even if it’s autonomous, is inherently safe.” This means the next generation of surgical robots may not resemble roadsters so much as bumper cars.

This soft robot, steerable by externally controlled magnets, is designed to snake deep into a patient’s lungs to view the tissue there. The robot navigates the narrow passages on its own, eliminating the need for X-rays to help guide a human operator. Credit: University of Leeds.

Valdastri is working on what are known as soft robots, particularly for colonoscopies. Traditionally, a colonoscopy requires snaking a flexible tube with a camera—an endoscope—through the intestine to look for early signs of colon cancer. The procedure is recommended for anyone over the age of 45, but it can take a long time and a lot of training for an operator to become proficient with the endoscope. With few properly trained operators to go around, wait lists have ballooned.

But using a smart robot that can steer itself would make the job much easier—like driving a car in a video game, Valdastri says. The doctor could then focus on the matter at hand: spotting early signs of cancer. And in this case, the robot, created from soft materials, would be inherently safer than more rigid devices. It may even reduce the need for anesthesia or sedation, says Valdastri, since it could more easily avoid pushing against the intestinal walls. And with no way for the robot to cut or zap anything on its own, it may be easier for regulators to accept.

As the technology develops, Opfermann suggests, autonomous robots may start out getting approval only for simpler tasks, such as holding a camera. As more of these basic jobs get approved, tasks may build up into an autonomous system. In cars, we first got cruise control, he says, but now there’s brake assist, lane assist, even assisted parking—all of which build toward something driverless.

“I think this will be kind of similar,” says Opfermann, “where we see small, autonomous tasks that eventually get chained together into a full system.”

This article originally appeared in Knowable Magazine on Sept. 13, 2022. Knowable Magazine is an independent journalistic endeavor from Annual Reviews, a nonprofit publisher dedicated to synthesizing and integrating knowledge for the progress of science and the benefit of society.


About The Author

James Gaines

James Gaines is a researcher for Knowable Magazine and a freelance science journalist living in Seattle. He is autonomous, but struggles to explain how or why.