A Brave New World of Medicine, Robotic Surgery, Nears Reality

This article is from the archive of The New York Sun before the launch of its new website in 2022. The Sun has neither altered nor updated such articles but will seek to correct any errors, mis-categorizations or other problems introduced during transfer.

The New York Sun

Talk about a major step in human-cyborg relations: Biomedical engineers at Duke University are building a robot surgeon that will have the potential to operate in remote locations with little or no human guidance.

To science-fiction fans, this development carries tremendous appeal, as robots equipped with artificial intelligence may soon be able to travel virtually anywhere and perform sophisticated surgeries on patients — even treating astronauts in orbit.

There are also more immediate — and less fantastic — implications. Robots could roll across battlefields and operate on wounded soldiers when Army medics aren’t available. Even routine surgeries could become safer and more precise because of a robot’s ability to make more exact surgical cuts.

The secret to the Duke team’s success is a test robot that pairs three-dimensional ultrasound technology with an artificial intelligence program. This is the first time a real-time, 3-D ultrasound scanner has been used in this sort of operating room scenario.

“What’s remarkable is that the robot can see within the organ tissue,” the project’s chief, Stephen Smith, a professor of biomedical engineering at Duke, said. “Up until now, you could only see what was in front of a conventional camera during an operation. With 3-D imaging, you can see all parts of an organ.”

The Duke team’s robot carries a 3-D ultrasound scanner that conveys 30 three-dimensional images a second to a computer processor. In the case of, say, prostate surgery, the computer collects the images and constructs a real-time picture of the prostate gland. It then sends images of the entire organ to both the robot and the surgeon.

After the computer processes the images, it sends commands to the robot. So far, the scientists have had the robot complete two important tasks. First, it touched the tips of two needles together within a 2-millimeter margin of error, a tolerance small enough to satisfy biomedical engineers. Second, in a simulated biopsy (extracting a small tissue sample with a needle), the team directed the robot to insert the tip of the needle into a simulated tumor; the robot succeeded, placing the needle at or near the exact center of the tumor almost every time.

This is a major step forward from the last cutting-edge advance in surgery, endoscopes, said Mr. Smith, who built this robot in three months with the help of two graduate students and one undergraduate. Endoscopes are cameras that are inserted into the body to allow surgeons to see the organs, enabling them to perform minimally invasive operations.

“The endoscopic camera is a lot like a conventional camcorder,” Mr. Smith said. “The endoscope simply takes pictures of whatever it’s placed in front of because light can’t penetrate into tissue itself.”

Mr. Smith said that in the near future, a surgeon could receive the ultrasound images from the robot and direct it to conduct surgery. Eventually, with better hardware, the doctor would be better informed, able to see deep into the tissue before making the first incision.

This new research isn’t completely revolutionary. Three-dimensional imaging technology is widely available commercially. What is different, Mr. Smith said, is that his project marks one of the first instances of applying 3-D ultrasound technology to this combination of artificial intelligence and robotics.

According to Mr. Smith, the most immediate application of the Duke team’s research is a stand-alone robot that can treat wounded soldiers caught in battle. In fact, the Pentagon is closely watching this latest project, titled “Autonomous Surgical Robotics Using Three-Dimensional Ultrasound Guidance: A Feasibility Study.”

To push the research forward, Mr. Smith is seeking private-sector funding, although companies that manufacture surgical robotic devices have yet to contact him. In any case, he predicts the use of autonomous robotic surgeons will become widespread in about a decade.

Richard Satava, a professor in the Department of Surgery at the University of Washington and the senior medical advisor for the Army’s Medical Research Command at the Pentagon, said the eventual introduction of robotic surgeons with artificial intelligence is inevitable. “It’s a matter of when,” he said. “You have to ask when the amount of funding is going to be available to take the technology to where it needs to be, and I’m not sure the private or public sector is ready yet.”

As for Mr. Smith’s study, “What’s so promising about this scenario is that a robot will be able to do it in one-tenth the time and be 10 times as precise as a human surgeon,” Dr. Satava said. “The new generation of young surgeons is champing at the bit to move forward with robotics.”



