Duke University researchers say their feasibility studies may represent the first concrete steps toward achieving such a space age vision of the future.
For their experiments, the engineers used a rudimentary tabletop robot whose "eyes" were a 3-D ultrasound probe. An artificial intelligence program served as the robot's "brain," taking in real-time 3-D ultrasound data, processing it and giving the robot commands to carry out.
"In a number of tasks, the computer was able to direct the robot's actions," said Stephen Smith, director of the university's Ultrasound Transducer Group. "We believe this is the first proof-of-concept for this approach.
"Given that we achieved these early results with a rudimentary robot and a basic artificial intelligence program, the technology will advance to the point where robots, without the guidance of a doctor, can someday operate on people."
The research appears online in the journal IEEE Transactions on Ultrasonics, Ferroelectrics and Frequency Control. A second study, published in the April issue of the journal Ultrasonic Imaging, demonstrated the robot could successfully perform a simulated needle biopsy.