Surgical Robotics

Robotic Telesurgical Workstation for Laparoscopy

Minimally Invasive Surgery (MIS) is a revolutionary approach in surgery. In MIS, the operation is performed with instruments and viewing equipment inserted into the body through small incisions created by the surgeon, in contrast to open surgery with its large incisions. This minimizes surgical trauma and damage to healthy tissue, resulting in shorter patient recovery times. Unfortunately, the approach also has disadvantages: the surgeon's dexterity and workspace are reduced, and sensory input is limited to a single video image.

In this joint project between the Robotics and Intelligent Machines Laboratory of the University of California, Berkeley (UCB) and the Department of Surgery of the University of California, San Francisco (UCSF), we are developing a robotic telesurgical workstation for laparoscopy. The workstation is a bimanual system with two 6 DOF manipulators instrumented with grippers, controlled by a pair of 6 DOF master manipulators.

With the telesurgical workstation, conventional surgical tools are replaced with robotic instruments under the direct control of the surgeon through teleoperation. The goal is to restore the manipulation and sensation capabilities that the surgeon loses in minimally invasive surgery: a 6 DOF slave manipulator, controlled through a spatially consistent and intuitive master, restores dexterity; force feedback to the master increases the fidelity of manipulation; and tactile feedback restores the lost tactile sensation.
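As a rough illustration of this kind of master-slave coupling, the sketch below (our own toy model, not the workstation's actual controller; the gains, the unit slave mass, and the linear tissue spring are all assumptions) drives the slave toward the master position with a PD coupling and reflects the contact force back to the operator's hand:

```python
def teleop_step(x_master, x_slave, v_slave, k_couple, b_couple, k_env, dt):
    """One control step of a position-forward / force-back coupling.
    The slave (unit mass) is driven toward the master position by a PD
    coupling; the environment is a linear spring; the contact force is
    reflected back to the master handle."""
    f_env = -k_env * x_slave                      # tissue reaction force
    f_drive = k_couple * (x_master - x_slave) - b_couple * v_slave
    v_slave += (f_drive + f_env) * dt             # unit slave mass: a = f
    x_slave += v_slave * dt
    return x_slave, v_slave, -f_env               # -f_env is felt at the master

# demo: the operator holds the master 10 cm into a 50 N/m tissue spring
x_s, v_s, f_fb = 0.0, 0.0, 0.0
for _ in range(5000):                             # 5 s at dt = 1 ms
    x_s, v_s, f_fb = teleop_step(0.10, x_s, v_s, 100.0, 20.0, 50.0, 0.001)
# x_s settles near k_couple * 0.10 / (k_couple + k_env), with the
# coupling and tissue springs in series, and the operator feels the
# steady tissue reaction force
```

Note that the operator feels the coupling spring in series with the tissue, which is one reason a stiff, well-damped coupling matters for fidelity.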

For more information please visit our Robotic Telesurgical Workstation for Laparoscopy page.

Our Second Generation Robotic Telesurgical System for Laparoscopy during tests in the Experimental Surgery Lab at UC San Francisco

Close-up View of the Laparoscopic Manipulators while Tying Knots in a Laparoscopic Training Box

Laparoscopic Manipulators

Current needle holders, graspers, and other surgical tools for minimally invasive surgery transmit the surgeon's hand motions through passive mechanics. As the instruments slide, twist, and pivot through the point at which they enter the body wall, they are four-degree-of-freedom manipulators. Consequently, the surgeon can reach points within a three-dimensional volume but cannot fully control orientation. For simple tasks this is not a major hindrance, but it makes complex skills such as suturing and knot tying extremely difficult. In addition, the instrument handles are anchored with respect to the patient, and it is difficult for the surgeon to align the video display with the camera and instrument axes, resulting in misleading perspective cues. To address these difficulties, we are designing multi-degree-of-freedom end effectors with an appropriate surgeon-machine interface to build laparoscopic manipulators that are more versatile and dextrous.
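The four-degree-of-freedom constraint can be made concrete with a small kinematic sketch (a simplification we introduce here; the port-centered parameterization and function name are ours, not the project's). Pan, tilt, and insertion depth place the tip anywhere in a 3-D volume around the incision, while roll about the shaft leaves the tip position unchanged, so tip orientation cannot be commanded independently:

```python
import math

def tool_tip(pan, tilt, insertion, port=(0.0, 0.0, 0.0)):
    """Tip position of a rigid instrument pivoting about the incision
    point (the port). Pan and tilt are in radians; insertion is the
    shaft length inside the body. Roll about the shaft does not appear:
    it moves no tip position, which is why a passive tool can position
    the tip but not set its orientation, and why wristed end effectors
    are needed."""
    dx = math.sin(tilt) * math.cos(pan)
    dy = math.sin(tilt) * math.sin(pan)
    dz = -math.cos(tilt)                  # shaft points into the body
    return (port[0] + insertion * dx,
            port[1] + insertion * dy,
            port[2] + insertion * dz)
```

A wrist at the tip (as in the roll-pitch-roll design pictured below) adds the two missing orientation freedoms inside the body.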

For more information please visit our Robotic Telesurgical Workstation for Laparoscopy page.

Laparoscopic Manipulator with a Roll-Pitch-Roll Wrist and a Gripper

Tendon Hand
Tendon Driven Multifingered Manipulator for Laparoscopy

Endoscopic Manipulator

An endoscope is a flexible tube, typically 70-180 cm long and 11 mm in diameter. Currently, endoscopic tools are positioned by sliding the endoscope in and out and by controlling the bending of its last 10 centimeters. The resulting bending radius is too large for some tasks, and bending tends to displace surrounding tissue, making positioning more difficult. The endo-platform, designed by Jeff Wendlandt, will allow finer positioning control for endoscopic tools. Closed-loop control of this device has been implemented; a movie of the endo-platform tracking a circle is available.
Endo-platform with Biopsy Forceps
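The circle-tracking demonstration can be caricatured with a first-order proportional servo (a toy model of ours; the real endo-platform controller, its gains, and its actuation are not shown here). The tool position chases a circular reference in the bending plane, and the steady tracking error shrinks as the loop gain rises:

```python
import math

def track_circle(kp=8.0, dt=0.01, steps=2000, radius=0.01, omega=1.0):
    """Proportional closed-loop tracking of a circular reference,
    a toy stand-in for the endo-platform's tracking controller.
    Returns the tracking error at the final step."""
    x = y = err = 0.0
    for i in range(steps):
        t = i * dt
        rx = radius * math.cos(omega * t)   # moving circular reference
        ry = radius * math.sin(omega * t)
        x += kp * (rx - x) * dt             # first-order servo toward it
        y += kp * (ry - y) * dt
        err = math.hypot(rx - x, ry - y)
    return err
```

With a first-order loop the residual error is roughly radius * omega / sqrt(kp^2 + omega^2), so a higher gain tracks the circle more tightly.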

Human Interfaces

We have developed a prototype glove-like device that senses the positions of the surgeon's fingers and wrist using index, thumb, and wrist flex sensors and a wrist rotation sensor. The glove provides a more natural means of control than current minimally invasive tools. It could be used as a master to drive the miniature slave robotic hand described above, if force feedback is not needed.
Prototype dextrous master

Virtual Environments Based Surgical Simulators

Virtual Environments for Surgical Training and Augmentation

The second application of this project is to develop a virtual reality simulator for minimally invasive surgery. Learning laparoscopic techniques is much more difficult for surgeons than learning open surgery procedures. Currently, surgeons are trained during actual operations or in the animal laboratory. Training in the operating room increases risk to the patient and slows the operation, resulting in greater cost. Animal training is expensive and cannot duplicate human anatomy. Computer-based training has many potential advantages. It is interactive, yet an instructor's presence is not necessary, so students may practice in their free moments. Any pathology or anatomical variation can be created. Simulated positions and forces can be recorded to compare with established performance metrics for assessment and credentialing. Students could also try different techniques and look at anatomy from perspectives that would be impossible during surgery.

In the context of this application, we are developing real-time finite element models for soft tissue behavior. We will incorporate these models in the VR simulations to generate a realistic environment for training. The visual and haptic displays developed in this project will be used as the human interface for high fidelity feedback.
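As a much-simplified stand-in for such real-time tissue models (a 1-D mass-spring-damper chain of our own devising, not the finite element formulation used in the project; all stiffness, damping, and geometry values are assumptions), the sketch below shows the kind of per-step update a haptic-rate simulation performs when a tool presses on tissue:

```python
def step_chain(x, v, k=200.0, c=5.0, rest=0.01, f_tool=0.0, dt=1e-3):
    """One explicit integration step of a 1-D chain of unit-mass nodes
    joined by spring-dampers. Node 0 is anchored; f_tool pushes on the
    free end. A much-simplified stand-in for a soft-tissue FE model."""
    n = len(x)
    f = [0.0] * n
    for i in range(n - 1):
        stretch = (x[i + 1] - x[i]) - rest
        fs = k * stretch + c * (v[i + 1] - v[i])
        f[i] += fs              # a stretched spring pulls the nodes together
        f[i + 1] -= fs
    f[-1] += f_tool
    x, v = x[:], v[:]
    for i in range(1, n):       # node 0 stays fixed
        v[i] += f[i] * dt       # unit mass: a = f
        x[i] += v[i] * dt
    return x, v

# demo: press on the free end of a 4-node chain and let it settle
x = [0.0, 0.01, 0.02, 0.03]
v = [0.0] * 4
for _ in range(20000):          # 20 s at dt = 1 ms
    x, v = step_chain(x, v, f_tool=1.0)
# each spring now carries 1 N, stretching by 1/200 m per spring
```

A real finite element model replaces the chain with a volumetric mesh and an assembled stiffness matrix, but the haptic loop structure (compute internal forces, integrate, return the reaction force) is the same.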

You can refer to our Virtual Environments for Surgical Training and Augmentation (VESTA) page, the home page of the OPTICAL project, and the home pages of the virtual reality and computer animation classes taught at UC Berkeley for related information.

Laparoscopic Interface

Laparoscopic Cholecystectomy Simulator

Berkeley Open Surgical Simulator Project

The objective of this effort is to study organ-level modeling and simulation for surgical applications. The point of the proposed research is to explore the feasibility of developing open source, open architecture models of different levels of granularity and spatio-temporal scale for a project that has been labeled the Digital Human project. While the emphasis of this program is on how the simulations we develop will allow for interconnections between individual organ simulations, and between different types of physical processes within a given organ, we will develop our tools on a specific test bed application: the construction of a heart model for the simulation of heart surgery.

Please visit the Berkeley Open Surgical Simulator Project page for more information.

Tactile Sensing and Stimulation

Tactile sensation is extremely important in open surgery to allow the surgeon to feel structures embedded in tissue. Important vessels and ducts are usually shrouded in connective tissue; their presence must be felt rather than seen to avoid damage. Tumors within the liver or colon must be removed without exposure that would allow the spread of cancerous cells. Teletaction allows sensing and display of tactile information to the surgeon. In teletaction, a tactile sensor array can be used to sense contact properties remotely. To provide local shape information, an array of force generators can create a pressure distribution on a finger tip, synthesizing an approximation to a true contact.
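The sensing-to-display pipeline can be sketched as a simple array mapping (illustrative only; the pressure range, command resolution, and function name are assumptions of ours, not the actual device calibration). Each sensed pressure is scaled and clipped to the display actuator's command range:

```python
def to_display(pressures, p_max=30.0, cmd_max=255):
    """Map a sensed pressure image (e.g. in kPa) to tactile display
    actuator commands, clipping to the display's output range.
    The scaling and units are illustrative, not a device calibration."""
    return [[min(cmd_max, max(0, round(p / p_max * cmd_max)))
             for p in row] for row in pressures]
```

In practice the mapping also has to compensate for the display's mechanics, but even this linear version conveys where and how hard the remote contact is.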

As capacitive tactile sensors are fairly mature, our recent work has concentrated on building practical, low-cost, moderate spatial resolution tactile displays. Our current tactile display (bottom figure) is molded as a single piece from silicone rubber. Its advantages are ease of fabrication, no air leakage, and no pin friction (A Compliant Tactile Display for Teletaction).

For further information, please see Teletaction Research.

Our Current Tactile Display Prototype

Bilateral Teleoperation

Minimally invasive surgery (MIS) can be of great benefit to the patient, but places great demands on the surgeon's perceptual motor skills. Teleoperation technology can restore some of the lost dexterity and sensation in MIS.

In this research project we studied teleoperation controller design for haptic exploration and telemanipulation of soft environments. Our research has three components: (a) experiments to determine human capability to discriminate changes in compliance displayed through a haptic interface, (b) analysis and design of teleoperator control algorithms to optimize the transmission of compliance, (c) experiments to evaluate operator performance using teleoperation systems in a task more representative of surgery, complementing the control design procedure. The paradigm used in all the cases is the ability to detect a change in compliance of a surface, as would occur due to a lesion or vessel embedded in soft tissue.

Our research has shown that humans' sensitivity to sinusoidal variations in compliance across a surface at high spatial frequencies is much better than their discrimination between two compliant surfaces. Based on this result, we have introduced a new measure of fidelity in teleoperation which quantifies the teleoperation system's ability to transmit changes in the compliance of the environment. This sensitivity function also incorporates the experimentally measured frequency-dependent compliance discrimination sensitivity of the human operator. The bilateral teleoperation controller design problem is then formulated in a task-based optimization framework as the optimization of this metric with constraints on free-space tracking and robust stability of the system under environment and human operator uncertainties. We have also used this analysis to evaluate the effectiveness of using a force sensor in a teleoperation system.
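A static caricature of why the controller limits transmitted compliance changes (our illustration; the actual metric in this work is frequency dependent and incorporates the measured human sensitivity, neither of which appears here): with a position-error force-feedback coupling, the operator feels the coupling spring and the environment spring in series, so the sensitivity of the felt stiffness to a change in tissue stiffness degrades as the coupling softens:

```python
def felt_stiffness(k_couple, k_env):
    """Steady-state stiffness reflected to the operator by a simple
    position-error force-feedback coupling: the coupling spring and
    the environment spring act in series."""
    return k_couple * k_env / (k_couple + k_env)

def compliance_sensitivity(k_couple, k_env, dk=1e-6):
    """Numerical d(felt)/d(k_env): how strongly a small change in
    tissue stiffness shows up at the operator's hand. Tends to 1 as
    the coupling stiffens, and to 0 as it softens."""
    return (felt_stiffness(k_couple, k_env + dk)
            - felt_stiffness(k_couple, k_env)) / dk
```

Analytically the sensitivity is (k_couple / (k_couple + k_env))^2, which is why a coupling much stiffer than the tissue is needed before small lesions become detectable through the teleoperator.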

Please refer to our related publications for more information.

High Fidelity Teleoperation Experimentation Setup

Teleoperation for High Fidelity Haptic Exploration in Soft Environments

Haptic Training

We are currently exploring the use of haptics, or force feedback, to effectively and accurately train tasks without the use of verbal or visual instruction. Haptic training allows the subject to obtain a "feel" for successfully accomplishing the task. This technique, combined with virtual environments, should be a useful training paradigm, particularly for complex tasks. We believe that the role of haptics in virtual environments can be extended well beyond that of simulation.

We have developed a virtual environment that uses haptic training to teach the use of the angled laparoscope (on the right). The environment uses a combination of haptic training strategies to promote the learning of this complex cognitive skill. For example, the subject's hand may be guided through the correct motion, or the subject's motion may be restricted so that success on the task is assured. Giving the subject a direct feel of successfully accomplishing the task bypasses many issues in verbal and visual training.
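A minimal sketch of the hand-guidance strategy described above (our simplification; the gains, saturation level, and names are assumptions, and the real trainer also implements motion-restricting constraints not shown here): pull the trainee's hand toward the demonstrated trajectory point with a saturated spring force, so they are guided rather than dragged:

```python
import math

def guidance_force(pos, target, k_guide=50.0, f_max=5.0):
    """Spring pull (N) toward the demonstrated trajectory point,
    saturated so the trainee is guided rather than dragged.
    pos/target are 2-D points; gains are illustrative."""
    ex, ey = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(ex, ey)
    if dist < 1e-12:
        return (0.0, 0.0)
    mag = min(k_guide * dist, f_max)   # saturate far from the path
    return (mag * ex / dist, mag * ey / dist)
```

Played back along a recorded expert trajectory, this force gives the trainee a direct feel for the correct motion; the complementary strategy projects out force components that would leave the path, restricting motion so success is assured.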

We have shown that haptics can be used to more effectively train basic perceptuo-motor skills that are believed to be relevant to the performance of a surgeon (Feygin et al. 2002). We are currently investigating whether haptic training can also be used to more effectively train high-level cognitive skills related to laparoscopic surgery.

Please refer to our related publications for more information.

Angled Laparoscope Simulator

Spatial Cognition in Surgery

The Role of the Hand Assist in Surgery

Laparoscopic techniques require surgeons to operate within a perceptually difficult environment using a translated two-dimensional image that projects unclear and ambiguous views of anatomical structures. In light of these limitations, a hand-assisted method has been developed. It allows a surgeon to perform "closed," insufflated surgery while still being able to haptically explore internal tissues with the non-dominant hand in vivo. Physicians claim that the hand-assist method enhances their abilities, possibly because tactile cues help them generate a better and more accurate three-dimensional representation of the anatomy. We are currently running a series of experiments to test this claim empirically by examining the potential benefits of the hand-assisted technique. The studies use a simulation of the laparoscopic environment with and without the hand present in order to establish whether haptic cues significantly improve laparoscopic performance. If the data reveal a benefit, the finding can be extrapolated to real-world medical applications.

For more information please visit the Spatial Cognition in Surgery page.

Last updated: January 2002. MCC.