
Virtual reality and augmented reality: Advances in surgery

David B Douglas

Department of Radiology, Stanford University, Palo Alto, CA, USA


Clifford A Wilke

Federal Health Segment, DXC Technology, Herndon, VA, USA

David Gibson

Digital Context Aware, Austin, TX, USA

Emanuel F Petricoin

Center for Applied Proteomics and Molecular Medicine, George Mason University, Manassas, VA, USA

Lance Liotta

Center for Applied Proteomics and Molecular Medicine, George Mason University, Manassas, VA, USA

DOI: 10.15761/BEM.1000131


Abstract

Nearly 3% of the U.S. gross domestic product is spent on surgical care. Many are hopeful that technological advances will reduce morbidity and lower costs. One technology beginning to play a role in many different surgical procedures is augmented reality (AR), in which a virtual image supplements the real-world scene to enhance surgery. In this article, we discuss the role of AR in pre-operative planning and the types of AR used in surgery. We examine the challenges of bringing AR into the operating room, such as meeting surgeons' requirements for an ergonomic environment in which to execute precise movements. Despite the challenges, many studies are showing significant benefits of the virtual image to the surgeon. As a result, more and more surgeons are attempting AR enhanced operations, including complex neurosurgical procedures.

Key words

Augmented Reality, Virtual Reality, 3D imaging, D3D, Radiology

Introduction

According to 2015 data, the United States spends 17.2% of its gross domestic product on healthcare, and 32.3% of that is spent on hospital stays [1]. Nearly one third of hospital stays include a surgery, and those procedures and post-operative care account for almost half of hospital costs [2]. Thus, nearly 3% of the nation's GDP is spent on care involving surgical procedures. Naturally, there is a drive to improve efficiency in the operating room, not only to improve patient outcomes but also to bring down costs. Many factors play a role in determining hospital pricing, including the imaging and laboratory work-up, medications, operating room instruments, post-surgical care, labor and specialists' fees [3]. Any technological advance that decreases the morbidity of a procedure, improves outcomes and decreases length of hospital stay will ultimately also benefit a given hospital's bottom line. Technologies that hold promise in accomplishing these aims include virtual reality (VR) and augmented reality (AR) enhanced surgery.

VR technologies can be characterized as non-immersive (e.g., desktop computers), semi-immersive or fully immersive [4]. In fully immersive VR, the head mounted display (HMD) presents a virtual image that completely occludes the real world from the user's field of view, as in the Oculus Rift and HTC Vive [5]. In semi-immersive VR, the HMD presents a virtual image that only partially occludes the real world, as in the Samsung Gear VR [5]. In VR, the user can maneuver through the virtual world by head movements (via HMD tracking) or by walking (via external camera tracking), and can interact with the virtual environment through handheld devices with haptic feedback, voice commands or gestures. However, challenges of VR include inaccurate head tracking and motion sickness [6].

AR technologies can be characterized as either AR or mixed reality (MR). Both AR and MR display a virtual image and a real-world image simultaneously, allowing the user to interact with both at once [7]. In both technologies, the user wears an HMD that displays the virtual image together with the real-world image. In AR, the virtual image is transparent, like a hologram, as in the Meta and DAQRI systems. In MR, the virtual image appears solid, as in the Microsoft HoloLens.

Surgeons are increasingly recognizing the benefits of VR/AR. The objectives of this article are to review the VR/AR technologies and techniques used in surgery, to illustrate by example the use of VR/AR in pre-operative planning, and to illustrate the state-of-the-art display techniques available to surgeons. The paper is divided into three sections. First, we discuss the role of VR/AR in pre-operative planning. Second, we discuss the role of VR and AR during surgery. Third, we conclude with a discussion of future uses of VR/AR in the field of surgery.

Pre-Operative planning

Pre-operative planning can be extremely complex. A surgeon must take into account the patient's medical condition (e.g., age, co-morbidities, vital signs and laboratory results), the anatomy based on physical examination, the patient's prior surgeries (if any), and pre-operative imaging to determine the surgical approach.

Vignette

A brain aneurysm is a balloon-like outpouching extending from a small blood vessel in the brain. The anatomy of the blood vessels of the brain is extremely complex and varies from one patient to the next, posing an additional challenge. If the aneurysm ruptures, the hemorrhage into the brain can manifest as a severe headache, coma or death. During the pre-operative planning period, the neurosurgeon must review all of the patient's pertinent clinical information and determine how best to care for the patient, with surgical options including open surgery or closed endovascular surgery. In an open surgery, the surgeon makes an incision in the skin, performs a craniotomy through the skull, dissects down to the aneurysm and places a clip as a closure device so that the ruptured aneurysm sac stops bleeding. Alternatively, the neurosurgeon can perform a closed endovascular procedure by accessing the femoral artery in the leg and advancing a catheter up to the brain, where the aneurysm can be treated with a coil patch, which is analogous to patching a tire from the inside. How should the surgeon proceed?

Many times, such a decision relies on analysis of the precise anatomy of the aneurysm. Is it physically accessible from an open approach, or is it in a location that is too hard to reach? How far and how deep will the dissection need to be? How wide is the neck of the aneurysm sac? Is it narrow, so that the coil patch has good purchase and can reliably stay in place? Or is it so wide that the coil patch risks slipping out, either allowing the aneurysm to bleed or causing a stroke by blocking a vessel further downstream? If there are multiple aneurysms, how does this affect the approach? Many questions run through the neurosurgeon's mind when deciding on the surgical approach.

Given the high stakes, complex anatomy and multiple possible treatment approaches, good pre-operative planning is essential. Often the most critical component of the planning is review of the pre-operative diagnostic imaging, so it is of paramount importance that the surgeon has the best possible image quality and viewing methods. During the pre-operative imaging assessment, the surgeon can view the patient's images with conventional methods (i.e., the axial, sagittal and coronal viewing planes) or with advanced imaging methods, including VR/AR.

Conventional imaging-aided pre-operative planning

Conventional imaging includes the axial, sagittal and coronal imaging planes. The surgeon reviews these 2D images on a high-resolution display monitor and mentally constructs a 3D image, which can be quite challenging [8,9]. As the complexity of the anatomy increases, it becomes increasingly difficult to mentally construct the 3D image; yet the surgeon's mental map of the anatomy is critical for successful operations. The surgeon must also memorize the anatomy so that its precise details can be recalled later in the operating room. Thus, there has been an increasing trend toward the use of advanced imaging in pre-operative planning.
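
As a concrete illustration of the conventional viewing planes, the short Python sketch below extracts axial, coronal and sagittal slices from a volume stored as a NumPy array. The (slice, row, column) array layout and the placeholder volume are assumptions for illustration, not a description of any specific viewer.

```python
# Minimal sketch: extracting the three conventional viewing planes from a CT/MR
# volume stored as a NumPy array. The (slice, row, column) layout is an assumption.
import numpy as np

volume = np.random.rand(200, 512, 512)           # placeholder volume: 200 axial slices of 512 x 512

def conventional_planes(vol, z, y, x):
    """Return the axial, coronal and sagittal slices through voxel (z, y, x)."""
    axial    = vol[z, :, :]      # fixed slice index
    coronal  = vol[:, y, :]      # fixed row index
    sagittal = vol[:, :, x]      # fixed column index
    return axial, coronal, sagittal

ax, co, sa = conventional_planes(volume, z=100, y=256, x=256)
print(ax.shape, co.shape, sa.shape)              # (512, 512) (200, 512) (200, 512)
```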

AR/VR aided pre-operative planning

There are some FDA-approved VR/AR systems for viewing medical images. A quick look at the Brainlab Inc. website will give the reader a feel for the state of the technology. Navigation is a critical element of operations involving complex anatomy, and numerous approaches to enhancing that aspect of surgery have been documented. More can be done. As an example, VR/AR is currently undergoing research by DXC Technology (formerly Hewlett-Packard) and D3D Enterprises. As opposed to volume rendering, which presents a 3D image on a 2D monitor, depth-3-dimensional (D3D) imaging provides a unique image to each eye with accurate depth perception, together with head tracking and a joystick interface. This improves the human-machine interface [10-13]. See Figure 1.
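
As a rough illustration of the per-eye rendering described above (and shown in Figure 2), the Python sketch below projects the same 3D points through two pinhole cameras separated by an interpupillary distance. The point cloud, focal length and IPD are illustrative values only; this is not the D3D implementation.

```python
# Minimal sketch of per-eye rendering: the same 3D points are projected through two
# pinhole cameras separated by the interpupillary distance, giving a distinct image
# for each eye. All numbers are illustrative, not D3D internals.
import numpy as np

points = np.array([[ 0.00, 0.00, 0.50],          # points in the viewer frame (metres), z = depth
                   [ 0.02, 0.01, 0.40],
                   [-0.03, 0.02, 0.70]])
ipd = 0.063                                      # typical interpupillary distance (m)
f = 800.0                                        # focal length in pixels

def project(points_xyz, eye_offset_x):
    """Pinhole projection after shifting points into the chosen eye's frame."""
    shifted = points_xyz - np.array([eye_offset_x, 0.0, 0.0])
    u = f * shifted[:, 0] / shifted[:, 2]
    v = f * shifted[:, 1] / shifted[:, 2]
    return np.stack([u, v], axis=1)

left  = project(points, -ipd / 2)                # left-eye image coordinates
right = project(points, +ipd / 2)                # right-eye image coordinates
print(left[:, 0] - right[:, 0])                  # horizontal disparity (pixels) decreases with depth
```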

Figure 1. Overview of the geometry used in generating the Depth-3-Dimensional (D3D) images. Reprinted from Journal of Medical Devices: Evidence and Research, Volume 9, Douglas et al. " D3D augmented reality imaging system: proof of concept in mammography", 277-283, 2016 with permission from Dove Medical Press Ltd [13].

Figure 2. Depth-3-Dimensional (D3D) showing the cerebral vasculature, which would be viewed with either VR or AR.  Note the image on the left is left eye viewing perspective and the image on the right is the right eye viewing perspective, which provides depth perception. This depth perception cannot be appreciated on the paper format of this journal and requires a VR or AR headset. The red boxes represent the 3D cursors used.  The white arrows point to middle cerebral artery branches, which course at varying depths. 

Figure 3. In a conventional operating room, the surgeon alternates the direction he or she is looking.  For example, the surgeon looks down at the patient and then looks up across the room to the image. Note that the diagnostic image shown is a volume rendered image of the bones centered on the cervical spine.

Figure 4. Illustration of projection-type AR enhanced surgery. (A) Illustration of a projector projecting an image onto the patient's skin. (B) Illustration of the projected image on the patient's skin. A requirement for this technique to be effective is accurate registration of the patient's anatomy with the projected image.

Figure 5. Illustration of heads up display (HUD) type AR enhanced surgery. Note that the surgeon looks through a mounted HUD to see both the real and virtual images [47]. 

Figure 6. Illustration of operating microscope type AR enhanced surgery. Note that the microscope is positioned above the patient as would naturally be done in surgery.  The image that is viewed through the microscope includes simultaneous display of both a real and virtual image. 

Figure 7. Illustration of HMD type AR enhanced surgery.  As the surgeon's head moves, both the real image and virtual image should move together in synchrony with accurate registration.

Advantages of VR in pre-operative planning include enhanced viewing of the complex anatomy. The surgeon gains an immersive view of the anatomy with true 3D depth perception. With external cameras, the surgeon could take a virtual walk through the patient's anatomy to understand it better, noticing finer details that would otherwise be missed with conventional imaging. Furthermore, given the immersive experience, the surgeon may be better able to memorize the complex anatomy and recall it in the operating room.

Segmentation is an important step in the AR/VR process. Through segmentation, organs can be isolated and individually examined. Segmentation can also help the surgeon visualize possible trajectories to pursue in the area being operated on. Possible dissection pathways can be color coded, compared and evaluated, assisting in simulation of the selected trajectory. False color can be added to flag high-risk regions that require caution, such as major vascular structures delineated through the segmentation process.
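
As a minimal illustration of the kind of processing involved, the Python sketch below segments a bright structure from a synthetic CT volume by thresholding and connected-component analysis. The threshold, the synthetic data and the single-structure assumption are illustrative only and do not represent a clinical segmentation pipeline.

```python
# Minimal sketch of threshold-plus-connected-component segmentation, the kind of step
# that isolates a structure (here, bright contrast-filled "vessel" voxels) so it can
# later be colour coded in a VR/AR scene.
import numpy as np
from scipy import ndimage

ct = np.random.normal(40, 20, size=(64, 128, 128))         # placeholder CT volume (HU-like values)
ct[30:34, 60:70, 60:70] = 400                               # synthetic high-attenuation "vessel"

mask = ct > 200                                             # keep voxels above a contrast threshold
labels, n = ndimage.label(mask)                             # split the mask into connected components
sizes = ndimage.sum(mask, labels, index=range(1, n + 1))    # voxels per component
largest = labels == (np.argmax(sizes) + 1)                  # keep the largest component as the structure

print(n, "components; largest has", int(largest.sum()), "voxels")
```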

Multiple studies have demonstrated that virtual reality can improve training for and performance of intra-operative procedures. Surgical residents using virtual reality training for gallbladder removal showed not only a lower error rate but also shorter surgical time [14]. This translates into less time under general anesthesia and improved patient safety. One criticism of VR training has been that skills learned in the virtual setting do not necessarily transfer to an actual surgical case. However, two studies of 20 novice surgical trainees showed that basic surgical knot tying and other basic laparoscopic skills could be significantly improved using only virtual reality simulators [15,16]. Again, this saves patients time under general anesthesia, improves safety in the training setting and reduces cost.

In the neurosurgical setting, very small or even microscopic distances can have profound impacts on critical outcomes, such as language processing and speech. The use of a virtual reality simulator has been shown to improve the learning curve for more complicated neurosurgical tasks, such as ventricular cannulation, again highlighting potentially improved patient outcomes [17].

The advantages of AR in pre-operative planning are similar to those of VR, including depth perception, with the added ability to simultaneously view a real-world image. The real image could include a dedicated pre-operative planning laboratory or even the actual patient prior to surgery. VR/AR with D3D technology would be well suited to pre-operative planning provided that adequate image resolution, depth of field, depth of focus, field of view (FOV) and position tracking are achieved [5].
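
As a back-of-the-envelope illustration of the resolution requirement, the snippet below estimates angular resolution (pixels per degree) from a display's horizontal pixel count and field of view. The numbers are assumptions, not the specification of any particular headset.

```python
# Small sketch of one display requirement mentioned above: angular resolution, i.e.
# how many display pixels cover each degree of the field of view.
horizontal_pixels = 1080          # pixels per eye (illustrative)
fov_degrees = 90.0                # horizontal field of view per eye (illustrative)

pixels_per_degree = horizontal_pixels / fov_degrees
print(round(pixels_per_degree, 1), "pixels per degree")   # ~12 ppd; 20/20 visual acuity is ~60 ppd
```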

VR/AR technologies and techniques used to enhance surgical operations

Surgical procedures enhanced by VR/AR can be viewed as 'mixed procedures'. They are mixed in that, on the one hand, the patient is physically present in front of the surgeon and, on the other hand, there are VR/AR aspects that rely on data to generate the VR/AR environment. This creates both challenges and opportunities. The challenge is how the surgeon can best integrate the real and virtual environments. The opportunity is a potentially safer and more efficient operation.

In recent years, VR systems have helped the surgeon navigate during the course of the operation. Some systems show, in near real time, the location of surgical instruments with respect to the patient's anatomy. Example systems present 3D volume renderings on a 2D display, which the surgeon can refer to during the operation. However, during surgery the surgeon's attention is primarily focused on the real image of the operating field, so AR is an excellent option given its simultaneous presentation of the virtual and real images.

AR has also been used in a wide variety of surgical procedures and is gaining significant popularity as it has helped make surgical interventions easier, faster and arguably safer [18]. Augmented reality has already been used for procedures such as robotic liver resection, cholecystectomy (removal of the gallbladder), other laparoscopic procedures and even neurosurgical procedures [18,19]. The anatomy of the human body is complex and varies from one person to the next. Understanding the patient's unique anatomy is a major challenge in surgery and an opportunity for enhancement with image guidance; specifically, AR can deepen the surgeon's understanding of the patient's anatomy during the surgery.

There have been several excellent review articles on VR and AR as these technologies apply to planning and conducting surgery, including those by Kersten-Oertel [20], Nicolau [21], and Meola [22]. Table 1 first provides a brief summary of requirements associated with VR/AR as applied to surgical procedures and then outlines the technologies involved in key elements of a VR/AR system [23-38]. Table 2 provides examples of basic techniques by which VR/AR enriches the surgeon's understanding of how best to proceed with the operation. References are provided for those interested in further investigation [23,24,26,28,29,31,39-48].

Table 1. Requirements and Technologies Associated with Virtual Reality/Augmented Reality Support of Surgical Procedures

Tasks
  • Requirements/Technologies: navigate through a virtual scene; use cutting planes; rotate and translate objects; toggle object visibility on/off; change opacity and colors
  • References: Soler [23], Soler [24], Grimson [25], Trevisan [26]; numerous others

Interactive Tools/Methods
  • Requirements/Technologies: keyboard and mouse; 3-button mouse and Space Mouse (6 DoF); HMD with pointer; 3D pointer; speech- and gesture-based interaction; virtual mirror
  • References: many commercial systems; Splechtna [27], Salb [28], Katic [29], Sudra [30], Bichlmeier [31]

Displays
  • Requirements/Technologies: monitor with live (or recorded) video feed; large-screen display; surgical microscope (including head-mounted); head-mounted display; stereo microscope; projectors that project the image directly on the patient; see-through displays
  • References: many commercial systems; Edwards [32], Tang [33], Ghanai [33]

Tracking
  • Requirements/Technologies: infrared optical with reflectors; tag video; structured-light optical; electromagnetic
  • References: commercial systems (available for over 15 years); Hostettler [34], Harders [35], Nicolau [36], Nicolau [37], Albitar [38]

Table 2. Techniques Associated with Virtual Reality/Augmented Reality Support of Surgical Procedures

Anatomy
  • Techniques: wire frame or mesh; isotropic risk potential and anisotropic tissue field
  • References: numerous; Salb [32]

Visualization
  • Techniques: transparency of objects between surgeon and target; occlusion to aid in the perception of the order of objects; linear gradient texture (light closer to the viewer, dark further away); lighting and shading cues
  • References: Bichlmeier [39], Fuchs [40], Wimmer [41]; numerous others

Locate Object
  • Techniques: camera image augmented with wireframe preoperative models; preoperative models of vessels/regions of interest displayed on the patient
  • References: Fuchs [40], Paolis [42]

Target Marking
  • Techniques: colored points to mark the target; different colors for hardness of the target; target registration error (e.g., 95% confidence ellipsoid)
  • References: Wagner [43], Suzuki [44], Linte [45]

Navigate
  • Techniques: yellow lines for the planned osteotomy and a blue line for the saw tool; planned trajectories for bone cutting, boreholes and biopsy
  • References: Wagner [43], Worn [46]

Distance data
  • Techniques: surgical tool-to-tumor distance shown as a bar graph; numerical distance to the tumor; surgical needle changes color when pointed at the target; dynamic sphere for distance of tool to target; change the target from solid to wireframe at a specified distance; color code objects in the field of regard
  • References: Kawamata [47], Soler [23], Soler [24], Trevisan [26], Birkfellner [48], Katic [29]
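
As a simple illustration of the distance cues listed under "Distance data" in Table 2, the Python sketch below computes the tracked tool-tip-to-target distance and maps it to an overlay colour. The positions and thresholds are invented for illustration and are not taken from any cited system.

```python
# Illustrative distance cue: compute the tool-tip-to-target distance and choose a
# colour for the AR overlay based on simple, made-up thresholds.
import numpy as np

def distance_cue(tool_tip_mm, target_mm, caution_mm=10.0, stop_mm=3.0):
    """Return the tool-target distance (mm) and a colour for the AR overlay."""
    d = float(np.linalg.norm(np.asarray(tool_tip_mm) - np.asarray(target_mm)))
    if d <= stop_mm:
        colour = "red"        # at or inside the target region
    elif d <= caution_mm:
        colour = "yellow"     # approaching the target
    else:
        colour = "green"      # safe distance
    return d, colour

print(distance_cue([12.0, 5.0, 40.0], [10.0, 4.0, 35.0]))   # (~5.5 mm, 'yellow')
```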

In this section, we will review the conventional operating room setup and four types of AR enhanced surgeries, including projection type AR, heads up display (HUD) type AR, operating microscope type AR and head mounted display (HMD) type AR. 

Conventional operating room setup

The conventional operating room is equipped with an operating bed with overhead surgical lighting, a table for equipment, an anesthesia unit, multiple areas for storage, and a diagnostic imaging station. The diagnostic imaging station is typically positioned across the room from the operating table and the surgeon. Throughout the operation, the surgeon alternates between looking at the operating field and looking at the diagnostic imaging monitor across the room; thus, surgeons perform innumerable gaze switches - look up, look down, look up, look down, and so on.

Projection type AR enhanced surgery

In projection-type AR, an image (e.g., a CT scan) is projected by means of a mechanical arm and beamer directly onto the patient (e.g., the skin surface) to co-display the virtual and real-world images. Ready-to-project images can be prepared within 10 minutes [18]. Landmarks such as the patient's umbilicus (belly button) have been used for registration [18]. The virtual image can be altered to display different anatomical elements, such as bone, blood vessels or internal organs. With a precise understanding of where the patient's organs lie beneath the skin, the skin punctures for placement of laparoscopic instruments or robotic trocars can be positioned more precisely, so that extreme movements of the instruments are avoided and patient safety is improved. If the abnormality the surgeon is aiming to operate on is small, the overlaid images can be extremely beneficial for navigation. Furthermore, precise dissection through solid organs helps save as much normal tissue as possible.
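
To make the registration step concrete, the Python sketch below performs landmark-based rigid registration (a standard Kabsch/Procrustes solution) between model and patient fiducials and reports the mean fiducial registration error. The landmark coordinates are invented, and this is only one way such an alignment could be computed, not the method of any specific commercial system.

```python
# Hedged sketch of landmark-based rigid registration: find the least-squares rotation
# and translation mapping pre-operative model landmarks onto tracked patient fiducials.
import numpy as np

def rigid_register(model_pts, patient_pts):
    """Kabsch/Procrustes rigid transform mapping model points to patient points."""
    mc, pc = model_pts.mean(axis=0), patient_pts.mean(axis=0)
    H = (model_pts - mc).T @ (patient_pts - pc)                  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = pc - R @ mc
    return R, t

model   = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0], [0.0, 80.0, 0.0], [0.0, 0.0, 30.0]])
patient = np.array([[0.1, -0.2, 0.0], [49.8, 0.3, 0.1], [0.2, 80.1, -0.3], [-0.1, 0.2, 30.2]])
R, t = rigid_register(model, patient)
fre = np.linalg.norm((model @ R.T + t) - patient, axis=1).mean()  # fiducial registration error
print(round(float(fre), 2), "mm mean fiducial registration error")
```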

In one study by Besharati Tabrizi et al., a series of 10 brain tumors of different sizes and locations were visualized by projection type AR with an image projected directly onto the patient's skin [19]. The video projector was used in concert with 5 fiducial markers for spatial registration, with a mean image registration time of 3.8 minutes and a mean projection error of 0.8 mm +/- 0.25 mm [19]. In this study, there was no significant difference between the accuracy of the projection type AR system and standard neurosurgical navigation systems [19]. There is, however, a significant problem with this technique: parallax. In a letter to the editor of the Journal of Neurosurgery, Ferrari points out that the surgeon's viewing angle cannot be coincident with that of the projected image, and thus parallax occurs [22]. The parallax error increases with the depth of the region being operated on and with surgeon/projector misalignment. However, other approaches have been used to correct for this parallax problem [50,51].
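
A back-of-the-envelope sketch of this parallax effect is given below: if the overlay is registered at the skin surface, a target at depth d appears shifted by roughly d·tan(θ) for a viewing direction that is off the projector axis by θ. This simplified geometric model is an assumption for illustration, not the error model reported in the cited studies.

```python
# Simplified parallax model: apparent shift of a target at depth d below the skin
# when the surgeon's line of sight is off the projector axis by theta.
import math

def parallax_shift_mm(depth_mm, view_angle_deg):
    return depth_mm * math.tan(math.radians(view_angle_deg))

for depth in (10, 30, 50):                      # target depth below the skin (mm)
    for angle in (5, 15, 30):                   # surgeon/projector misalignment (degrees)
        print(f"depth {depth} mm, angle {angle} deg -> shift {parallax_shift_mm(depth, angle):.1f} mm")
```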

Heads up display (HUD) type AR enhanced surgery

In HUD type AR enhanced surgery, the surgeon looks into a HUD presenting both the real and virtual images. Some HUDs contain semi-transparent screens, so that a portion of the natural light from the real scene is seen along with the virtual image displayed as a hologram [52]. Other methods use a digital display that shows both a real-world image provided by a video camera and the virtual image on the same screen.

In one study by Marker et al., a total of 23 bilateral paravertebral sympathetic nerve plexus injections were performed under HUD type AR guidance in the thoracic, lumbar and hypogastric regions [53]. In this study, 46 of 46 (100%) injections were on target. The mean error at the needle tip was 3.9 mm +/- 1.7 mm, and no critical non-target structures were hit.

Operating microscope type AR enhanced surgery

In operating microscope type AR enhanced surgery, the surgeon views both the real image and the virtual image through the operating microscope. The operating microscope is commonly used in a wide variety of surgeries today, in fields including neurosurgery, ophthalmology, otolaryngology and plastic surgery. The AR display system can be integrated into an operating microscope, which surgeons are already accustomed to using during operations [32]. Because the operating microscope is mounted on a stable gantry, changes in the relative positions of the surgeon, microscope and patient are minimized, and the requirement for real-time head position tracking is reduced [48].

Since the operating microscope is already accepted in the operating room and its position is relatively fixed, it is an excellent vehicle for further introducing AR into the field of surgery [48]. In a study by Raya et al., the traditional optical operating microscope was replaced with a digital operating microscope [54]. The neuronavigator's virtual image was overlaid onto the real image of the surgical field of view to assist the neurosurgeon in the operation, and the initial trials performed in a laboratory were successful.

Head mounted display (HMD) type AR enhanced surgery

Prior to discussing HMD type AR, we provide a brief background on the surgeon's visual environment in the operating room. Surgeons are required to wear protective eyewear during surgery to maintain a sterile operating field and to protect against splashes of blood or bodily fluids into the eye. Some surgeons wear eyeglasses to meet both personal vision requirements and the protection/sterility requirement. For finer detail, many surgeons use magnifying surgical loupes, which provide greater detail of small anatomical structures. Thus, all surgeons are accustomed to using eyewear while operating.

In HMD type AR enhanced surgery, the surgeon uses an HMD with a computer-generated virtual image superimposed over the real-world scene of the patient's anatomy during the surgery [48]. One requirement is that the virtual image and the real-world scene of the patient's anatomy must be appropriately registered, with their positions aligned [48]. This requires accurate tracking of the position and orientation of the AR user's HMD with respect to the real-world scene [48]. This enhances the surgeon's visualization of surgical anatomy [55] and enables the surgeon to focus on the operating field without the hindrance of switching back and forth to the monitor displaying the radiology images.

In HMD type AR, the user wears an HMD containing an AR display system, which has historically been bulkier than surgical loupes. Early HMDs displayed the virtual image and the real-world image in different focal planes, so both could not be in focus at the same time [48,56]. More complex systems utilizing a video camera solved the problem of differing focal planes but suffered from heavier weight, parallax effects and lower quality of the real-world image [57]. Fortunately, these problems have been addressed through the use of miniature head-mounted binocular AR systems, in which the virtual and real-world images can be merged [48].

HMD type AR faces two key challenges. The first is surgeon acceptance of the HMD in the operating room. Surgeons are accustomed to wearing glasses, but experience suggests that bulky HMDs will not be accepted [18]. Thus, making HMDs lighter and smaller is critical for advancing the field of HMD type AR surgery. The second is the requirement for synchronized head position tracking so that the virtual image displayed to the user aligns with the real-world image. Preserving alignment is necessary for surgical precision and for the prevention of motion sickness [58]. Associated with this synchronization is the challenge of rendering a new virtual image at a frame rate adequate to match the changing real-world scene. Reference or fiducial markers have traditionally been used to optimize registration of the two co-displayed images.
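
A minimal sketch of the alignment step is shown below: each frame, the tracked head pose (a rigid transform) brings a world-anchored virtual point into the HMD frame before rendering, and the frame budget indicates how little time is available for tracking plus rendering. The pose values and the 60 Hz figure are illustrative assumptions, not the parameters of any named headset.

```python
# Per-frame alignment sketch: apply the tracked head pose to a world-anchored virtual
# marker so the overlay stays registered to the real-world scene.
import numpy as np

def world_to_hmd(point_world, R_world_to_hmd, t_world_to_hmd):
    """Apply the tracked head pose (rigid transform) to a world-space point."""
    return R_world_to_hmd @ point_world + t_world_to_hmd

theta = np.radians(10.0)                                    # example yaw of the surgeon's head
R = np.array([[ np.cos(theta), 0, np.sin(theta)],
              [ 0,             1, 0            ],
              [-np.sin(theta), 0, np.cos(theta)]])
t = np.array([0.0, -0.1, 0.2])                              # example head translation (metres)

anchor_world = np.array([0.05, -0.30, 0.60])                # virtual marker fixed to the patient
print(world_to_hmd(anchor_world, R, t))                     # marker expressed in the HMD frame

frame_budget_ms = 1000.0 / 60.0                             # ~16.7 ms per frame at 60 Hz
print(round(frame_budget_ms, 1), "ms for tracking + rendering to keep the overlay aligned")
```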

AR technologies in Laparoscopic/Endoscopic Surgery

Bernhardt et al. provide a comprehensive review of the latest techniques in augmented reality enhanced surgery [59]. Because laparoscopic/endoscopic surgeries are performed on non-rigid, hollow structures, registration of the virtual image with the mobile organs in the operating field of view becomes exceedingly difficult; as a consequence, the majority of laparoscopic/endoscopic surgeries do not use image guidance or AR. Surgeons typically rely solely on the laparoscopic image, but can refer to pre-operative imaging displayed separately.

As an example, laparoscopic liver surgery is inherently dangerous because large blood vessels can easily be hit and bleeding can be severe. Phutane et al. performed preliminary trials of an augmented reality guidance system (ARGS) in the assessment of hepatocellular carcinoma after it was resected laparoscopically [60]. In this trial, the authors found that the system aided in identifying the transection plane, the nearby hepatic vein and the tumor. Having completed 8 similar cases of viewing the tumor, the authors now feel that ARGS should be attempted intra-operatively.

Important considerations during AR enhanced surgery

The virtual image used in AR is most commonly a pre-operative image. Given the simultaneous presentation of both the virtual image and the real-world image, AR can be effective; however, the surgeon must proceed with the understanding that the virtual image represents the anatomy in its pre-operative state. New changes in the patient's anatomy (e.g., partial resection of a tumor or new areas of bleeding) will not be reflected in the virtual image. While AR enhanced surgery may be appropriate for the beginning of an operation, it may be less ideal for the middle or later portions, once the patient's anatomy has changed. Intra-operative CT or fluoroscopic imaging would be beneficial for tracking changes in the patient's anatomy and condition during the surgery and could provide an updated virtual image. For example, during the course of the operation it is possible to employ 3D rotational angiograms under fluoroscopy to determine how the lesion (e.g., a cerebral arteriovenous malformation) may have changed, so that a more accurate registration can be obtained.

An additional consideration is registration of any projected imagery with the real-time image the surgeon sees during the operation. For non-rigid soft tissues, registration is significantly more difficult; techniques such as deformable registration can be applied to reduce the error.
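
As a hedged sketch of the deformable case, the Python code below applies a dense displacement field to warp a pre-operative label volume toward an intra-operative configuration. The smooth synthetic field stands in for the output of a real deformable registration algorithm, which is not implemented here.

```python
# Apply a dense displacement field to a pre-operative segmentation. The synthetic,
# smoothly varying field is a placeholder for a computed deformable registration result.
import numpy as np
from scipy import ndimage

labels = np.zeros((64, 64, 64), dtype=float)
labels[28:36, 28:36, 28:36] = 1.0                           # pre-operative segmentation of a structure

zz, yy, xx = np.meshgrid(np.arange(64), np.arange(64), np.arange(64), indexing="ij")
dz = 2.0 * np.sin(yy / 64.0 * np.pi)                        # synthetic displacement components (voxels)
dy = np.zeros_like(dz)
dx = 1.5 * np.cos(zz / 64.0 * np.pi)

# Warp by sampling the label volume at the displaced coordinates (linear interpolation).
warped = ndimage.map_coordinates(labels, [zz + dz, yy + dy, xx + dx], order=1)
print(labels.sum(), "->", round(float(warped.sum()), 1), "voxels after warping")
```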

Future

In the future, pre-operative planning performed with advanced VR/AR imaging techniques will be intimately linked to AR enhanced surgery. The radiologist and surgeon will flag certain aspects of an operative site to be noted during the surgery. Examples of flagged items include the neck of an aneurysm, the boundaries of a tumor, a fragile blood vessel in close proximity to the operative site, certain landmarks in complex anatomy, and critical brain structures. These structures will be highlighted to the surgeon during the operation through AR, and the surgeon can be reminded of critical tasks while operating.

As the field of surgery advances through methods such as robotic surgery, which refines movements by scaling down the magnitude of the surgeon's gestures, the role of AR will keep growing. Undoubtedly, multiple types of intraoperative AR will be used, but the HMD type offers depth perception, head tracking and an improved human-machine interface to help the surgeon navigate complex challenges. Now is the time to advance VR/AR research to improve surgical success.

Conclusions

Surgeons face significant challenges in operating on complex anatomical structures. Pre-operative planning is progressing with advanced imaging techniques, including VR and AR, which hold promise for superior appreciation of the finer details of complex structures and better surgeon recall of the complex anatomy while operating. AR enhanced surgery, with its simultaneous display of real and virtual images, will be important in improving surgical outcomes. It is hoped that continued advances in virtual and augmented reality technologies, as applied to surgery, will decrease morbidity, decrease length of hospital stay, improve patient outcomes, and decrease overall hospital expenditures.

Acknowledgements

None

Funding information

None

Disclosures

Author DD has a family member with a financial interest in D3D Technologies.  CW and DG work for DXC Technology which has teamed with D3D Technologies for further development. EP and LL have a direct financial interest in D3D Technologies.

Supplemental materials

None

References

  1. Centers for Disease Control and Prevention (CDC).
  2. Business Insider.
  3. Debt.org.
  4. Baus O, Bouchard S (2014) Moving from virtual reality exposure-based therapy to augmented reality exposure-based therapy: a review. Front Hum Neurosci. 8:112. [Crossref]
  5. MEREL T. The 7 drivers of the $150 billion AR/VR industry. Aol Tech 2015.
  6. Chen W, Chao J-G, Zhang Y, Wang J-K, Chen X-W, et al. (2017) Orientation Preferences and Motion Sickness Induced in a Virtual Reality Environment. Aerosp Med Hum Perform 88: 903-910. [Crossref]
  7. Lovo EE, Quintana JC, Puebla MC, Torrealba G, Santos JL, et al. (2007) A novel, inexpensive method of image coregistration for applications in image-guided surgery using augmented reality. Neurosurgery 60(4 Suppl 2): 366-371. [Crossref]
  8. Ferroli P, Tringali G, Acerbi F, Schiariti M, Broggi M, et al. (2013) Advanced 3-dimensional planning in neurosurgery. Neurosurgery 72(suppl_1): A54-A62. [Crossref]
  9. Pelargos PE, Nagasawa DT, Lagman C, Tenn S, Demos JV, et al. (2017) Utilizing virtual and augmented reality for educational and clinical enhancements in neurosurgery. J Clin Neurosci 35: 1-4. [Crossref]
  10. Douglas D; US Patent Office, assignee. US 8,384,771 Method and Apparatus for Three Dimensional Viewing of Images. USA, 2013.
  11. Douglas D; US Patent Office, assignee. US 9,349,183 Method and Apparatus for Three Dimensional Viewing of Images. USA, 2016.
  12. Douglas DB, Boone JM, Petricoin E, Liotta L, Wilson E (2016) Augmented Reality Imaging System: 3D Viewing of a Breast Cancer. J Nat Sci 2. [Crossref]
  13. Douglas DB, Petricoin EF, Liotta L, Wilson E (2016) D3D augmented reality imaging system: proof of concept in mammography. Med Devices (Auckl) 9: 277-283. [Crossref]
  14. Ahlberg G, Enochsson L, Gallagher AG, Hedman L, Hogman C, et al. (2007) Proficiency-based virtual reality training significantly reduces the error rate for residents during their first 10 laparoscopic cholecystectomies. Am J Surg 193: 797-804. [Crossref]
  15. Munz Y, Almoudaris AM, Moorthy K, Dosis A, Liddle AD, et al. (2007) Curriculum-based solo virtual reality training for laparoscopic intracorporeal knot tying: objective assessment of the transfer of skill from virtual reality to reality. Am J Surg 193: 774-783. [Crossref]
  16. Aggarwal R, Ward J, Balasundaram I, Sains P, Athanasiou T, et al. (2007) Proving the effectiveness of virtual reality simulation for training in laparoscopic surgery. Ann Surg 246: 771-779. [Crossref]
  17. Lemole M, Banerjee PP, Luciano C, Charbel F, Oh M (2009) Virtual ventriculostomy with 'shifted ventricle': neurosurgery resident surgical skill assessment using a high-fidelity haptic/graphic virtual reality simulator. Neurol Res 31: 430-431. [Crossref]
  18. Volonte F, Pugin F, Bucher P, Sugimoto M, Ratib O, et al. (2011) Augmented reality and image overlay navigation with OsiriX in laparoscopic and robotic surgery: not only a matter of fashion. J Hepatobiliary Pancreat Sci 18: 506-509. [Crossref]
  19. Besharati Tabrizi L, Mahvash M (2015) Augmented reality-guided neurosurgery: accuracy and intraoperative application of an image projection technique. J Neurosurg 123: 206-211. [Crossref]
  20. Kersten-Oertel M, Jannin P, Collins DL (2013) The state of the art of visualization in mixed reality image guided surgery. Comput Med Imaging Graph 37: 98-112. [Crossref]
  21. Nicolau S, Soler L, Mutter D, Marescaux J (2011) Augmented reality in laparoscopic surgical oncology. Surg Oncol 20: 189-201. [Crossref]
  22. Meola A, Cutolo F, Carbone M, Cagnazzo F, Ferrari M, et al. (2016) Augmented reality in neurosurgery: a systematic review. Neurosurg Rev: 1-12. [Crossref]
  23. Soler L, Ayache N, Nicolau S, Pennec X, Forest C, et al. (2004) Virtual reality, augmented reality and robotics in surgical procedures of the liver: 476.
  24. Soler L, Nicolau S, Schmid J, Koehl C, Marescaux J, et al. (2004) Virtual reality and augmented reality in digestive surgery. IEEE Computer Society: 278-279.
  25. Grimson WEL, Ettinger GJ, White SJ, Gleason PL, Lozano-Pérez T, et al. (1995) Evaluating and validating an automated registration system for enhanced reality visualization in surgery. Springer: 3-12.
  26. Trevisan DG, Nedel LP, Macq B, Vanderdonckt J (2006) Detecting interaction variables in a mixed reality system for maxillofacial-guided surgery. SVR 2006: 39-50.
  27. Splechtna RC, Fuhrmann AL, Wegenkittl R (2002) ARAS - augmented reality aided surgery system description. VRVis Research Center Technical Report.
  28. Salb T, Brief J, Burgert O, Gockel T, Hassfeld S, et al. (2002) Intraoperative augmented reality for craniofacial surgery: the INPRES system.
  29. Katic D, Sudra G, Speidel S, Castrillon-Oberndorfer G, Eggers G, et al. (2010) Knowledge-based situation interpretation for context-aware augmented reality in dental implant surgery. Medical Imaging and Augmented Reality: 531-540.
  30. Sudra G, Speidel S, Fritz D, Müller-Stich BP, Gutt C, et al. (2007) MEDIASSIST: medical assistance for intraoperative skill transfer in minimally invasive surgery using augmented reality. International Society for Optics and Photonics: 65091O.
  31. Bichlmeier C, Heining SM, Feuerstein M, Navab N (2009) The virtual mirror: a new interaction paradigm for augmented reality environments. IEEE Trans Med Imaging 28: 1498-1510. [Crossref]
  32. Edwards PJ, King AP, Maurer CR Jr, de Cunha DA, Hawkes DJ, et al. (2000) Design and evaluation of a system for microscope-assisted guided interventions (MAGI). IEEE Trans Med Imaging 19: 1082-1093. [Crossref]
  33. Tang A, Zhou J, Owen C (2003) Evaluation of calibration procedures for optical see-through head-mounted displays. IEEE: 161-168.
  34. Hostettler A, Forest C, Soler L, Marescaux J (2011) A cost effective simulator for education of ultrasound image interpretation and probe manipulation. Medicine Meets Virtual Reality 18 (NextMed) 163: 403.
  35. Harders M, Bianchi G, Knoerlein B, Székely G (2009) Calibration, registration, and synchronization for high precision augmented reality haptics. IEEE Trans Vis Comput Graph 15: 138-149. [Crossref]
  36. Nicolau S, Goffin L, Soler L (2005) A low cost and accurate guidance system for laparoscopic surgery: validation on an abdominal phantom. ACM: 124-133.
  37. Nicolau S, Brenot J, Goffin L, Graebling P, Soler L, et al. (2008) A structured light system to guide percutaneous punctures in interventional radiology: 700016.
  38. Albitar C, Graebling P, Doignon C (2007) Robust structured light coding for 3D reconstruction. IEEE: 1-6.
  39. Bichlmeier C, Sielhorst T, Heining SM, Navab N (2007) Improving depth perception in medical AR. Bildverarbeitung für die Medizin 2007, Springer: 217-221.
  40. Fuchs H, Livingston MA, Raskar R, Keller K, Crawford JR, et al. (1998) Augmented reality visualization for laparoscopic surgery. Springer: 934-943.
  41. Wimmer F, Bichlmeier C, Heining SM, Navab N (2008) Creating a vision channel for observing deep-seated anatomy in medical augmented reality. Bildverarbeitung für die Medizin 2008: 298-302.
  42. De Paolis LT, Pulimeno M, Aloisio G (2008) An augmented reality application for minimally invasive surgery. Springer: 489-492.
  43. Wagner A, Rasse M, Millesi W, Ewers R (1997) Virtual reality for orthognathic surgery: the augmented reality environment concept. J Oral Maxillofac Surg 55: 456-462. [Crossref]
  44. Suzuki N, Hattori A, Tanoue K, Ieiri S, Konishi K, et al. (2010) Scorpion shaped endoscopic surgical robot for NOTES and SPS with augmented reality functions. Medical Imaging and Augmented Reality: 541-550.
  45. Linte CA, Moore J, Wiles A, Lo J, Wedlake C, et al. (2009) In vitro cardiac catheter navigation via augmented reality surgical guidance: 72610-9.
  46. Wörn H, Aschke M, Kahrs L (2005) New augmented reality and robotic based methods for head-surgery. Int J Med Robot 1: 49-56. [Crossref]
  47. Kawamata T, Iseki H, Shibasaki T, Hori T (2002) Endoscopic augmented reality navigation system for endonasal transsphenoidal surgery to treat pituitary tumors. Neurosurgery 50: 1393-1397. [Crossref]
  48. Birkfellner W, Figl M, Huber K, Watzinger F, Wanschitz F, et al. (2002) A head-mounted operating binocular for augmented reality visualization in medicine - design and initial evaluation. IEEE Trans Med Imaging 21: 991-997. [Crossref]
  49. Birkfellner W, Figl M, Matula C, Hummel J, Hanel R, et al. (2003) Computer-enhanced stereoscopic vision in a head-mounted operating binocular. Phys Med Biol 48: N49. [Crossref]
  50. Liao H, Nomura K, Dohi T (2006) Long visualization depth autostereoscopic display using light field rendering based integral videography. IEEE: 314-314.
  51. Chen G, Zhang X, Fan Z, Liao H (2015) An innovative calibration based integral photography rendering algorithm for medical application and its evaluation. IEEE: 4226-4229.
  52. Marker DR, U-Thainual P, Flammang AJ, Fichtinger G, et al. (2017) 1.5 T augmented reality navigated interventional MRI: paravertebral sympathetic plexus injections. Diagn Interv Radiol 23: 227. [Crossref]
  53. Marker DR, U-Thainual P, Ungi T, Flammang AJ, Fichtinger G, et al. (2017) 1.5 T augmented reality navigated interventional MRI: paravertebral sympathetic plexus injections. Diagn Interv Radiol 23: 227-232. [Crossref]
  54. Raya MA, Marcinek HV, Saez JM, Sanchez R, Lizandra M, et al. (2003) Mixed reality for neurosurgery: a novel prototype. Stud Health Technol Inform 94: 11-15. [Crossref]
  55. Kang X, Azizian M, Wilson E, Wu K, Martin AD, et al. (2014) Stereoscopic augmented reality for laparoscopic surgery. Surg Endosc 28: 2227-2235. [Crossref]
  56. Watzinger F, Wanschitz F, Rasse M, Millesi W, Schopper C, et al. (1999) Computer-aided surgery in distraction osteogenesis of the maxilla and mandible. Int J Oral Maxillofac Surg 28: 171-175. [Crossref]
  57. Maurer C, Sauer F, Hu B, Bascle B, Geiger B, et al. (2001) Augmented reality visualization of brain structures with stereo and kinetic depth cues: system description and initial evaluation with head phantom. Medical Imaging: 445-456.
  58. Bangay S, Preston L (1998) An investigation into factors influencing immersion in interactive virtual reality environments. Stud Health Technol Inform 58: 43-51. [Crossref]
  59. Bernhardt S, Nicolau SA, Soler L, Doignon C (2017) The status of augmented reality in laparoscopic surgery as of 2016. Med Image Anal 37: 66-90. [Crossref]
  60. Phutane P, Buc E, Poirot K, Ozgur E, Pezet D, et al. (2017) Preliminary trial of augmented reality performed on a laparoscopic left hepatectomy. Surg Endosc: 1-2. [Crossref]

Editorial Information

Editor-in-Chief

S C. Batterman
University of Pennsylvania

Article Type

Review Article

Publication history

Received date: October 16, 2017
Accepted date: November 06, 2017
Published date: November 10, 2017

Copyright

©2017 Douglas DB. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Citation

Douglas DB, Wilke CA, Gibson D, Petricoin EF, Liotta L (2017) Virtual reality and augmented reality: Advances in surgery. Biol Eng Med 3: DOI: 10.15761/BEM.1000131

Corresponding author

David B Douglas M.D.,

Department of Radiology, 300 Pasteur Drive, Room S047, Stanford, CA 94305-5105, USA, Tel: 650-723-4000; ext 14257;

