Intra-operative visualization has been a key driver of surgical advancement since the beginnings of modern surgery. Indeed, the first operating room, built in 1804 at Pennsylvania Hospital, relied upon overhead skylights and a bright, cloudless day for surgery to be possible (1). Since then, great strides have been made in the advancement of surgical visualization, ushering in a parallel improvement in surgical success (Figure 1). Spine surgery has particularly benefited from these innovations in intra-operative visualization, as the complex and intricate anatomy of the spine and its neighboring structures requires exceptional discernment to avoid catastrophic complications. Over time, these new technologies have not only gained visual acuity and power, but have also become smaller and more portable, with many devices now able to be worn by surgeons themselves. From the first X-ray machines to sleek head-mounted displays, a detailed understanding of the evolution and impact of these visualization tools can help us to predict how emerging innovations will continue to revolutionize spine surgery.
Past: early wearables enhance vision
The first ‘wearable’ in surgery, defined as a miniaturized device that can easily be put on or taken off the body (2), was the surgical loupe. This device, consisting of decentered convex lenses attached to a spectacle frame, was simultaneously invented by Carl von Hess and Edwin Theodore Saemisch, both German ophthalmologists, in 1876 (3,4). In the years since, surgical loupes have been refined to be lighter and more powerful, and have been augmented with built-in headlamps to enhance image quality. The budding industry of surgical optics soon converged with microscope manufacturing under Carl Zeiss and Ernst Abbe, the founders of the eponymous optical device company. Zeiss developed new materials to improve existing surgical loupes and, in 1922, pioneered the first binocular operating microscope by attaching a light source to an existing Zeiss dissecting microscope (5). The success of this early operating microscope in otolaryngology procedures quickly led to its widespread adoption by other ENT surgeons, ophthalmologists, and neurosurgeons, and it is largely credited with giving rise to the field of microsurgery (6).
Magnification of the surgical field by loupes and microscopes allows for better visualization and maneuvering, and has been shown to benefit surgical outcomes in several specialties (7,8); to this day, both are a mainstay in many operating rooms. Recently, in an attempt to improve the ergonomics of operating microscopes, exoscopes have been engineered to deliver light and magnification uniformly across a wide depth of field, with the resulting image displayed on an external digital screen rather than solely through the microscope’s objective lenses (9,10). While neurosurgeons are primarily trained to use microscopes, several studies have shown wearable loupes to be comparable or superior to operating microscopes in many neurosurgical applications such as peripheral nerve repairs and microvascular anastomoses, especially when minimizing operative time is a priority (11,12). Binocular magnification through loupes and microscopes is also used extensively in spine procedures, as enhanced visualization is necessary when approaching the deep, narrow field inside the disc space. This assistance is especially helpful in procedures such as cervical, thoracic, and lumbar laminoplasty, craniovertebral decompression and fixation, spinal tumor removal, and anterior cervical discectomy and fusion (13).
Loupes, and their close relative—though not truly a wearable instrument—the operating microscope, were the sole mechanisms of surgical visualization until the end of the 19th century. In 1895, Wilhelm Conrad Roentgen’s accidental discovery of X-rays ushered in a new era of surgical visualization, one that moved beyond the visible wavelength (14). In the ensuing century, radiography, fluoroscopy, magnetic resonance imaging (MRI), and other imaging modalities have defined surgical visualization. Though the bulky equipment needed for these techniques contrasts starkly with earlier wearables, radiography and related imaging technologies allowed for unprecedented visualization into the human body. Spine surgeons have arguably benefitted disproportionately compared to other specialties, given the complex and often difficult-to-access anatomy of the spine. By mapping pathologies to sub-millimeter accuracy without exposing surrounding anatomy, spine surgeons have been able to plan and execute procedures that would otherwise be tales of science fiction, from percutaneous pedicle screw placement to spinal deformity correction.
Present: head-up display (HUD) and augmented reality (AR) technology
Though the tremendous surgical benefit brought by medical imaging techniques cannot be overstated, until recently the development of wearable visualization devices had largely plateaued. Instead, almost all current pre-operative and intra-operative imaging techniques display their associated images on an external monitor beside the surgical field rather than directly along the surgeon’s line of sight, requiring the surgeon to shift attention away from the patient every time they refer to the image guidance. This back-and-forth fluctuation of a surgeon’s focus between the surgical field and the image monitor, known as ‘alternating attention’, creates unnecessary distractions and movements, and can lead to surgeon fatigue and an increased risk of intra-operative complications (15).
To solve these issues and enhance the utility of navigated and radiographically assisted procedures, the surgical community has recently become interested in HUD and AR technologies as methods of overlaying holograms of the patient’s anatomy directly over the surgical field. HUD technology, implemented in surgeries as early as 1995, uses mini-projectors or screens mounted on a head frame to display relevant imaging to the operator (16,17). These early devices were found to reduce the strain and vertigo experienced by the surgeon when referencing external monitors for imaging, improving operative workflow. While pioneering HUDs, consisting of small screens connected to spectacle frames, were still quite bulky and caused ergonomic discomfort, newer LED- and projector-based HUDs have made this technology quite attractive for displaying pre-operative radiographic scans to the surgeon. Common platforms include Google Glass (Google Inc., Mountain View, CA, USA) and Moverio Smart Glasses (Epson Inc., Suwa, Japan).
AR devices built upon these advancements by allowing surgeons to superimpose imaging scans directly onto the surgical field. AR achieves this by registering the patient’s physical body with anatomical imaging using fiducial markers during an intra-operative computed tomography (CT) or MRI scan. External cameras then track the fiducial markers fixed to the patient’s body, working together with head and eye tracking sensors on the HUD to overlay images onto the surgical field in the operator’s line of sight (18). Thus, AR wearables can help surgeons visualize complex bony anatomy underneath the body’s surface, which can be critical when performing complex spine procedures such as multi-level fusions and deformity corrections. Newer AR modalities capable of more advanced pre-operative image processing have even been used to approximate tissue margins in intra-dural tumor resections, expanding the indications for AR assistance in surgery beyond bony manipulation (19). AR can also be combined with navigation platforms, enabling real-time tracking of tools and instrumentation with respect to the virtual holographic display. By helping surgeons maintain proper trajectory alignment for pedicle screw placement or identify the correct vertebral level for fusion procedures, this can further improve operative workflow and spare the patient and the operative team a significant amount of radiation exposure. Still newer innovations continue to supplement existing AR technology, such as voice and gesture recognition, which allows for hands-free device control without contaminating the sterile field (20). Together, AR and HUD devices represent a paradigm shift in wearable visualization, combining the concept of loupe-style spectacles with image guidance and intra-operative navigation.
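At its core, the fiducial-based registration step described above reduces to estimating the rigid transform (rotation plus translation) that best maps the marker coordinates in the imaging frame onto the same markers as seen by the tracking cameras. A minimal sketch of this least-squares fit using the standard Kabsch (SVD-based) algorithm is shown below; the function name and interface are illustrative, not drawn from any commercial navigation platform:

```python
import numpy as np

def register_fiducials(scan_pts, tracked_pts):
    """Rigid (rotation + translation) registration of paired fiducials.

    scan_pts:    (N, 3) marker coordinates in the CT/MRI frame
    tracked_pts: (N, 3) the same markers as seen by the tracking cameras
    Returns R (3x3) and t (3,) mapping scan coordinates into tracker space.
    """
    scan_pts = np.asarray(scan_pts, dtype=float)
    tracked_pts = np.asarray(tracked_pts, dtype=float)
    # Center both point sets on their centroids
    c_scan = scan_pts.mean(axis=0)
    c_trk = tracked_pts.mean(axis=0)
    # Cross-covariance matrix and its SVD (Kabsch algorithm)
    H = (scan_pts - c_scan).T @ (tracked_pts - c_trk)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_trk - R @ c_scan
    return R, t
```

In practice, the root-mean-square residual of this fit (the fiducial registration error) is what navigation systems typically report as a proxy for registration accuracy.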
AR implementation in spine surgery is not only feasible and safe but has been shown to improve surgical accuracy and performance. One of the most salient use cases for wearable AR technology in spine surgery is pedicle screw fixation, as screw breaches can lead to significant complications including weakness, sensory loss, radicular pain, neurological impairment, and even paralysis. Several studies have compared AR-guided screw placement to fluoroscopically assisted and freehand techniques in both patients and cadavers. Across these studies, AR guidance was at least comparable to traditional screw placement, and was almost always associated with significantly shorter alignment times, higher accuracy, and fewer post-operative complications (21,22). AR is especially valuable in surgeries of the thoracic spine, as the smaller pedicles and crowded bony anatomy make instrumentation inherently more difficult compared to the lumbar spine (23).
The most common AR platform used in previous feasibility studies is the Microsoft HoloLens (Microsoft Corp., Redmond, WA, USA), an affordable, consumer-grade wearable AR device that can be adapted to fit into many different surgical workflows. Our group, for example, has previously used the Microsoft HoloLens device to build and implement a low-cost pipeline to construct, visualize, and register intraoperative holographic models of patients during spinal fusion surgery (24). Other platforms, while significantly more expensive, include pre-built registration programs, effectively allowing surgeons to begin utilizing AR immediately. These include XVision (Augmedics, Arlington Heights, IL, USA) and OpenSight (Novarad, American Fork, UT, USA), which rely upon custom head-mounted displays for AR visualization (25).
Future: expanding augmented reality indications and new technology development
There have been several key studies investigating different visualization technologies and summarizing the current literature landscape (Table 1). To date, most investigational studies using AR and HUD wearable devices for spine surgery have focused on pedicle screw placement. This is understandable, given the remarkable problem-solution fit between maintaining trajectory accuracy and wearable AR guidance. Still, there are numerous additional spine procedures where AR navigation can greatly assist surgical workflow and potentially improve surgical accuracy and outcome.
| Technology | Study | Year | Summary |
| --- | --- | --- | --- |
| Exoscopes | Langer et al. (10) | 2020 | A summary of current exoscope platforms and their applications in spine surgery, including pedicle screw instrumentation |
| Head-up display | Yoon et al. (26) | 2018 | A systematic review focusing on the usage of wearable head-up displays in surgery, including spine surgery |
| Augmented reality | Burström et al. (27) | 2021 | A summary of the current evidence for using augmented reality navigation in spine surgery |
| Augmented reality | Sumdani et al. (28) | 2021 | A systematic review of data regarding the use of augmented reality and virtual reality modalities in spine surgery |
| Augmented reality | Chidambaram et al. (29) | 2021 | A review of augmented reality and its relationship with other symbiotic operative technologies such as robotics in spine surgery |
This table lists systematic and narrative review articles that summarize the current landscape of advanced visualization technology in spine surgery.
The most notable future application of AR is minimally invasive spinal surgery (MIS), including procedures such as endoscopic decompression, lateral and anterior lumbar interbody fusion, and deformity correction. While traditional surgical approaches are still utilized in a number of clinical scenarios, MIS alternatives can result in decreased intra-operative blood loss, post-operative narcotic usage, and length of hospital stay (30). New surgical equipment such as robotic arms and endoscopic instruments are substantially expanding the capabilities of, and indications for, MIS; for this potential to be fully realized, however, novel methods to visualize anatomy and guide surgery are needed. While traditional open spine surgery exposes a significant amount of bony anatomy, MIS limits the degree to which traditional intra-operative landmarks can be seen and palpated, and thus relies heavily on imaging to guide surgical intervention. Virtual 3-dimensional reconstruction and visualization of patient anatomy through wearable AR can retain the natural visualization of these landmarks and enable surgeons to perform MIS procedures with minimal alteration to their existing operative workflow. This contrasts with screen-based fluoroscopic navigation currently used in MIS, which, while adequate, is not comparable to AR’s ability to simulate the spatial and proprioceptive experience of physically locating and using anatomical landmarks. Additionally, AR can minimize the significant radiation exposure in MIS, which is additive to the considerable radiation dose in traditional spine surgery. A meta-analysis of MIS versus open fusion surgery showed a significantly higher X-ray exposure in MIS (31), which can largely be avoided if using AR-assisted navigation. Thus, AR in MIS has substantial implications for both patient outcomes and surgeon safety.
Improvements to existing AR technology is on the horizon as well. New AR development should focus on enhancing the ergonomic feel of wearable AR frames, ensuring that size and bulk do not impede surgical workflow or distract the surgeon. Along with disruption to existing surgical workflow, lack of affordability is a major barrier to widespread AR adoption (32). AR platforms currently on the market can cost up to $250,000 USD and have significant per-case disposable costs as well. Lastly, technical difficulties with image rendering and registration need to be solved. Current registration pipelines require active insertion of fiducial markers into the patient’s axial skeleton before running an intra-operative O-arm scan, to register the external sensors with underlying patient anatomy. However, this process is invasive and requires surgical incisions beyond the primary area of pathology. New devices already under development are aiming to improve this process by securing the same degree of patient registration with skin-based fiducial stickers that can be placed on the surface of the patient’s body rather than surgically embedded. Infrared depth sensors are also being explored as a means of replacing the intra-operative O-arm scan necessary for superimposing existing pre-operative radiographic imagery onto the patient’s body (33,34). In the near future, it is not inconceivable that an intra-operative infrared depth scan of the patient and associated fiducial markers—possibly even with a smartphone such as the iPhone (Apple Inc., Cupertino, CA, USA), which has remarkably powerful infrared depth sensors—will provide the same level of registration accuracy to the patient’s pre-operative radiographic scan as a complete O-arm spin.
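The markerless, depth-sensor-based registration described above is commonly approached with iterative closest point (ICP) alignment of the depth scan against a surface sampled from the pre-operative CT. The following is a brute-force NumPy sketch for intuition only; production systems use accelerated nearest-neighbor search, robust outlier rejection, and a good initial pose estimate, and the function names here are purely illustrative:

```python
import numpy as np

def rigid_fit(P, Q):
    """Kabsch: best rotation R and translation t mapping P onto Q."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - cP).T @ (Q - cQ))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cQ - R @ cP

def icp(src, dst, iters=30):
    """Align src (e.g., an infrared depth scan) to dst (e.g., a point
    cloud sampled from pre-operative CT). Returns accumulated R, t."""
    R_tot, t_tot = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # Brute-force nearest neighbor in dst for every current point
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matched = dst[d2.argmin(axis=1)]
        # Best rigid transform for the current correspondences
        R, t = rigid_fit(cur, matched)
        cur = cur @ R.T + t
        # Compose with the running transform
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
    return R_tot, t_tot
```

Each iteration re-matches points and re-solves the rigid fit, so the nearest-neighbor residual is non-increasing; convergence to the correct pose, however, depends on the initial alignment being roughly correct.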
Yet, for all the tremendous potential of wearable visualization technologies, the pinnacle of surgical visualization will eventually transcend devices that surgeons must don and doff, and return to the simplicity with which surgery was first conducted. Just as the first operating rooms used a skylight to project light onto the surgical field, so too, we believe, will future operating rooms use projection-based AR to physically overlay patient anatomy onto the surgical field, obviating the need for wearable frames and HUDs. This innovative way of visualizing AR is already under development and uses sophisticated eye- and head-tracking cameras to dynamically move the projected images to match the surgeon’s line of sight. Simplicity is the ultimate sophistication, and the future of surgical visualization is no exception.
Wearable visualization technologies have spurred progress and innovation in surgical intervention from the advent of modern surgery until today. This symbiotic relationship is still thriving, and future improvements in spine surgery, such as expanding indications for minimally invasive spine surgery, reducing patient and surgeon radiation exposure, and, above all, improving patient outcomes, will necessitate new developments in wearable and projector-based augmented reality visualization.
The authors would like to acknowledge Amera M. Ahmad, MD, MS, for her contribution to the ideation and development of this manuscript.
Provenance and Peer Review: This article was commissioned by the Guest Editors (Ralph J. Mobbs, Pragadesh Natarajan and R. Dineth Fonseka) for the series “Objective Monitoring and Wearable Technologies including Sensor-Based Accelerometers and Mobile Health Applications for the Spine Patient” published in Journal of Spine Surgery. The article has undergone external peer review.
Peer Review File: Available at https://jss.amegroups.com/article/view/10.21037/jss-21-95/prf
Conflicts of Interest: Both authors have completed the ICMJE uniform disclosure form (available at https://jss.amegroups.com/article/view/10.21037/jss-21-95/coif). The series “Objective Monitoring and Wearable Technologies including Sensor-Based Accelerometers and Mobile Health Applications for the Spine Patient” was commissioned by the editorial office without any funding or sponsorship. JWY reports standard stock ownership in Kinesiometrics LLC and MedCyclops LLC, unrelated to this manuscript. The authors have no other conflicts of interest to declare.
Ethical Statement: The authors are accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.
Open Access Statement: This is an Open Access article distributed in accordance with the Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License (CC BY-NC-ND 4.0), which permits the non-commercial replication and distribution of the article with the strict proviso that no changes or edits are made and the original work is properly cited (including links to both the formal publication through the relevant DOI and the license). See: https://creativecommons.org/licenses/by-nc-nd/4.0/.
- Henderson AR. A Note on the “Circular Room” of the Pennsylvania Hospital. J Hist Med Allied Sci 1964;19:156-60. [Crossref] [PubMed]
- Smuck M, Odonkor CA, Wilt JK, et al. The emerging clinical role of wearables: factors for successful implementation in healthcare. NPJ Digit Med 2021;4:45. [Crossref] [PubMed]
- Kwitko ML, Kelman CD. The History of Modern Cataract Surgery. Amsterdam: Kugler, 1998.
- Barraquer JI, Barraquer J, Littman H. A new operating microscope for ocular surgery. Am J Ophthalmol 1967;63:90-7. [Crossref] [PubMed]
- Dohlman GF. Carl Olof Nylén and the birth of the otomicroscope and microsurgery. Arch Otolaryngol 1969;90:813-7. [Crossref] [PubMed]
- Schultheiss D, Denil J. History of the microscope and development of microsurgery: a revolution for reproductive tract surgery. Andrologia 2002;34:234-41. [Crossref] [PubMed]
- Davidson BJ, Guardiani E, Wang A. Adopting the operating microscope in thyroid surgery: safety, efficiency, and ergonomics. Head Neck 2010;32:154-9. [PubMed]
- Magera JS Jr, Inman BA, Slezak JM, et al. Increased optical magnification from 2.5x to 4.3x with technical modification lowers the positive margin rate in open radical retropubic prostatectomy. J Urol 2008;179:130-5. [Crossref] [PubMed]
- Carl B, Bopp M, Saß B, et al. Spine Surgery Supported by Augmented Reality. Global Spine J 2020;10:41S-55S. [Crossref] [PubMed]
- Langer DJ, White TG, Schulder M, et al. Advances in Intraoperative Optics: A Brief Review of Current Exoscope Platforms. Oper Neurosurg (Hagerstown) 2020;19:84-93. [Crossref] [PubMed]
- McManamny DS. Comparison of microscope and loupe magnification: assistance for the repair of median and ulnar nerves. Br J Plast Surg 1983;36:367-72. [Crossref] [PubMed]
- Shenaq SM, Klebuc MJ, Vargo D. Free-tissue transfer with the aid of loupe magnification: experience with 251 procedures. Plast Reconstr Surg 1995;95:261-9. [Crossref] [PubMed]
- Kim P, Joujiki M, Suzuki M, et al. Newly designed ergonomic surgical binocular telescope with angulated optic axis. Neurosurgery 2008;63:ONS188-90; discussion ONS190-1. [PubMed]
- Toledo-Pereyra LH. X-rays surgical revolution. J Invest Surg 2009;22:327-32. [Crossref] [PubMed]
- Ghazanfar MA, Cook M, Tang B, et al. The effect of divided attention on novices and experts in laparoscopic task performance. Surg Endosc 2015;29:614-9. [Crossref] [PubMed]
- Ewers R, Schicho K. Augmented reality telenavigation in cranio maxillofacial oral surgery. Stud Health Technol Inform 2009;150:24-5. [PubMed]
- Levy ML, Day JD, Albuquerque F, et al. Heads-up intraoperative endoscopic imaging: a prospective evaluation of techniques and limitations. Neurosurgery 1997;40:526-30; discussion 530-1. [PubMed]
- Vávra P, Roman J, Zonča P, et al. Recent Development of Augmented Reality in Surgery: A Review. J Healthc Eng 2017;2017:4574172. [Crossref] [PubMed]
- Carl B, Bopp M, Saß B, et al. Augmented reality in intradural spinal tumor surgery. Acta Neurochir (Wien) 2019;161:2181-93. [Crossref] [PubMed]
- Kocev B, Ritter F, Linsen L. Projector-based surgeon-computer interaction on deformable surfaces. Int J Comput Assist Radiol Surg 2014;9:301-12. [Crossref] [PubMed]
- Elmi-Terander A, Burström G, Nachabé R, et al. Augmented reality navigation with intraoperative 3D imaging vs fluoroscopy-assisted free-hand surgery for spine fixation surgery: a matched-control study comparing accuracy. Sci Rep 2020;10:707. [Crossref] [PubMed]
- Liu H, Wu J, Tang Y, et al. Percutaneous placement of lumbar pedicle screws via intraoperative CT image–based augmented reality–guided technology. J Neurosurg Spine 2019;32:542-7. [Crossref] [PubMed]
- Elmi-Terander A, Burström G, Nachabe R, et al. Pedicle Screw Placement Using Augmented Reality Surgical Navigation With Intraoperative 3D Imaging: A First In-Human Prospective Cohort Study. Spine (Phila Pa 1976) 2019;44:517-25. [Crossref] [PubMed]
- Buch VP, Mensah-Brown KG, Germi JW, et al. Development of an Intraoperative Pipeline for Holographic Mixed Reality Visualization During Spinal Fusion Surgery. Surg Innov 2021;28:427-37. [Crossref] [PubMed]
- Molina CA, Theodore N, Ahmed AK, et al. Augmented reality-assisted pedicle screw insertion: a cadaveric proof-of-concept study. J Neurosurg Spine 2019; Epub ahead of print. [Crossref] [PubMed]
- Yoon JW, Chen RE, Kim EJ, et al. Augmented reality for the surgeon: Systematic review. Int J Med Robot 2018;14:e1914. [Crossref] [PubMed]
- Burström G, Persson O, Edström E, et al. Augmented reality navigation in spine surgery: a systematic review. Acta Neurochir (Wien) 2021;163:843-52. [Crossref] [PubMed]
- Sumdani H, Aguilar-Salinas P, Avila MJ, et al. Utility of Augmented Reality and Virtual Reality in Spine Surgery: A Systematic Review of the Literature. World Neurosurg 2021; Epub ahead of print. [Crossref] [PubMed]
- Chidambaram S, Stifano V, Demetres M, et al. Applications of augmented reality in the neurosurgical operating room: A systematic review of the literature. J Clin Neurosci 2021;91:43-61. [Crossref] [PubMed]
- Imada AO, Huynh TR, Drazin D. Minimally Invasive Versus Open Laminectomy/Discectomy, Transforaminal Lumbar, and Posterior Lumbar Interbody Fusions: A Systematic Review. Cureus 2017;9:e1488. [Crossref] [PubMed]
- Tian NF, Wu YS, Zhang XL, et al. Minimally invasive versus open transforaminal lumbar interbody fusion: a meta-analysis based on the current evidence. Eur Spine J 2013;22:1741-9. [Crossref] [PubMed]
- Yoon JW, Spadola M, Blue R, et al. Do-It-Yourself Augmented Reality Heads-Up Display (DIY AR-HUD): A Technical Note. Int J Spine Surg 2021;15:826-33. [Crossref] [PubMed]
- Ortega M, Ivorra E, Juan A, et al. MANTRA: An Effective System Based on Augmented Reality and Infrared Thermography for Industrial Maintenance. Applied Sciences 2021;11:385. [Crossref]
- Gsaxner C, Pepe A, Wallner J, et al. Markerless image-to-face registration for untethered augmented reality in head and neck surgery. In: International Conference on Medical Image Computing and Computer-Assisted Intervention. Springer, 2019.