Researchers have recently found that spinal surgery outcomes improve when the surgeon uses virtual reality (VR) technology during the operation. The procedure evaluated in this study, percutaneous kyphoplasty (PKP), is a common treatment for osteoporotic vertebral compression fracture (OVCF). It involves injecting bone cement through the skin into the damaged vertebra and demands tremendous precision from the surgeon. The findings of the new study, published in the Journal of Orthopaedic Surgery and Research, suggest that this precision is enhanced when the surgeon wears a VR headset.
Though PKP gives many patients the relief they seek, some remain unsatisfied after the procedure, complaining of continued spinal compression or recurrent fractures. Patients with OVCF often display a defect known as the intravertebral vacuum cleft (IVC), which has been reported as the main cause of OVCF pain and of unsatisfactory pain relief after PKP.
To properly treat these cases with bone cement, a balloon can be inserted into the IVC area and inflated to improve cement diffusion. This technique requires extremely accurate navigation to the IVC region to yield the best patient outcomes.
Because previous research has found mixed reality effective at displaying three-dimensional structures within the human body, a team of researchers from Nanjing Medical University in China, also affiliated with Nanjing First Hospital, decided to use the technology for surgical navigation during the PKP procedure.
Background of the VR Study
Patients included in the study had single-level OVCF in the thoracic or lumbar spine, severe back pain, osteopenia or osteoporosis, and no damage to the vertebral posterior wall or nerve lesions. Each patient was over 50 years of age and underwent plain radiography, computed tomography (CT), and magnetic resonance imaging (MRI) before surgery.
Before the operation, the CT and MRI images were used to create a computer-aided design (CAD) model of the patient’s anatomy. This model was imported into a computer and relayed to a Microsoft HoloLens device worn by the surgeon, who saw 3D virtual images of the patient’s spine superimposed on the patient’s real body in a mixed reality environment.
This virtual projection was used to guide needles accurately into the patient’s injured vertebra, positioning them precisely within the IVC. The mixed reality operation also utilized the traditional C-arm fluoroscopy system for intraoperative imaging.
The researchers recruited forty patients in total: half received the VR-assisted operation described above (treatment group), and the other half received only traditional C-arm fluoroscopy for intraoperative imaging (control group).
Success of Augmented Reality in the Operating Room
The team found that both operation duration and fluoroscopy time were significantly lower in the treatment group than in the control group. Back pain and movement scores improved immediately after the operation in both groups, but more so in the treatment group. The researchers noted a gradual, significant increase in these scores in the control group by the 1-year follow-up, indicating worsening pain and mobility. No neurological deficits, re-fractures, cement leakage, or other complications were observed in either group.
The researchers concluded that the mixed reality technique could effectively help surgeons navigate to the IVC region during the operation, shortening surgery time and improving patient outcomes. They also noted that mixed reality imaging greatly reduced the patient’s exposure to x-rays from the C-arm fluoroscopy machine, minimizing the use of harmful radiation.
“We consider this technology as a perfect approach for guidance in minimally invasive surgery,” the authors concluded.
The lead author of the study is Peiran Wei of the Department of Orthopaedics, Nanjing First Hospital, Nanjing Medical University. Co-authors include the department’s Qingqiang Yao, Yan Xu, Huikang Zhang, Yue Gu, and Liming Wang.