In the operating room, a surgeon’s every move is critical. Some surgeries, such as spinal procedures, require not only focus and precision, but also that a physician simultaneously look at both the patient and a computer that guides surgical instruments during the procedure. To make surgeries safer for the patient, an AdventHealth physician invented innovative eyeglasses for surgeons that tap into augmented reality (AR) technologies. Called iSight, these surgical eyeglasses overlay critical data, like vital signs, anesthesia information and imaging, directly in the surgeon’s view, eliminating the need for surgeons to turn away from the patient to look at a screen during a procedure.

DocWire News spoke with Dr. Chetan Patel, orthopedic surgeon, spine specialist and the executive medical director for spine surgery at the AdventHealth Neuroscience Institute, who saw a major need for this technology while in surgery and worked to patent this AR technology. See what Dr. Patel had to say.

DocWire News: Can you provide us with some background on yourself?

Dr. Chetan Patel: Sure. This is my 20th year in practice now, so I’ve been doing this for a while. I’ve spent most of my career not only practicing spine surgery, but also dedicated to research, teaching and education. So I’ve taught residents and fellows. I routinely teach surgeons that are in practice through my commitments to the North American Spine Society and the American Academy of Orthopaedic Surgeons. I specialize in technology, so I’m the North American Spine Society’s section chair for robotics and navigation. So exactly the sort of thing you’re talking about is my focus: how to use technology to achieve the best outcome possible.

What are some of the challenges surgeons face as it pertains to having to simultaneously focus their sight on different things?

As you can imagine, spine surgery is delicate and requires high accuracy. And one of the parts where we’re the most worried about accuracy is when we put in a screw. Our margin of error is one to two millimeters in structures that we can’t really see with our naked eye. So we rely on either x-ray or advanced technologies, such as navigation and robotics, to ensure that high level of accuracy to the best of our abilities. But that introduces a second problem, which is that in order for us to see that screen, we’ve got to position it where it would be ideal for us, which would typically be directly across from us. But that’s where our assistant needs to be. And even if we place it there, we would have to look up, and the monitor is small. What we end up doing, typically, is putting the screen, whether it be x-ray or navigation or robotics, off axis about 45 degrees to the side. And we end up having to look away from the patient, look up and twist our neck, or twist our body, so we can actually see that screen.

We’re going back and forth, so it creates several problems. One, now you’re taking a 20 to 30 inch screen and you’re putting it farther away from you, and you can oftentimes get a glare, so it’s really not an ideal position. Two, you’re wasting time. In the operating room, when the patient’s asleep, the patient’s under anesthesia, and the longer you take to do the surgery, the more the morbidity of the operation, the longer they’re under anesthesia. So we’re wasting time looking back and forth. And the third thing is, oftentimes, we are inaccurate. Now, thankfully, with these advanced technologies that doesn’t happen too frequently. But when we look into those cases where things are not accurate, it turns out it’s because we’re not looking at the instrument and the patient because we’re turned away looking at the screen.

If you think about it, a lot of those technologies track accuracy by tracking the end of the instrument. But the part that you really want to track is the one that’s inside the body, while what’s being tracked is what’s in our hand. For example, when we’re putting in that screw I was talking about, the screw is attached to the screwdriver, and the point where it connects is the point where it’s going to wobble if it hits something. If I’m looking at the patient and at my hand while I’m putting that screw in, and let’s say that it hits a retractor, it’s going to deflect, but I’m going to see it, so I’m going to say, “Well, I can’t do that. I’ll correct that.” But if I’m looking away at a screen, I can’t see that. So the screen may show that it’s perfect, but in reality, that’s not the trajectory that’s going into the patient.

That’s one of the potential errors that looking away at a screen, not at the patient, creates. And finally, there’s the discomfort to the surgeons from having to twist and turn. And I’ve had other physicians as well as news reporters ask me, “Can you help us understand what that means to an average person?” And the analogy that I usually use is to think about driving on a freeway at 70 miles an hour. You’re looking straight ahead and you’ve got your steering wheel. Well, what if I said, “I’m going to rotate your seat so it’s 45 degrees from your steering wheel”? Now, you’re actually facing the side of the freeway while you’re driving 70 miles an hour. You’re going to be looking straight ahead so you know where you’re going, but your steering wheel is off to the side and you’re twisted. How comfortable is that? Would you really be proficient at driving 70 miles an hour without getting into a car accident?

I think most of us that do spine surgery take it for granted that that’s what we have to do, and we accept it and move on. But when you take a step back, it really begs the question: do you really have to?

Talk to us about the iSight surgical eyeglasses – how do they work, and what benefits have you noticed from using them?

Yeah, I do. I’ve been using it since May of 2020, and I’m over 75 cases using it at AdventHealth Altamonte, which is where I practice. And the reason I created the glasses is that all the problems I just shared with you, which come from having to look away at a small screen and turn, are solved by iSight. What you do, as a surgeon, is essentially put on these glasses, and there’s some technology that has to go in behind the scenes to transport that screen to your glasses, but the bottom line is you can maintain your view on the patient and you can position whatever screen you want to see immediately above your working area. Now, that other screen is actually in your surgical view, and it’s the equivalent of a 60-inch screen. Remember I mentioned 20 to 30 inches before; now we’re doubling the size, and it’s a high-fidelity, crisp, OLED picture that you’re looking at, and you don’t have to twist and turn.

The advantage is, number one, it’s easier to see and more comfortable to do the surgery, because you don’t have to twist and turn. But more importantly, in my use, what I found in a prospective study I did is that it cuts the amount of time it takes to place that screw in half. Think about that. That’s a huge improvement in surgery. If you think about it, for an average surgeon that translates to about 10 minutes of operating room time. That’s not an insignificant amount of time when you think about 10 minutes less anesthesia time, 10 minutes less of blood loss. And that equates to, hopefully, better outcomes for the patient and better recovery. So more comfortable, certainly, a lot easier to see, a lot easier to do the surgery. Quite frankly, a lot less stressful to do that.

So going back to that car analogy, imagine rotating the seat back so you’re looking straight ahead again. That’s basically what I’m trying to do for all surgeons.

How will augmented reality innovations such as iSight shape the future of surgery?

Well, this is just the tip of the iceberg. I think that when we take a look at the problem that I’m trying to solve, which is bringing all the information to the surgeon in their view, you can start looking at other aspects of what this technology can do. We talked about the x-ray, robotics and navigation, but there are other critical pieces of information that I also look at. For example, if I’m doing tumor surgery, I’d want to look at that MRI or CT scan taken before surgery to know exactly where the tumor is. And oftentimes, what I have to do is actually walk away from the operating table to go look at those images and then come back. Imagine now having the opportunity to see multiple screens, so I can switch back and forth. It’s on-demand information.

Now, I can say, “Well, I’m doing this tumor surgery. I think I’m at the right spot. Let me see that image from before surgery to make sure it confirms that.” Or let’s say that I’m worried about how a patient is doing anesthesia-wise, and I need to maintain tight blood pressure to minimize bleeding. Having that in my view at all times shows where I am and ensures that I’m operating under optimal conditions. Those are some of the phase 1A or 1B things that you can do with existing technology. I think beyond that, we can get even more sophisticated by introducing holograms into the images we’re looking at. I think that’s where it’s headed. What we can achieve today is just the tip of the iceberg.

But the good news is, even with what we can achieve easily today, we can already show value in improving efficiency and driving better outcomes. And again, what I will tell you is that, quite frankly, after doing this, I’ve asked myself the question of why I ever had to look away. It didn’t make any sense to begin with. I just did it because that’s what I was taught. That’s what I’d come to expect, because there was not a better solution. But now, being at the other end, it is interesting to look back and say, “Why did I ever have to do that? I shouldn’t have had to do that to begin with.” I guess the technology wasn’t there yet, so there wasn’t the opportunity to have that.

Any closing thoughts?

What I would tell you is that, at the end of the day, this is an opportunity, not just in spine surgery, but in all of surgery, and even other procedures, to use this to enhance patient outcomes. Think about my partner, an orthopedic surgeon: he does fracture surgeries, and he does total knees, where he’s using robotics and navigation. I’ve got colleagues that are anesthesiologists who want to monitor multiple patients at the same time, and my technology allows you to actually see what’s in the other room, even when you’re not in the room. The other opportunities include, for example, going to the ICU. Right now, there are a lot of COVID patients, unfortunately, that are suffering, and they’re in the ICU and they get really sick. The ICU doctors have to put in these invasive lines so they can see what the heart function is, what the lung function is, and keep a close eye on patients to make sure that they do well.

And in order to put those in, they have the same problem as I do as a surgeon. They’re actually using a different technology, ultrasound, where they’re having to look at a different screen to do that. I do think that the opportunity to use this everywhere is broad, and it’s a wonderful opportunity, I think, for patients and physicians alike, because this can really make a big difference, and it’s very simple to use. All you have to do is put it on, and boom, you’re ready to go. You plug it in, you put it on. Oftentimes, when I’m looking at new technology as a spine surgeon, one of the questions I have to ask is, “Well, how hard is it to use?” And then you have to go through that learning curve. And then, “How hard is it to actually use consistently in surgery to see improved outcomes?”

And in the study that I did, you could actually get the benefit from day one. I’ve never experienced that before with any technology that I’ve used as a surgeon. I’m really excited about offering this, not just to spine surgeons, but really to anyone that needs to use a second screen to do their job.