How Microsoft is getting under the skin of surgery with HoloLens

Mixed reality may be more synonymous with Pokémon Go than plastic reconstructive surgery, but virtual limbs projected onto real bodies have the potential to revolutionise the operating theatre.  


That, at least, is what a collaboration between Microsoft and Imperial College is aiming to prove. In London’s St Mary’s Hospital, the technology giant’s HoloLens is being used to help surgeons reconstruct damaged tissue, overlaying a virtual version of a patient’s anatomy before they start slicing. The technique has already been used in eight successful operations.

Dimitri Amiras, a musculoskeletal consultant radiologist on the project, is convinced: “You talk about a picture being worth a thousand words, I can’t tell you how many words mixed reality is worth.”

Plumbing flaps

When a person loses a large chunk of flesh, one way to fix the wound is to transplant tissue from another part of the body. This piece, called a flap, isn’t simply pressed into the hole, but instead needs to be carefully plumbed into the blood vessels around the wounded area.

“The key in the operation is blood supply,” explains Dr Philip Pratt, research fellow at Imperial College’s department of surgery and cancer. “You can’t just transfer a piece of tissue to the wound. It will die. It needs to be oxygenated. So these vessels supplying oxygen to the flap need to be plumbed into the existing blood supply.

“But you can’t see those vessels superficially, looking at the [skin’s] surface. This is the key: can we use mixed reality to accurately locate those vessels and save time during surgery?”


Currently, surgeons use Doppler ultrasound readings, CT scans and pen markings to locate blood vessels. This isn’t always accurate, and time can be wasted hunting for vessels that aren’t adequate for reconnection, or that may have been damaged by whatever caused the wound. For a procedure that can take 10-12 hours to complete, those wasted moments can be crucial to the success of the surgery.


(CTA imaging showing the location of perforating arteries with yellow arrows. Credit: Imperial College)

Enter HoloLens. Instead of translating scan data into two-dimensional images for surgeons to consult, the mixed-reality headset allows this imaging to be projected as a 3D hologram onto the body. In a paper published in January in European Radiology Experimental, the researchers behind the project describe it as “allowing the surgeon to ‘see through’ the patient’s skin”.

The work has resulted in eight successful clinical cases, all focused around operations on limbs. “We’ve proven feasibility,” says Pratt. “Now it’s time to scale it up.”

Automation and deformation

While projecting holograms from patient scans has demonstrably helped increase accuracy in the operating theatre, Pratt admits that the process is currently “too long”. A human expert still needs to segment the scan data and identify potential vessels before a 3D model can be made. With the use of artificial intelligence, the researchers hope to be able to automate this part of the procedure.

“The idea is that you could take your CT scan straight from the scanner, stick it through an algorithm, and out pops a bunch of voxel models,” Amiras tells Alphr. “You’ll have all the anatomy segmented out, with a tumour identified so the surgeon knows where to cut. That’s the dream.”

(A virtual overlay on a patient’s leg, using a HoloLens. Credit: Imperial College)
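To give a flavour of what “stick it through an algorithm, and out pops a bunch of voxel models” might involve, here is a deliberately simplified Python sketch. It only thresholds a CT volume by intensity and labels connected regions; the thresholds, names and synthetic data are illustrative assumptions, and the Imperial team’s actual pipeline (likely built on trained models) is not described in this detail.

```python
# Toy stand-in for automated CT segmentation: threshold Hounsfield units and
# label connected components into rough "voxel models". Thresholds and the
# synthetic volume below are illustrative assumptions, not clinical values.
import numpy as np
from scipy import ndimage

def segment_ct_volume(ct_hu: np.ndarray, lower: float = 150.0, upper: float = 600.0):
    """Return a labelled voxel volume and the number of candidate structures."""
    mask = (ct_hu >= lower) & (ct_hu <= upper)   # crude intensity gate
    labels, n_regions = ndimage.label(mask)      # split mask into connected blobs
    return labels, n_regions

# Synthetic volume standing in for scanner output, with one bright "vessel".
volume = np.random.normal(loc=0.0, scale=50.0, size=(64, 64, 64))
volume[20:30, 20:30, 10:50] = 300.0
labels, n = segment_ct_volume(volume)
print(f"found {n} candidate structures")
```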

The floating virtual limb also currently needs to be manually controlled by the surgeon, moved around and rotated using hand gestures. The hope is that future iterations will automatically snap the 3D model to its real-world counterpart, letting the surgeon focus on where they need to cut instead of wafting a 3D model through the air. There’s even scope for the models to be more reactive to changes in position, shifting tissue and blood vessels depending on whether the body part is being held up or squished against the operating table.
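That automatic “snapping” is essentially a registration problem: finding the rotation and translation that map the virtual model onto the tracked limb. One common building block is rigid point-set alignment from matched landmarks, sketched below with the Kabsch/SVD method. This is an assumption about how such a step could work, not a description of the Imperial or HoloLens implementation.

```python
# Rigid alignment of matched 3D landmarks (Kabsch algorithm): returns the
# rotation R and translation t that map model points onto patient points.
import numpy as np

def rigid_align(model_pts: np.ndarray, patient_pts: np.ndarray):
    """Both inputs are (N, 3) arrays of corresponding landmarks, e.g. points
    picked on the hologram and the same anatomical points tracked on the limb."""
    mc, pc = model_pts.mean(axis=0), patient_pts.mean(axis=0)
    H = (model_pts - mc).T @ (patient_pts - pc)     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = pc - R @ mc
    return R, t

# Once R and t are known, the whole mesh or voxel model can be transformed:
# aligned_vertices = (R @ model_vertices.T).T + t
```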

“You can essentially simulate the effect of gravity,” says Pratt. “So you start in a particular position, then you turn the [virtual] leg upside down and the simulation will change the position of the tissue.


“Mathematically it’s quite complicated, whereas we can solve 70% of the problem by having Dimitri put a cushion under the leg,” he adds. “It may be a really interesting problem to solve, but I always go for the simplest solution.”

With eight cases under their belt, and potential to start building HoloLens into medical school training, the researchers at Imperial are adamant that mixed reality has a crucial role to play in the future of surgery.

“You’ve got to find a problem and fix the problem,” says Amiras. “Don’t go looking for problems where there aren’t any. People take out appendixes all the time for appendicitis. You don’t need a HoloLens to do that. But find a pinch point, and see where it can be used to fix something.

“In some ways it’s no different to any other industry: don’t try and fix what isn’t broken.”
