Microsoft’s HoloLens 2 Puts a Full-Fledged Computer on Your Face

Microsoft wants the $3,500 headset to be the most advanced mixed reality computer out there.
Microsoft wants the HoloLens 2 to be the most advanced mixed reality computer out there.

Quinn Russell Brown

If you ask Alex Kipman to name the most significant advancement in the brand-new version of HoloLens, Microsoft’s mixed-reality headset, he’ll say the answer is yes. It's not an evasion of the question—it's evidence of his excitement.

Kipman, Microsoft’s technical fellow for AI and mixed reality, gets excited about “all of the things” in the HoloLens 2. When pressed, though, it comes down to three key improvements: It’s more comfortable, it’s more immersive, and it offers more out-of-box value than the first HoloLens. Kipman uttered this mantra—“comfort, immersion, out-of-box value”—frequently during my day-long visit to Microsoft's headquarters last month, like someone who had been well coached by his communications staff. Later, when an editor asked me what was new about the HoloLens 2, I realized the mantra was still rattling around in my brain, as though it had been transmitted through the headgear.

The HoloLens 2 is more comfortable than the first headset, and more immersive. Its field of view has more than doubled, thanks to a new kind of patented imaging technology. It has an AI processing unit and now connects to Azure, Microsoft’s cloud service.

Alex Kipman, Microsoft’s technical fellow for AI and mixed reality, wearing the HoloLens 2

Quinn Russell Brown

Whether the $3,500 headset provides more out-of-box value is a call for its commercial customers to make. This isn’t a headset you’ll use for gaming or for sending interactive poop emoji to friends, or one that the average consumer will ever wear at all. It’s not for “knowledge workers” like me and Kipman, people who sit at their desks all day, he says. It’s for people whose jobs are being digitally transformed—people who work in design or manufacturing, who fix gear shifts and work on oil rigs, military personnel.

Try to forget, for a second, that HoloLens is a headset. Kipman thinks about it more as a full-fledged computer for a futuristic world of remote workers in need of expertise. And Microsoft is determined to make it the most advanced mixed-reality computer out there. That much is clear, even if all of the use cases for it haven’t crystallized yet.

HoloLens History

To grasp the significance of HoloLens 2, it helps to know its origin. The earliest seeds for HoloLens were planted as far back as 11 years ago. It was born out of Kinect, the Xbox peripheral that used a variety of sensors to compute depth maps and recognize humans within its field of view. Kipman is credited with inventing Kinect, and in 2010, he began channeling some of the Kinect's technology into a head-mounted holographic computer. It was known then as Project Baraboo, but it would later become HoloLens.

When HoloLens officially launched in 2016, it was a 1.3-pound head-mounted display with depth-sensing cameras and an optical projection system that beamed holographic images directly into your eyes. While wearing one, you might see anything from a floating web browser to a cartoon fish in a bathtub to a three-dimensional motorcycle—all while still seeing the real world around you. Or you might see a remote technician pop up in your eye frame and show you how to fix a light switch. It isn’t a consumer device now, and it certainly wasn’t then, but Microsoft was trying to show off a wide variety of applications that could be easily grasped by regular people.

The HoloLens was available only to developers when it first launched, since Microsoft wanted to spur development of new apps. (No AR or VR headset is worth the money without compelling apps; that was true then and is still true now.) Later that year, a version of HoloLens started shipping to any consumer in the US or Canada who had $3,000 to spend.

The first HoloLens wasn’t a “success” in the way that you might describe the success of other technology products, whether that’s based on sales, ecosystem lock-in, or pure cachet. In some ways, it wasn’t meant to be a blockbuster hit in a public-facing way. But it was the first mixed-reality wearable that ran on a holographic-specific operating system—and it wasn’t a pair of lightweight smart glasses. It was an untethered headset running Windows 10, which meant it was an actual working face computer.

Still, early customers had their complaints: It was heavy, it was unwieldy, it didn’t feel immersive enough. And Microsoft heard them, loud and clear.

Put Your Heads Together

One of the most obvious updates to HoloLens 2 is its build. The first HoloLens was front-heavy, a whole bunch of components loaded onto your forehead. For this new version, Microsoft split up the pieces, positioning the lenses and some computing power in the front and moving the rest of it to the back.

Microsoft’s senior director of design, Carl Ledbetter, calls this a split-architecture design. It came loaded with its own engineering challenges, because cables had to run between the front and back parts of the headset. These are now built into the arms of HoloLens 2. Ledbetter says this new form factor was critical to achieving a certain level of comfort and balance on the new model. “With HoloLens version one, there were just a lot of things we didn’t know we didn’t know,” Ledbetter says as he leads me around Microsoft’s Human Factors lab. “But luckily, since it’s been out there for three years, we’ve been able to talk to a lot of customers.”

The Human Factors lab is a cavernous space filled with as many mannequin heads as human ones; the latter are bent over their desks, toiling on the latest designs. There are also ear molds, gesture-control wristbands, custom-made eye depth gauges. For the past three and a half years, Ledbetter and his team have used these tools to design a new HoloLens headset that would fit well on 95 percent of heads, regardless of gender, ethnicity, or age. It’s not just about finding the right fit, Ledbetter says, but about having empathy for the wearer. At one point, he hands me an intentionally oversized Xbox gaming controller. “There,” he says. “You’re five years old.”

Ledbetter and his team have scanned over 600 human heads in the Human Factors lab. A hundred other people have been put through “stress tests” with HoloLens 2 prototypes—asked to watch a long movie or play the tabletop game Jenga or converse with other humans. The goal was to have people forget they were wearing it, ideally for up to two hours. In some cases, “we were getting more than two hours, and people weren’t taking it off at all,” Ledbetter says. Some tests involved sensors, attached to subjects’ necks, that measured muscle load or fatigue. Ledbetter claims, based on this data, that the new HoloLens is three times as comfortable as the old one.

I wore HoloLens 2 for a few brief demos during my visit to Microsoft, and it’s undeniably more comfortable than the first version. It also weighs less, though only by a matter of grams. The click-wheel on the back of the headset, which loosens or tightens the HoloLens around your head, is less clicky than the first one. Microsoft says the battery life should match the first HoloLens, so around three and a half hours. Kipman says he looks forward to the day when people run out of battery life on the HoloLens, meaning they've worn it for a session that long.

A series of 3-D-printed molds of what would become the brow pad on HoloLens 2

Quinn Russell Brown

There are material upgrades too. The front enclosure is made of carbon fiber, which is supposed to keep it cool and light. It has anodized aluminum cooling channels that dissipate heat from the headset’s custom-made processor. The silicone back pad, the part that’s now affixed to the back of your head like a gray piece of toast, has a microtexture that’s designed to give just the right amount of grip without ripping your hair out.

The thing that might make the most difference, at least for Microsoft’s target audience, is an old trick applied to a new headset: The front enclosure can now be flipped up, like cool-dude clip-on sunglasses. If you’re working in the field or on an assembly line and need to quickly switch between holographic instructions and conversing with a real live human being, you can just lift the lens up. Kipman delights in showing this off, lifting the “visor” up, pulling it down again. “Holograms everywhere!” he says when the lens enclosure resumes its downward position.

These new features—the split architecture, the cooling mechanisms, the hinge that makes visor mode possible—were in the works before the optics on HoloLens 2 were finalized, Ledbetter says. But the optics are what make holograms happen, and they are by far the most interesting part of this new HoloLens.

Beam in Your Eye

Last summer, news reports emerged that Microsoft had filed a patent with the US Patent and Trademark Office back in 2016 that described expanding the field of view on a display using MEMS laser-scanning technology. MEMS refers to microelectromechanical systems, which involve miniaturized electrical and mechanical components. According to academic journals, lasers have been a part of MEMS research and applications for decades. That part of Microsoft’s patent filing wasn’t new. What was new was Microsoft’s proposed method of modulating MEMS mirrors to direct lasers in a way that created greater angles, and as a result, a larger field of view.

On the original HoloLens, the field of view—that virtual eye box through which you see holographic content—was not very big. At all. The stuff you were looking at often got cut off or edged out of frame if the object was too big for the window, or if you moved your head a certain way. Jeremy Bailenson, the founding director of the Virtual Human Interaction Lab at Stanford University, cowrote a 2017 white paper about the social influence of “virtual humans,” having used AR and VR headsets as part of the study. In it, the writers describe in painful detail the limitations of the HoloLens’ narrow field of view.

“From an empirical standpoint, we know that field of view is critically important,” Bailenson tells WIRED. “It causes people to have a better overall experience, because they can move their head in a natural way.”

So this was obviously one of the aspects of HoloLens that Microsoft had to improve upon. And it did. The first HoloLens had a 34-degree diagonal field of view; the new headset’s field of view has “more than doubled,” Kipman says, with a 52-degree diagonal. (Microsoft declined to share exact measurements for the new eye box, saying that the x-axis and y-axis are not the best way to think about the FOV improvements. But much of the expansion was in the vertical dimension.)

The HoloLens optics team also managed to maintain a resolution of 47 pixels per degree while expanding the eye box. This means that, while the first HoloLens had the equivalent of two 720p displays, one for each eye, this new face computer has the equivalent of a 2K display for each eye. And the lens stack has been reduced, going from three lens plates down to two.
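
For the math-inclined, pixels per degree is just an axis’s pixel count divided by its field of view in degrees. The per-axis numbers in this quick sketch are assumptions (Microsoft confirmed only the diagonal figures and the 47-pixels-per-degree claim), but they show why the display had to grow along with the view to keep holograms looking sharp.

```python
# Back-of-the-envelope check of the pixels-per-degree claim.
# The horizontal pixel counts and per-axis field-of-view angles below are
# illustrative assumptions, not Microsoft-confirmed specs.

def pixels_per_degree(pixels: int, fov_degrees: float) -> float:
    """Angular resolution along one axis: pixels divided by degrees of view."""
    return pixels / fov_degrees

# HoloLens 1: a 720p-class image (1280 px wide) over an assumed ~30-degree horizontal view.
print(round(pixels_per_degree(1280, 30)))  # -> 43

# HoloLens 2: a 2K-class image (2048 px wide) over an assumed ~43-degree horizontal view.
print(round(pixels_per_degree(2048, 43)))  # -> 48, in line with the quoted 47 ppd
```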

In my own experience wearing HoloLens 2, I still found myself coming up against the edges of the eye box. A hologram of a woman named Hannah, who gave me a rundown of Microsoft’s campus construction project in a taped holographic video, still became headless or footless if I moved too close to her. Same with the tops of windmills, which were part of a topography demo later in the day, one in which I could use my (real) hands to pinch and zoom the holograms in a new way. So, while the field of view has been improved and the content appears crisper, we’re not quite at the point of holograms, uninterrupted, just yet.

Both Kipman and Zulfi Alam, who manages Microsoft’s optics engineering team, acknowledge that the visual experience on HoloLens 2 isn’t totally immersive yet. It’s the mechanical method by which they’ve increased the field of view, though, that they appear to be most excited about. The MEMS mirrors that Microsoft is using are the “largest small mirrors in the world,” Alam says. The mirror looked like a speck of debris on a conference room table; when I picked it up to peer at it, I could see it was a tiny reflective disc on my fingertip.

Normally, with a DLP, LCD, or LCoS projector in a headset, light particles are spit out, refracted, bounced off of lenses, and beamed back into your eyes, essentially tricking them into seeing holograms. (The original HoloLens used an LCoS projector.) And HoloLens has enough sensors to know your head position in space, so it knows where to beam these images into your pupils in order to convince you you’re seeing things. Using the MEMS mirrors, which are strobing 54,000 times per second, HoloLens 2 is now splitting that light apart and reconstituting it at every single pixel. “It’s replicating your pupil multiple times,” Alam says.

The benefit of doing it this way, he says, is that when you want to increase the field of view, you just change the angles of the mechanical system. You don’t have to build a bigger backplane to create a bigger field of view, which would then increase the overall size of the product. Like the HoloLens’ physical redesign, this innovation also presents new challenges—such as developing the software to make this all work properly. “The control logic becomes very complicated,” Alam says.
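
The geometry behind that claim is simple enough to sketch: a mirror tilted by some angle deflects a beam by twice that angle, so widening the mirrors’ sweep widens the view without a larger display panel. The tilt values below are purely illustrative, not HoloLens specifications.

```python
# Why a scanning-mirror display grows its field of view by changing angles,
# not panel size: reflection doubles the mirror's tilt angle.
# Tilt values here are illustrative only, not HoloLens specs.

def optical_sweep_deg(mechanical_tilt_deg: float) -> float:
    """Full optical sweep for a mirror oscillating between +/- mechanical_tilt_deg."""
    mechanical_range = 2 * mechanical_tilt_deg   # from -tilt to +tilt
    return 2 * mechanical_range                  # reflected beam moves twice as far

for tilt in (5.0, 7.5, 10.0):
    print(f"mirror tilt +/-{tilt:.1f} deg -> optical sweep of {optical_sweep_deg(tilt):.0f} deg")
```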

Toward a New Reality

That’s all on the inside. Externally, HoloLens has to make itself useful to the enterprise customers who will be using this thing out in the wild. Microsoft has done some work there too.

For example, HoloLens 2 supports more advanced gesture controls. Before, you could use your finger in a Redrum-like way to select holographic tiles that appeared before your eyes. You could also use a “bloom” gesture, a kind of hand-cupping motion, to go back to the Start menu on your holographic desktop. If you gazed at a holographic app icon long enough, you could highlight it.

Now you can walk up to a virtual object in HoloLens 2 and manipulate it with your hands, twirl it, resize it, even press or punch it. The headset’s new eye-tracking tech means you can read a news story on a holographic browser, and the page will scroll for you—look, Ma, no hands. All of the earlier gestures still work, but it’s these new kinds of interactions that Microsoft believes will help usher in a future where mixed reality feels more natural.

The author wearing HoloLens 2

Quinn Russell Brown

Microsoft is also touting new cloud-based “spatial anchors,” designed to let people access holographic app features even if they’re not wearing a HoloLens. Let’s say I’m wearing a HoloLens 2, but you’re not; you’re on an iPhone or Android smartphone. Both of us should be able to look at that holographic rendering of Microsoft’s campus construction project at the same time, provided that the app developer built the app that way.
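
Under the hood, a shared spatial anchor is essentially an ID that maps a spot in the real world to pose data stored in the cloud, which any device can look up. Here’s a minimal conceptual sketch of that flow; the names and structures are hypothetical stand-ins, not the actual Azure Spatial Anchors SDK.

```python
# Conceptual sketch of cloud spatial anchors: a HoloLens pins a hologram to a
# real-world pose, uploads the anchor, and shares its ID; a phone resolves the
# same ID to render the hologram in the same physical spot. Everything here is
# a hypothetical stand-in, not the real Azure Spatial Anchors API.

import uuid
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple    # (x, y, z) in meters, in a shared world frame
    rotation: tuple    # orientation as a quaternion (x, y, z, w)

CLOUD_ANCHOR_STORE = {}    # stand-in for the cloud service's storage

def create_anchor(pose: Pose) -> str:
    """HoloLens side: persist the anchor and get back a shareable ID."""
    anchor_id = str(uuid.uuid4())
    CLOUD_ANCHOR_STORE[anchor_id] = pose
    return anchor_id

def resolve_anchor(anchor_id: str) -> Pose:
    """Phone side: look up a shared ID and recover the pose to render against."""
    return CLOUD_ANCHOR_STORE[anchor_id]

# The headset wearer pins the campus model to a tabletop...
shared_id = create_anchor(Pose(position=(0.0, 0.8, 1.5), rotation=(0.0, 0.0, 0.0, 1.0)))
# ...and a phone that receives shared_id places the same model in the same spot.
print(resolve_anchor(shared_id))
```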

Since app developers are still a critical piece of the HoloLens ecosystem, Microsoft is rolling out what it calls Dynamics 365 Guides, a prebuilt set of software features that will slot right into instructional apps for HoloLens. Want to teach someone how to fix the gear shift on an ATV? It should take you mere minutes to build that HoloLens app, Kipman says, not months.

But Kipman, who has been at Microsoft for 18 years, sees HoloLens as something much bigger than just a headset that runs hologram apps. To him, it’s part of a technological revolution, one that happens every 30 years or so. In the 1950s there was the CPU; in the 1980s, the GPU. Each was responsible for handling a certain amount of computing workload.

“Thirty years later, notice the pattern,” he says. “You can call it whatever you want, and we happen to call it the holographic processing unit, but the devices of the future will all have a CPU, a GPU, and some type of AI unit.” HPU 1.0 was the first instance of Microsoft’s holographic processing unit. HPU 2.0, present in the new HoloLens, is “perfect for algorithms, for machine learning,” Kipman says. “We also created deep neural-network cores that are optimized to run these models.”

HoloLens 2 also now connects directly to Microsoft’s Azure cloud service, which Kipman says makes the head computer “light up in a different way.” This means that certain AI tasks are offloaded to the cloud, which results in more precision—like the difference between one-centimeter spatial mapping and one-millimeter spatial mapping—but might also take a few extra seconds for the headset to process things. Kipman insists that certain enterprise customers are OK with that latency.
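
In other words, it’s a routing decision: keep the quick, coarser result on the headset, or spend a few seconds of round-trip time for the finer cloud result. Here’s a toy sketch of that trade-off, with illustrative numbers rather than Microsoft’s.

```python
# Toy illustration of the on-device vs. cloud trade-off Kipman describes:
# coarse spatial mapping resolved instantly on the headset, finer mapping
# resolved in Azure at the cost of latency. All numbers are illustrative.

ON_DEVICE = {"where": "HPU (on device)", "resolution_m": 0.01,  "latency_s": 0.0}
IN_CLOUD  = {"where": "Azure (cloud)",   "resolution_m": 0.001, "latency_s": 3.0}

def choose_spatial_mapping(latency_budget_s: float) -> dict:
    """Pick the finer cloud result only if the task can tolerate the round trip."""
    return IN_CLOUD if latency_budget_s >= IN_CLOUD["latency_s"] else ON_DEVICE

print(choose_spatial_mapping(latency_budget_s=0.1))   # real-time guidance -> on device
print(choose_spatial_mapping(latency_budget_s=10.0))  # precise site survey -> cloud
```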

“I think if you’re talking about vision picking, the hot new thing in the logistics industry, where front-line workers process packages without scanners in their hands, you might go with something much lighter,” says J. C. Kuang, an analyst at Greenlight Insights who closely covers AR and MR. “That’s when you might go with Google Glass or an older model of Vuzix. But if you move into, say, architectural engineering construction to look at data on a work site, a much more involved computational process, then there are benefits to using HoloLens with AI operations running in the cloud.”

Plus, Kuang adds, it makes sense that Microsoft would use Azure in any way it could. “In a vacuum, without even talking about augmented reality, Azure is evolving into a more and more important revenue stream for Microsoft,” he says.

Attaching HoloLens to Azure might also be part of a larger strategy: one that allows Microsoft to avoid the “hype cycle,” as Kipman puts it. There are products, he says, that everybody believes will take over the universe overnight, which then leads to a “trough of disillusionment, because it doesn’t do that.” Some products make it to the other side of the chasm; some find their place in a niche market. But they’re not going to take over computing.

“Then there are those things that are transformative,” Kipman says. “They really do live side by side with other eras of computing and push forward democratization and innovation to an order of magnitude. I do believe mixed reality is that. But, you know, we haven’t—and we’re not going to—overhype it.” Kipman, his mantra temporarily forgotten, was suddenly crystal clear.

