Hololens 2: Microsoft's Enterprise AR Power Play For Windows Mixed Reality


Alex Kipman, Technical Fellow, Cloud & AI at Microsoft

Article by Anshel Sag.

After months of anticipation, Microsoft recently launched its Hololens 2 headset at MWC 2019 in Barcelona. It was quite a launch, complete with surprises and some controversy. Many developers and enterprises were eagerly awaiting this headset’s arrival to determine whether they should develop for platforms like Magic Leap or Hololens. The fact that Satya Nadella himself started the press conference shows that Microsoft is taking Hololens 2 very seriously—he typically doesn’t do that for hardware announcements. Let’s take a closer look at the new headset.

The hardware

Microsoft announced that the new Hololens 2 has a 43-degree horizontal and 29-degree vertical field of view, which works out to roughly a 2.37x increase in viewing area over the previous generation. As someone who used the original Hololens, with its severely limited field of view, I was quite impressed by this significant improvement. I still believe that Hololens has room for improvement with its field of view, but every AR headset struggles with this right now. AR headsets and the optics that power them are fundamentally more constrained by size and optical technology than VR headsets; as a result, VR headsets offer considerably wider fields of view, at 110 degrees and beyond.
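As a rough sanity check on that 2.37x figure, here is a minimal sketch comparing the angular area of the two fields of view. The 30-degree by 17.5-degree numbers for the original Hololens are commonly cited estimates, not figures from this article, so treat them as an assumption.

```python
# Back-of-the-envelope check of the field-of-view increase.
# Assumption (not from the article): the original HoloLens is commonly
# cited at roughly 30 degrees horizontal x 17.5 degrees vertical.
hololens1_fov = (30.0, 17.5)   # degrees (assumed figures)
hololens2_fov = (43.0, 29.0)   # degrees (as stated in the article)

def angular_area(fov):
    """Approximate the FOV as a rectangle in square degrees (a simplification)."""
    horizontal, vertical = fov
    return horizontal * vertical

ratio = angular_area(hololens2_fov) / angular_area(hololens1_fov)
print(f"Approximate area increase: {ratio:.2f}x")  # prints ~2.38x with these inputs
```

The exact multiplier depends on which first-generation figures you use, but the result lands right around the "more than 2x" area increase Microsoft described.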

As expected, the Hololens 2 is lighter and considerably more compact than the previous generation. While there are no official weight or size figures yet, according to Microsoft the headset is designed to fit on your head like a baseball cap. The company accomplished this, in part, by moving the batteries to the back of the device to counterbalance the optics and PCB in the front. This allows the headset to sit more evenly on your head, without neck or head fatigue. The previous generation sat more on the bridge of the nose and forehead, and it left a line across my head after wearing it for a while. Admittedly, I have not worn the Hololens 2 for an extended period of time, but I’m optimistic that it can be worn comfortably for longer periods of time.

Two things that I was pleasantly surprised to see in Hololens 2 were eye-tracking and hand-tracking. I have written extensively about the future of XR, and I believe that both are absolute musts for future devices. If you are interested in reading more, I wrote a paper about my thoughts on eye-tracking for augmented reality and virtual reality. The key benefit of a technology like eye-tracking is the ability to use your eyes as an interface, requiring little to no movement of the hands. A good example is a training demonstration that teaches you how to fix something: if you look at the wrong place to plug in a cable, the application is aware of that and can warn you before you make the mistake. In addition to the user interface benefits, eye-tracking can also be used for user authentication: once you log into the headset, you never have to use a password again. This is effectively what Windows Hello already does, but it makes using the headset and switching between users even easier. Eye-tracking and AI can also be used to infer head movements before they actually happen.
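To make the gaze-as-interface idea concrete, here is a purely illustrative sketch of how a training app might compare where the user is looking against the expected target for the current step. None of these names (Step, check_gaze, the port labels) come from a real Hololens or Windows Mixed Reality API; they are hypothetical stand-ins for whatever the eye-tracking runtime exposes.

```python
# Illustrative only: using gaze to preempt a mistake in a guided training step.
from dataclasses import dataclass

@dataclass
class Step:
    instruction: str
    correct_target: str   # the object the user should be looking at

def check_gaze(step: Step, gazed_object: str) -> str:
    """Return a prompt based on where the user is actually looking."""
    if gazed_object == step.correct_target:
        return f"Good: {step.instruction}"
    return f"Careful: that is '{gazed_object}', not '{step.correct_target}'."

step = Step("Plug the cable into port A.", correct_target="port_A")
print(check_gaze(step, gazed_object="port_B"))  # warns before the user acts
print(check_gaze(step, gazed_object="port_A"))  # confirms the correct target
```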

As a baseline interface, I also believe every standalone headset should have hand-tracking. We use our hands constantly in day-to-day interactions, so being able to use them in augmented reality should be possible as well. Microsoft does this well with the Hololens 2, although the applications demonstrated were limited. Hand tracking is a great way to establish a base level of user interaction; it can be improved upon with other devices, such as a controller, but it does not require any additional hardware to use the device effectively. I cannot count how many times I have tried to use a headset only to realize that I didn't have a controller, or that it wasn't charged. Hand tracking can also be paired with haptic feedback systems, from companies like Ultrahaptics, to create physically tangible holograms in AR.

In order to power all of these major upgrades, Microsoft also made a significant improvement to the internal hardware of the Hololens. Microsoft upgraded the Hololens 2 from the original's Intel Atom x5-Z8100 (which operates at roughly 1 GHz with four cores) to a Qualcomm Snapdragon 850 (which runs up to 2.96 GHz with eight CPU cores). I believe that Microsoft's decision to go with the Snapdragon 850 was the right one; it delivers a considerable amount of performance at very low power, an area where the previous generation drew complaints. The Adreno 630 GPU inside the Snapdragon 850 is also more than capable of displaying high-poly graphics in AR; the previous generation was severely lacking in this area, and many graphics lacked detail or looked cartoonish. The Hololens 2 truly addresses most of the concerns people had with the first-generation headset, and it does so at a much-reduced price point of $3,500 (down from $5,000).

Hololens 2 also features the ability to flip the visor up and down easily so that you don't have to take the headset off to talk to people or look at something in the real world. This became a bit of a meme after the launch event, with tons of people posting GIFs and videos of themselves flipping the visor up and down, but it is also a testament to how well-balanced and easily worn the headset is.

The Software

Microsoft made a lot of investments with Hololens 2 in being open and embracing the broader industry. That includes support for open standards like OpenXR, which are designed to make cross-platform spatial computing more user- and developer-friendly. I believe that Microsoft is wise to take this approach; the reality is that the market needs more cohesion. Microsoft clearly understands its role as an enabler, which is why it spun up applications like Guides and Remote Assist within Microsoft Dynamics 365 for Mixed Reality. Guides lets virtually anyone build a guided application that walks someone through a step-by-step process, like a multi-point inspection or the assembly of something complex. Microsoft is also launching a companion smartphone application so that users who are not in headsets can utilize the platform as well.

In addition to Microsoft Dynamics 365, Microsoft also utilizes Azure to power Spatial Anchors, which supports ARKit, ARCore, and Hololens. Spatial Anchors allows all of these devices, Microsoft or not, to share the same digital assets in the same physical space, an ability that has great potential for collaboration. Microsoft also announced Azure Remote Rendering for Hololens. With Hololens squarely targeted at the enterprise, it's important to remember the size of the datasets that many of Microsoft's customers work with; some of these assets can reach 100 million polygons. Remote Rendering gives the Hololens 2 more graphical horsepower than it has onboard. It can be thought of as a type of split rendering, where the heavy rendering is done in the cloud and the result is streamed to the headset. This allows for much richer graphics than the internal low-power GPU can handle on its own.
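To illustrate the split-rendering idea (this is a conceptual sketch, not the actual Azure Remote Rendering SDK, whose API is not described in this article), the loop looks roughly like this: the headset samples its head pose locally, a cloud renderer rasterizes the heavy scene for that pose, and the device only decodes and displays the streamed frame. All class and method names below are hypothetical.

```python
# Conceptual sketch of split (remote) rendering. Hypothetical names throughout.
import time

class CloudRenderer:
    """Stands in for a cloud GPU service that can handle ~100-million-polygon scenes."""
    def render(self, scene: str, head_pose: tuple) -> bytes:
        # In reality this would return an encoded video frame for the given pose.
        return f"frame({scene}, pose={head_pose})".encode()

class Headset:
    """Stands in for the low-power device: it tracks pose and displays frames."""
    def current_pose(self) -> tuple:
        return (0.0, 1.6, 0.0)  # simplified head position

    def display(self, frame: bytes) -> None:
        print("displaying", frame.decode())

cloud, device = CloudRenderer(), Headset()
for _ in range(3):                      # a few iterations of the streaming loop
    pose = device.current_pose()        # 1. sample the latest head pose locally
    frame = cloud.render("factory_model_100M_polys", pose)  # 2. render remotely
    device.display(frame)               # 3. decode and composite on the headset
    time.sleep(1 / 60)                  # illustrative frame pacing
```

In practice, latency compensation matters enormously in a loop like this, which is one reason pose tracking and final compositing stay on the device while only the heavy rasterization moves to the cloud.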

Azure is a huge part of what Microsoft is doing, and I believe it is ultimately what drives the company's vision for spatial computing and the Windows Mixed Reality platform. Naturally, Microsoft wants to welcome as many application developers and headsets as possible onto its Azure platforms to drive more and more Azure cloud usage. The more non-Hololens application developers Microsoft can get to use its mixed reality tools, the more Azure utilization it will realize. That, I believe, is what is driving Microsoft's open approach to this launch and why it partnered with Epic Games on Unreal Engine support. Additionally, Hololens 2 will run Mozilla's Firefox Reality browser, which is open source and built around open XR standards.

The demos

Microsoft had four demo booths at MWC 2019, showing a wide array of applications already developed for Hololens 2: a training app powered by Microsoft Guides inside Microsoft Dynamics 365, a medical education app by Pearson, a multi-user architecture app by Bentley Systems, and an AR experience built with PTC's Vuforia Studio platform. I managed to try out all of the demonstrations except Bentley Systems'.

Before all the demonstrations, Microsoft had everyone put on their headsets and adjust the fit. Then Microsoft fired up the calibration program, which calibrates eye-tracking by having the user look at a polygon that moves from corner to corner. Once eye tracking is calibrated, the user holds out a hand to confirm that hand tracking is working; if it is, a little digital hummingbird hovers over the open hand wherever it moves.

The first demo I tried was PTC’s Vuforia Studio demonstration. Vuforia is an extremely flexible tool that purports to make it easier to design and build spatial computing apps. Vuforia has been in the augmented reality space for a very long time—in fact, it was one of the first AR SDKs in existence when it was introduced to me in 2012 by Qualcomm. Back then, AR was mostly focused on smartphones—we could barely envision the wearable technology we have today. Vuforia’s demonstration gave users an idea of the kinds of enterprise apps one could easily build with Vuforia Studio. I believe that the partnership between Vuforia and Microsoft will help enable more businesses to take advantage of AR and Hololens 2. Ultimately, Microsoft and PTC need to create as many tools as they can to enable the building of AR apps.

The Pearson booth featured an educational demonstration in which medical students are taught how to interact with patients. Pearson is one of the leading providers of educational materials in the world and is without a doubt one of the most ambitious when it comes to adopting new technologies in the classroom. Using both the hand-tracking and voice-recognition capabilities of the headset, students question a patient about their symptoms, taking note of the answers and body language to reach a proper diagnosis. While I don't think I'll be a doctor any time soon, it was pretty clear how higher education could benefit from the use of Hololens and the Windows Holographic platform. Obviously, at the current price, this probably isn't a K-12 technology. Still, there may be subsequent versions targeted towards a younger demographic.

Last but certainly not least was the demo of Guides, Microsoft's platform for building step-by-step instructions for completing certain tasks. This app falls within Microsoft's Dynamics 365 platform and runs on Azure, so it is optimized for that cloud. It is also simplified so that almost anyone can build a guided tour or training guide for an intern or new employee. The demo I experienced had me assemble part of what I believe was a plane by routing a cable through different clamps and connecting it to the right connector. Afterwards, I got to see all of the powerful analytics that come with the application, which allow an administrator to see where people are getting caught up or where there is the most variance in the time taken to complete a task. I found it very interesting and compelling; it was both immersive and data-rich in terms of actionable insights. I would have liked to see eye-tracking integrated with this application so that users could be guided based on their gaze, preempting any mistakes. This would serve two purposes: first, it would make the user experience feel more seamless and natural, and second, it could reduce the rendering of unnecessary graphics.
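As an illustration of the kind of step-level analytics described above (the data and step names are made up, not Guides' actual schema), a few lines of Python can surface the steps where completion time is most inconsistent, which is roughly the signal an administrator would use to find where trainees get stuck.

```python
# Illustrative sketch: rank training steps by how inconsistent completion times are.
from statistics import mean, pstdev

step_times = {  # seconds per trainee, per step (hypothetical data)
    "route cable through clamp 1": [42, 40, 45, 41],
    "route cable through clamp 2": [55, 90, 48, 120],
    "seat connector":              [30, 33, 29, 31],
}

report = sorted(
    ((name, mean(times), pstdev(times)) for name, times in step_times.items()),
    key=lambda row: row[2],   # high spread = trainees handle this step very differently
    reverse=True,
)
for name, avg, spread in report:
    print(f"{name:32s} mean={avg:6.1f}s stdev={spread:6.1f}s")
```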

The future

I think Hololens 2's rollout and ultimate success will depend heavily on how many partners Microsoft works with to make the headset accessible to developers who are interested in AR. Many developers already develop for Hololens, which works to Microsoft's benefit, but the tools need to be developer-friendly and encourage easier application development. This is why Guides and Vuforia Studio are going to be so important for Microsoft and Hololens. The eye-tracking and hand-tracking capabilities have yet to be fully exploited, and I believe it would be smart for Microsoft to host hackathons with developers to explore the possibilities now that the hardware can do both well. The lack of decent eye-tracking and hand-tracking, in my opinion, is a huge part of what has been holding AR technology back; Hololens 2 does both better than any previous AR headset.

I believe that the Hololens 2 solves many of the pain points of the previous generation. As such, I think it will have much broader success and bring AR much further into the enterprise. I am excited to see what other partnerships Microsoft announces, especially in medicine, education, and research; I believe there are still many untapped markets where AR and Hololens can shine. The company should also start to think about how it can drive 5G and utilize the mobility of Hololens to empower enterprises. I was a bit surprised that Microsoft didn't mention 5G at its MWC press conference, especially considering that 5G was the talk of the show. Both the 5G operators and Microsoft have something to gain from partnering to accelerate 5G with AR. A 5G-enabled Hololens might be in our future, possibly as soon as next year once Qualcomm ships its next-generation SoC with a built-in 5G modem. Ultimately, even just tethering to a 5G connection could be a huge boost for Hololens. Perhaps Microsoft should even consider partnering with HTC on its 5G hub, which offers 5G connectivity and WiGig for low-latency mixed reality.

That aside, Hololens 2 brings AR into the next phase of spatial computing, in which most of the necessary experiential problems are solved. Now the technology just requires applications to use it properly. It will take some time for this to happen, but Microsoft seems quite committed and understands the importance of building a platform that developers want to use. Between the Hololens 2 and the HP Reverb (which I recently reviewed here), the Windows Mixed Reality platform is starting to look much more mature and capable. The next step is enticing more ISVs and partners to buy in and start building real business applications for this great hardware.

Anshel Sag is a Moor Insights & Strategy associate analyst focusing on mobility and virtual reality.

Disclosure: Moor Insights & Strategy, like all research and analyst firms, provides or has provided research, analysis, advising, and/or consulting to many high-tech companies in the industry, including Microsoft, Intel, and Qualcomm. The authors do not hold any equity positions with any companies cited in this column.
