
Apple Maps has surpassed Google Maps in detail… in 3.1 percent of the US

More detail, less information


Photo by Amelia Holowaty Krales / The Verge

iOS 12 began the rollout of Apple Maps’ long-awaited redesign, which will deliver maps with far more detail using data collected by Apple directly. The updated maps currently cover only around 3.1 percent of the US, centered on Northern California, but some interesting differences are already emerging between Apple’s maps and those that Google uses for its own navigation software. The differences are documented in excruciatingly fine detail in a post by digital cartography blogger Justin O’Beirne.

The good news for Apple is that the sheer amount of natural cartographical detail its map contains far outstrips what Google currently offers. Vegetation detail is a particular highlight, with Apple’s maps even showing grass between two lanes of a highway, or around the borders of individual houses.

Apple’s new maps contain a much finer level of vegetation detail than the competition.
Image: Justin O’Beirne

But it’s not just vegetation where Apple appears to have the detail edge. Its maps tend to show building footprints more accurately, and they even map golf courses, sports fields, and tennis courts in surprising detail.

Sports fields are mapped in phenomenal detail by Apple’s new software.
Image: Justin O’Beirne

To a certain extent, more detail is a good thing, but in some cases the sheer amount of geographical detail actually obscures useful information, as in this section of rural California.

The increase in detail occasionally means that useful information can be harder to see at a glance.
Image: Justin O’Beirne

Away from the landscape, the mapping of man-made structures is a little more hit and miss. Apple mislabeled some buildings, for example, even though their physical footprints are depicted more accurately than Google’s. The mapping of roads themselves appears to have improved significantly, however, showing certain minor roads that even Google doesn’t have. Disused historical routes have also been (correctly) removed.

O’Beirne also notes several instances where the heights of buildings in Apple’s maps are incorrect relative to each other (such as the fourth and fifth tallest buildings in San Francisco), or where roof details captured by Google (such as the dome on top of San Jose City Hall) are missing entirely. The post speculates that this might have happened because Apple is manually creating its maps, resulting in big disparities between different buildings. Google, meanwhile, relies more on algorithmic extraction, which is more consistent even if it misses finer details in some places.

Mistakes like these are minor in the grand scheme of things, but they matter in light of Google’s transformation of its mapping app from a piece of software that shows you how to get somewhere into a service that tries to tell you where you should be going in the first place. Google’s recent presentations suggest that the search giant is pretty happy with the quality of its mapping data, and that its focus is increasingly on providing accurate information about businesses.

O’Beirne’s post contains numerous examples of each of these phenomena along with many more comparison images, and it’s definitely worth reading in its entirety. Apple is clearly on the right track in this small bit of California, but it has a long way to go to match Google, which has benefited from years of community submissions and data collected and refined globally.