Apple’s New Map
Has Apple closed the gap with Google’s map?
2018 | Expired


❗️ This essay no longer reflects the current state of Apple Maps



Perhaps the biggest surprise about Apple’s new map is how small it is:

Four years in the making, it covers just 3.1% of the U.S.’s area and 4.9% of its population:

But don’t let its size fool you—it’s a dramatically different map from before, with a staggering amount of vegetation detail.

Here’s Marin County, just north of San Francisco:1

Napa Valley:

And Carmel Valley:

But Apple hasn’t just mapped the wilderness.

Cities are also noticeably more green, like San Jose:

And Sacramento:

But the most striking differences are in smaller cities farther away from the Bay Area, like Crescent City:

Crescent City is one of the 52 county seats located within the new map’s coverage area. Surprisingly, 25% of these county seats had no vegetation or green areas whatsoever on the old map—and now they look completely different.

Here’s Yuba City, county seat of Sutter County:

And Susanville, county seat of Lassen County:

And hundreds of other cities have equally dramatic differences.

But what’s really remarkable about this new vegetation detail is how deep it all goes—all the way down to the strips of grass and vegetation between roads:

And inside of cloverleafs:

And even around the corners of homes:

In an exclusive interview, Apple told TechCrunch:

We don’t think there’s anybody doing this level of work that we’re doing.

And that’s certainly true of this house-level vegetation detail. Nobody else has it:

Nor the cloverleaf vegetation:

Nor the green in the smaller cities, like Crescent City:

So where’s Apple getting it?

In “Google Maps’s Moat”, we saw that Google has been algorithmically extracting features out of its satellite imagery and then adding them to its map. And now Apple appears to be doing it too:

All of those different shades of green are different densities of trees and vegetation that Apple seems to be extracting out of its imagery.
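Apple hasn’t disclosed how its extraction pipeline actually works, but the classic remote-sensing technique for estimating vegetation density from overhead imagery is NDVI (Normalized Difference Vegetation Index) thresholding, which compares each pixel’s near-infrared and red reflectance. Here’s a minimal sketch in Python; the density buckets are illustrative guesses, not anything Apple has published:

```python
# A minimal NDVI-based vegetation-density sketch -- a common remote-sensing
# technique, NOT Apple's actual (undisclosed) pipeline.
import numpy as np

def vegetation_density(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Bucket each pixel by NDVI = (NIR - Red) / (NIR + Red).

    NDVI runs from -1 (water, pavement) to +1 (dense canopy).
    """
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    # 0 = bare, 1 = sparse grass, 2 = shrubs/light canopy, 3 = dense trees
    # (the bucket edges below are illustrative assumptions)
    return np.digitize(ndvi, bins=[0.2, 0.4, 0.6])

# Example: a 2x2 tile of near-infrared and red reflectance values.
nir = np.array([[0.8, 0.5], [0.3, 0.1]])
red = np.array([[0.1, 0.2], [0.2, 0.1]])
print(vegetation_density(nir, red))  # [[3 2]
                                     #  [1 0]]
```

Each bucket could then be rendered as its own shade of green, which would be consistent with the banded shading visible on Apple’s new map.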

But Apple isn’t just extracting vegetation—Apple seems to be extracting any discernible shape from its imagery:

And this is giving Apple many other details, like beaches:

Harbors:

Racetracks:

Parking lots:

Golf course details, like fairways, sand traps, and putting greens:

School details, like baseball diamonds, running tracks, and football fields:

Park details, like pools, playgrounds, and tennis courts:

And even backyard tennis courts:

But look again at that last image. The new map also has building footprints it didn’t have before.

And in addition to adding new building footprints, Apple is also upgrading many of the old ones—including most of San Francisco:

And as TechCrunch showed in its exclusive, some of these upgraded buildings are spectacularly detailed:

Looking at that specific building (Five Embarcadero Center) on the old and new maps, it’s a big difference from before:

But look at what’s happening to the tall building to the right (Four Embarcadero Center). It’s noticeably shorter now, and it looks about the same height as Five Embarcadero Center—which is peculiar because it’s actually twice as tall:

Even stranger, its height doesn’t match Apple’s imagery:

And notice what’s happening to the two towers on the left. On the imagery, the right tower is taller—but on the map, the left tower is taller. (The imagery is correct: the right tower is 48 feet taller than the left tower—but Apple’s new map shows the opposite.)

There’s a similar situation with San Francisco’s 4th and 5th tallest buildings:2

On the new map, San Francisco’s 4th tallest building is now shorter than San Francisco’s 5th tallest building:

And like we saw at Embarcadero Center, the heights don’t match the imagery:3

There are also detail inconsistencies between the imagery and the buildings—even with the buildings that most closely match the imagery:

So maybe Apple isn’t algorithmically extracting these buildings from its imagery?

And if that’s the case, it might explain why Apple’s buildings are missing the rooftop details that Google’s have, like the fans and air conditioners:

Or perhaps Apple is algorithmically extracting these buildings—but Apple’s algorithms just aren’t as advanced as Google’s yet? (Google has been extracting buildings from its imagery since at least 2012—so it has been working on this for twice as long as Apple.)

But if that’s the case, it doesn’t explain why the perimeters of Apple’s buildings are now more precise than Google’s:

This suggests that Apple’s extraction algorithms are more advanced than Google’s. But how can that be, given the inaccuracies and inconsistencies we saw earlier?

Then again, Apple told TechCrunch that its vans have been collecting ground-level lidar imagery—so maybe this explains the greater precision?

But if Apple’s buildings are lidar-derived, it doesn’t explain the shapes of certain buildings, like Salesforce Transit Center in San Francisco:

Salesforce Transit Center looks as if it was created by looking down from the air, rather than up from the ground.

And so do other buildings across San Francisco:

So if not from lidar, then where are Apple’s buildings coming from?

TechCrunch isn’t specific—which is surprising because we’re told so much about everything else, even the model of computer inside of Apple’s vans (Mac Pros). But TechCrunch does indicate that Apple’s buildings, vegetation, and sports fields are all made the same way:

Apple is also gathering new high-resolution satellite data to combine with its ground truth data for a solid base map. It’s then layering satellite imagery on top of that to better determine foliage, pathways, sports facilities, building shapes and pathways.

Looking back even further, there’s a clue buried inside of a 2016 Apple press release that announces the opening of a new office to “accelerate Maps development”:

The press release mentions RMSI, an India-based geospatial data firm that creates vegetation and 3D building datasets. And the office’s large headcount (now near 5,000) suggests some sort of manual, labor-intensive process.

Could this be the source of Apple’s buildings?

It’s not as far-fetched as it sounds. If RMSI is creating Apple’s buildings by manually tracing them from satellite imagery, it would explain how Apple’s building perimeters could be more precise than Google’s algorithmically generated buildings:

Manual creation would also explain the wide variation in detail from building to building. For example, AT&T Park in San Francisco is modeled to such a degree that even the Coca-Cola bottle (a children’s slide) is included:

But San Jose City Hall (often pictured as a background image in Bay Area newscasts) is so coarsely modeled that it’s missing its iconic dome:

If these buildings are the work of different modelers, it would explain the variations—and also the height inconsistencies we saw earlier.

And manual creation might also explain why still so few of Apple’s buildings are as detailed as Google’s (because manual creation doesn’t scale as quickly as automated algorithmic extraction):4

We saw earlier (via TechCrunch) that Apple’s buildings, vegetation, and sports fields are all products of the same process. Assuming that at least some of Apple’s buildings are manually created (though we can’t be sure), how many of these other shapes are also manually created?

And is this why Apple’s new map—four years in the making—only covers half of a state?

* * *

Regardless of how Apple is creating all of its buildings and other shapes, Apple is filling its map with so many of them that Google now looks empty in comparison:5

And all of these details create the impression that Apple hasn’t just closed the gap with Google—but has, in many ways, exceeded it...

...but only within the 3.1% of the U.S. where the new map is currently live.

So it’s a good thing then that Apple’s data collection effort seems to be accelerating...

* * *

CRISSCROSSING THE COUNTRYSIDE

It took Google’s Street View vehicles eight years to drive 99% of U.S. roads.6 But Apple might be doing it even faster:

Roughly 86% of U.S. roads lie inside of the counties that Apple says its vehicles have visited since June 2015. And though we don’t know if Apple has driven 100% of each county, Apple’s pace seems to be accelerating.7

All of this driving is giving Apple the data it needs to replace the road data it licenses from TomTom. And Apple appears to be doing just that—like here in San Francisco:

But Apple isn’t just replacing TomTom’s data—it’s improving upon it. For instance, look at how many road-related improvements Apple has made in this suburban neighborhood:

Surprisingly, the neighborhood above is just a few miles south of Apple’s headquarters—an area where Apple executives once thought its map was in good shape:

To all of us living in Cupertino, Maps seemed pretty darn good.

So if this was the state of TomTom’s road data in the Bay Area, imagine the state of its data elsewhere—especially in remote areas. And there are few California communities as small and remote as the tiny—but seismically fascinating—community of Parkfield:

Notice how many of Parkfield’s roads disappear on Apple’s new map.

When Apple’s vans visited, they likely saw nothing but empty fields where those roads were supposed to be:

Apple got those roads from TomTom—so why did TomTom think they were there?

Although Parkfield’s population is just 18 today, it was once a boomtown of 900 people at the end of the 1800s. But shortly after World War I, its mines were exhausted and its population plummeted. And by the 1940s, Parkfield had shrunk to its current size.

Notice that Parkfield’s 1943 street grid looks the same as it does today:

In other words, TomTom’s database somehow has roads from Parkfield’s boomtown days—roads that have been gone for more than 75 years. No wonder Apple removed them.

But in most communities, Apple is adding roads rather than removing them. And Markleeville, California’s smallest county seat, is a good example:

Notice that even Google doesn’t have all of the roads that Apple has added here:

But for all of the detail Apple has added, it still doesn’t have some of the businesses and places that Google has:

And there are also places that Apple labels differently from Google, like this one:

Apple says it’s the courthouse, but Google says it’s the general store. Who’s right?

Street-level imagery from Google and Bing confirms it’s the general store, while the courthouse is across the street:

It’s surprising that Apple mislabels the general store because TechCrunch said that Apple’s vans were capturing addresses and points of interest along the roads:

After the downstream data has been cleaned up of license plates and faces, it gets run through a bunch of computer vision programming to pull out addresses, street signs and other points of interest.

But what’s even stranger is that “Markleeville General Store” is written on both the front and the side of the building—and according to TechCrunch:

The computer vision system Apple is using can absolutely recognize storefronts and business names.

Yet the businesses that Apple is missing—but that Google has—all have signs along the road:

This suggests that Apple isn’t algorithmically extracting businesses and other places out of the imagery its vans are collecting.
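That’s puzzling, because reading a roadside sign is close to commodity computer vision these days. For a sense of how accessible it is, here’s a minimal sketch using the open-source Tesseract OCR engine (the filename and output below are hypothetical, and this is emphatically not Apple’s pipeline):

```python
# A minimal storefront-sign OCR sketch using off-the-shelf tools --
# an illustration of what's possible, not Apple's (undisclosed) system.
import pytesseract     # pip install pytesseract (wraps the Tesseract engine)
from PIL import Image  # pip install Pillow

def read_storefront(photo_path: str) -> str:
    """Return whatever text the OCR engine finds in a storefront photo."""
    return pytesseract.image_to_string(Image.open(photo_path)).strip()

# Hypothetical usage:
#   read_storefront("markleeville_general_store.jpg")
#   -> "MARKLEEVILLE GENERAL STORE"
```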

Instead, all of the businesses shown on Apple’s Markleeville map seem to be coming from Yelp, Apple’s primary place data provider:

Meanwhile, all of the businesses that Apple is missing are also missing Yelp listings (or have Yelp listings that are missing street addresses):

Yelp’s place database is only a third as large as Google’s:8

So if Apple’s place data is still coming from Yelp, it would explain why Apple has fewer places than Google here.

That said, there’s a place on Apple’s map with no Yelp listing at all: the “Alpine County District Attorney”. Even stranger, it appears to be a garage:

Alpine County’s website lists the D.A.’s address as a P.O. box at the community center, two miles down the road from the garage. So Apple seems to be misplacing both of the places it added to Markleeville—the D.A. and the courthouse:

Perhaps there’s some sort of larger issue with Markleeville.

But if that’s the case, it doesn’t explain why Bridgeport—the next county seat over from Markleeville—also has these issues. For example, watch what happens to Bridgeport’s police station between Apple’s old and new maps:

The old location was correct—and the new location is a shuttered gas station:

There’s a similar issue with Bridgeport’s post office—notice below that Apple and Google label it in different locations:

Google is correct, while Apple’s location is a trailer:

Meanwhile, Apple and Google also label Bridgeport’s library in different locations, two blocks apart:

And again, Google is correct and Apple isn’t:

And similar to what we saw in Markleeville, all of Apple’s misplaced places have writing on their exteriors:

So it just doesn’t seem as if Apple’s vans are “seeing” these buildings.

Nor does it seem as if these misplacement issues are confined to Bridgeport and Markleeville. Back in Parkfield, for instance, the cafe has shifted farther away from its actual location:

And remember the neighborhood we saw earlier with all of the road improvements?

Here’s a tweet from one of its residents:

And there are even misplacement issues in San Francisco. For instance, Apple labels San Francisco’s emergency command center across the street from its actual location:

But what makes all of these misplacement issues so surprising is Apple’s confidence it had resolved them:

When you look at places like San Francisco or big cities from that standpoint, you have addresses where the address name is a certain street, but really, the entrance in the building is on another street. They’ve done that because they want the better street name. Those are the kinds of things that our new Maps really is going to shine on. We’re going to make sure that we’re taking you to exactly the right place, not a place that might be really close by.

* * *

Everything above suggests that, at least in some areas, Apple isn’t extracting place information from the imagery its vans are collecting.9

And if that’s true, it’s unclear how Apple is going to build up a place database of its own—because Apple also isn’t doing a number of other things that Google is doing, such as its Local Guides program:

Google’s Local Guides program, started in 2015, now has 50+ million contributors continually creating and updating Google Maps’s place information.10

But 50 million is minuscule compared to the billions of people who can access and contribute to Google Maps via its website. Apple Maps’s website, meanwhile, has no map—only pictures of maps:

This is a problem for Apple because there are an estimated 4.2+ billion internet users worldwide—but only 1.3 billion active Apple devices (and Apple Maps can only be accessed via Apple devices):

So Google has a much larger pool of potential Google Maps contributors. And then on top of that, there’s all of the information scraped by its search engine:

More than 20% of Google searches are location-related. And though it’s unclear how much information Google’s web crawlers add back to Google Maps (addresses? phone numbers? hours? URLs?), Google has used search citations to prioritize Google Maps’s place icons.

Local Guides, a web presence, and a search engine—without these, and without extracting place information from street-level imagery, it’s unclear how Apple will amass a place database as accurate and comprehensive as Google’s.11

And this is a problem for Apple because an increasing number of Google Maps features are built upon place data:

But many of these features are also built upon the data that Google collects about its users—and according to TechCrunch:

Apple is working very hard here to not know anything about its users.

And this includes the places they visit:

Neither the beginning or the end of any trip is ever transmitted to Apple.

Given that places are the start and end points of every trip, this suggests that Apple wouldn’t be able to replicate Google’s “Popular Times” feature:12

Nor does it seem as if Apple could offer Google-style place recommendations because it isn’t capturing users’ location histories:13

We specifically don’t collect data, even from point A to point B. We collect data—when we do it—in an anonymous fashion, in subsections of the whole, so we couldn’t even say that there is a person that went from point A to point B.

But even ignoring Apple’s competitiveness with Google, Apple’s inferior place database also impacts its own stated ambitions in augmented reality (AR) and autonomous vehicles (AVs)—both of which heavily rely on accurate and comprehensive place information.

For instance, if Apple offered AR glasses today, would they correctly label Markleeville’s courthouse?

And would an Apple AV take you to Bridgeport’s post office? Or a trailer two blocks away?

But what’s interesting about AVs is that, if Apple were to realize its ambitions here, autonomous vehicles would eliminate the need for Apple Maps’s dominant use case: turn-by-turn driving directions.

AVs navigate themselves—so all we’ll really need to know is where we want to go. And Google, with a rapidly growing autonomy project of its own, seems to have caught on to this.

If you zoom out on Google Maps’s recent features, you’ll notice that they’re increasingly about figuring out “where to go?”:

And even Google’s map seems to be following this pattern. Over the last two years, Google has gradually been turning it inside out, from a road map to a place map:14

Is Google future-proofing itself against a not-too-distant world that has little need for driving directions? Whether or not that’s true, it does seem as if place information might be even more important tomorrow than it is today.15

Of course, Google Maps offers more than just directions for drivers. And here again Google seems to be preparing for the future, and AR appears to be a big part of its plans:

But if you watch Google’s AR navigation demo, you’ll notice that there isn’t much of a “map”; it’s mainly just labels—especially place labels:

In that sense, AR maps are less like traditional maps and more like the “satellite” maps we already have:

Traditional maps are half shapes, half labels—but satellite and AR maps drop the shapes, and keep just the labels. And this spells trouble for Apple...

Remember what we saw earlier: Apple is making lots of shapes out of its imagery:

But Apple doesn’t appear to be making labels out of its imagery:

Nor does Apple appear to be making labels out of its shapes. For instance, here in San Francisco, Apple has added shapes for these baseball fields—but the baseball fields don’t appear in Apple’s search results, nor are they labeled on the map:

In other words, Apple doesn’t appear to have added these shapes to its place database.

The same goes for these San Francisco basketball courts: Apple has added shapes for them, but they don’t appear in Apple’s search results—nor do they have labels:

And the same for these tennis courts:

Unless they’re already listed on Yelp, none of the shapes Apple has added appear in its search results or are labeled on its map. And this is a problem for Apple because AR is all about labels—but Apple’s new map is all about shapes.
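To make the shapes-versus-places distinction concrete, here’s a minimal sketch with an invented schema (Apple’s actual data model is undisclosed): a shape is something the renderer can draw, while a place is something search can find, and a shape that never becomes a place is invisible to search.

```python
# An invented illustration of shapes (drawable) vs. places (searchable) --
# not Apple's actual data model.
from dataclasses import dataclass

@dataclass
class Shape:
    """A polygon the map renderer can draw; search knows nothing about it."""
    kind: str                           # e.g. "baseball_field"
    polygon: list[tuple[float, float]]  # (lat, lon) vertices

@dataclass
class Place:
    """An entry in the place database: searchable and labelable."""
    name: str
    category: str
    location: tuple[float, float]

shapes = [Shape("baseball_field", [(37.770, -122.447),
                                   (37.770, -122.445),
                                   (37.768, -122.446)])]
places = [Place("Markleeville General Store", "grocery", (38.694, -119.779))]

def search(query: str) -> list[Place]:
    # Search consults only the place index, so a field that exists only
    # as a Shape is rendered on the map yet can never be found.
    return [p for p in places if query.lower() in p.name.lower()]

print(search("baseball"))  # [] -- the diamond is drawn, but unsearchable
```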

So is Apple making the right map?16



* * *

1  Apple’s new map was released to the public on September 17, 2018 as part of iOS 12.

Unless otherwise noted, all screenshots of Apple’s old map were taken between September 10, 2018 and September 17, 2018. And all screenshots of Apple’s new map were taken between September 17, 2018 and September 24, 2018.

By the time you read this, Apple’s map may have changed. ↩︎


2  As recently as 2016, these two buildings were San Francisco’s second and third tallest. ↩︎


3  These building height regressions are surprising because they contradict TechCrunch’s claim that Apple’s buildings are now “more accurate”:

Better road networks, more pedestrian information, sports areas like baseball diamonds and basketball courts, more land cover, including grass and trees, represented on the map, as well as buildings, building shapes and sizes that are more accurate. A map that feels more like the real world you’re actually traveling through. ↩︎


4  Consider that just two years after it started adding algorithmically extracted buildings to its map, Google had already added the majority of the U.S.’s buildings. But after four years, Apple has only added buildings in 64% of California and 9% of Nevada. ↩︎


5  All of this new detail is not without cost. In many areas, Apple Maps’s roads are now harder to see than before. ↩︎


6  In 2014, Google told Wired that its Street View vehicles had “now driven more than 7 million miles, including 99 percent of the public roads in the U.S.”

Given that Google started driving in 2006, this tells us that it took Google’s Street View vehicles eight years to drive 99% of the U.S.’s public roads. ↩︎


7  If you watch the timelapse closely, you’ll notice an almost year-long pause in driving between February 2016 and December 2016. Did Apple hit some sort of technical snag during this period? And is this why, in the middle of this period, Apple partnered with RMSI? ↩︎


8  Part of the reason Yelp’s place database is so much smaller than Google’s is that Yelp is largely focused on businesses with consumer-facing storefronts. And you can see the consequences of this on Apple’s map, especially with government-related places. ↩︎


9  Or maybe the issue is that Apple’s extraction algorithms just aren’t as good as Google’s yet?

Of course, part of the reason Google’s algorithms are so good is that Google has been using all of us to train them. ↩︎


10  Another advantage of the Local Guides program is that Google owns everything that’s contributed, including all of the photos.

(It wouldn’t surprise me if Google was scanning these photos for additional information—e.g., accessibility information, menus, prices, etc.—to add back to Google Maps.) ↩︎


11  Part of the reason these other forms of data collection are so important is that Apple’s vans can’t go everywhere, like inside of theme parks.

For instance, here’s California’s Great America, a theme park just seven miles from Apple’s headquarters. ↩︎


12  It’s odd that Apple refuses to track trip start/end points but sees nothing wrong with mapping tennis and basketball courts in people’s backyards. ↩︎


13  Apple Music competes with Spotify’s algorithmically-generated playlists by offering human-curated playlists. So maybe Apple Maps can compete with Google Maps by offering human-curated “playlists of places” for neighborhoods and cities? ↩︎


14  Even the map’s icons seem to symbolize a larger change. Pin-shaped icons were once given exclusively to search results and trip destinations:

But now they’re given to every place:

It’s almost as if Google is saying this is now a map of destinations—all of the places it’ll be able to take you to, someday soon. ↩︎


15  Google’s ambitions here seem to run far deeper than being just another Yelp or Foursquare. If you zoom out on everything Google is doing, you see the makings of a much larger, end-to-end travel platform. ↩︎


16  Even though I’m questioning whether Apple is making the “right” map, I’m very excited that Apple is building its own map. The world needs a high quality, privacy-focused mapping platform more than ever, and I very much want to see Apple succeed in this space. ↩︎