Facebook shares are taking it on the chin today as the Cambridge Analytica story unfolds and we learn just how insecure our Facebook data has been. The mainstream press has — as usual — understood only parts of what’s happening here. It’s actually worse than the press is saying. So I am going to take a hack at it here. Understand this isn’t an area where I am an expert, either, but having spent 40+ years writing about Silicon Valley, I’ve picked up some tidbits along the way that will probably give better perspective than what you’ve been reading elsewhere.

Much of this is old news. There are hundreds, possibly thousands, of companies that rely on Facebook data accessed through an Application Programming Interface (API) called the Graph API. These data are poorly protected and even more poorly policed. So the first ideas to dispel are that the personality test data obtained by Cambridge Analytica were in any way unusual, or that keeping those data past their sell-by date was unusual, either. That doesn’t absolve the original researcher of blame, but the Cambridge folks could easily have found the same data elsewhere or even generated it themselves. It’s not that hard to do. And Facebook has no way to make you throw the data away, or even to know that you haven’t.
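To make that concrete, here is a minimal sketch, in Python with the requests library, of what pulling data through the Graph API looks like once an app holds a user access token. The token and the field list are placeholders for illustration, and the friends lookup at the end reflects the older permission model under which an app could read friends’ profiles as well.

```python
import requests

# Minimal illustration only: a third-party app holding a user access token
# asks the Graph API for whatever the user's authorization covers.
# The token and the field list are placeholders, not real credentials.
GRAPH = "https://graph.facebook.com"
USER_ACCESS_TOKEN = "EAAB...placeholder"

# Pull the authorizing user's own profile fields.
me = requests.get(
    f"{GRAPH}/me",
    params={
        "fields": "id,name,likes,location",   # whatever scopes were granted
        "access_token": USER_ACCESS_TOKEN,
    },
).json()

# Under the older permission model, an app could also walk the user's
# friends list and harvest their profile data the same way.
friends = requests.get(
    f"{GRAPH}/me/friends",
    params={"access_token": USER_ACCESS_TOKEN},
).json()

print(me.get("name"), "authorized this app")
print("friend records returned:", len(friends.get("data", [])))
```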

Facebook never really tried to protect its data in any big way. They have a rate limiter to slow down the number of pulls through the API, but it is (or maybe was, depending on events of this week) very lenient. The only trick is getting Facebook members to authorize you. Facebook’s safe harbor, you see, is the fact that you have authorized this specific release of personal data. Often, however, the Facebook member has no idea they have authorized anything.

Much like Nigerian spammers purposely include spelling errors in their emails to trap “dumb” people, the quizzes on Facebook about “Which Star Wars character are you?” exist just to get you to authorize them. Then they go harvest your data. The authorization is built into the terms of service you accept when you take the test.
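If you wonder what “authorize” actually means here, it is a bog-standard OAuth handoff. Here is a rough sketch of the consent link a quiz app might construct; the app ID, redirect URL, and scope list are hypothetical.

```python
from urllib.parse import urlencode

# Rough sketch of the consent step behind a quiz's "Start" button.
# The app ID, redirect URL, and scope list are hypothetical.
APP_ID = "1234567890"
REDIRECT_URI = "https://example-quiz.com/fb-callback"
SCOPES = "public_profile,email,user_likes"   # whatever the app asks for

login_dialog = "https://www.facebook.com/dialog/oauth?" + urlencode({
    "client_id": APP_ID,
    "redirect_uri": REDIRECT_URI,
    "scope": SCOPES,
    "response_type": "token",
})

# Clicking through the dialog hands the app an access token carrying
# every scope listed above, buried in terms of service nobody reads.
print(login_dialog)
```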

So don’t take any Facebook quizzes, surveys, or tests — EVER.

The aspect of this story that ought to be of most concern to Facebook members is that once I have authorized someone to use my Facebook data, I have authorized them to use not only my data but also that of all my Facebook friends!

As of this morning I have 2,980 Facebook friends. If I were stupid enough to authorize the release of my data, I’d be authorizing the release of all the data on 2,980 other people, too. Now maybe I have more Facebook connections than most people, but you can see how getting only a few thousand survey responses can yield hundreds of thousands of records.

I’m told the average active Facebook member has 250 friends, so one person signing up hands over 250 full and complete profiles. It’s a broken system with no way to ever opt out.
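Run the numbers and you can see how fast the harvest scales. A quick back-of-the-envelope sketch, using the 250-friend average and the few-thousand-responses scenario from above:

```python
# Back-of-the-envelope math on the friends multiplier. The 250-friend
# average comes from the column above; nothing here is measured data.
avg_friends = 250

for quiz_takers in (1, 2_000, 5_000):
    # each respondent exposes their own profile plus all their friends'
    exposed = quiz_takers * (avg_friends + 1)
    print(f"{quiz_takers:>6,} responses -> ~{exposed:,} profiles")
```

Even a modest quiz turns a couple thousand clicks into half a million profiles.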

One thing Facebook needs to do, then, is give its users an easy way to opt out of any and all such data scams after the fact.

Why would Facebook allow such a system to even exist? Some reporters have pointed out that selling data is, itself, a business for Facebook. Actually, it’s not that big a business, certainly not big enough to justify this shit storm. The more likely reason for such lax behavior is that it drives up the numbers that are of such interest to Wall Street. Total number of users and activity on the Facebook site matter to Wall Street, along with ad revenue. And while Facebook is opposed to driving higher numbers with robots, accessing the data on 2,981 members after I click send isn’t a robot at work, it’s just stupid old me, a longtime member. No bots here.

Wall Street always wants higher and higher activity, so Facebook has had little impetus to be good.

I’ll end on a rumor. I have no idea whether it’s true or not, but the story is going around and it has a ring of truth to me. Just as Cambridge Analytica crunched the numbers and figured out Facebook data could be a valuable tool for influencing voter behavior, Facebook CEO Mark Zuckerberg reportedly came to the same conclusion. Zuck, who has harbored dreams of public office, apparently found the inspiration for those dreams by realizing what an incredible manipulation tool he had at his disposal. He could manipulate his way into office.

Regular readers may remember that one of my predictions for 2018 was that Zuckerberg would this year give up his Presidential ambitions.

I’ll mark that prediction as correct.