Microsoft Refuses to Answer Key Questions About Child Porn in Bing's Search Results

Microsoft’s Bing search engine is facing heat for reportedly including images of child pornography in its search results. The issue was described on Thursday by TechCrunch, which said that, in response to an anonymous tip, it commissioned a study of the images surfaced by Bing from AntiToxin Technologies, an Israeli startup focused on online safety.

Images of child pornography, the definition of which does not require a minor to be engaged in a sexual act, are considered illegal contraband under U.S. federal law and are likewise illegal in at least 93 other countries. Attempting to duplicate the results of AntiToxin’s study, or otherwise access child pornography online, may lead to your arrest. You have been warned.

According to the report, searches on Bing for known pedophile terms return a wide range of illicit content depicting child abuse. Worse, an algorithm employed by the site to recommend similar search terms reportedly functioned normally in response to these searches, assisting the researchers in locating additional child porn images.

The study noted, for example, that researchers searched for a chat website popular among teens along with the word “kids.” Bing reportedly suggested other related searches, including one for 13-year-old girls. That term reportedly surfaced “extensive child pornography when searched.”

Gizmodo did not independently verify the results of AntiToxin’s research, which was reportedly conducted in conjunction with law enforcement. Microsoft confirmed there was an issue, but refused to answer any of Gizmodo’s questions and only provided a brief statement, the veracity of which seems questionable.

From TechCrunch’s report:

The evidence shows a massive failure on Microsoft’s part to adequately police its Bing search engine and to prevent its suggested searches and images from assisting pedophiles. Similar searches on Google did not produce as clearly illegal imagery or as much concerning content as did Bing. Internet companies like Microsoft Bing must invest more in combating this kind of abuse through both scalable technology solutions and human moderators. There’s no excuse for a company like Microsoft, which earned $8.8 billion in profit last quarter, to be underfunding safety measures.

TechCrunch reporter Josh Constine, who has previously reported on child exploitation over WhatsApp, said that Microsoft responded to inquiries by saying that an engineering team was currently working to address the problem. While some of the search terms used by AntiToxin appear to be banned now, he wrote, “others still surface illegal content.”

Microsoft appeared to claim otherwise. A public relations firm forwarded Gizmodo the following statement, attributed to Microsoft Corporate Vice President Jordi Ribas:

“Clearly these results were unacceptable under our standards and policies, and we appreciate TechCrunch making us aware. We acted immediately to remove them, but we also want to prevent any other similar violations in the future. We’re focused on learning from this so we can make any other improvements needed.”

When Gizmodo asked Microsoft to confirm that it had actually blocked all of the search terms that returned sexually explicit images of children—because TechCrunch was clear it had not done so—the company’s PR representative replied: “I checked with Microsoft and the company has nothing further to share.”

In light of the seriousness of the situation, you’d expect Microsoft to try to be as transparent as possible about the steps it’s taking to remedy the problem. That doesn’t seem to be the case. The company reportedly took a more defensive posture initially in response to the report, comparing itself to Google—never miss an opportunity!—and saying dumb shit like, “we do the best job we can.”

Microsoft’s best is apparently inadequate. Of the company’s initial response, Constine added:

Microsoft’s spokesperson refused to disclose how many human moderators work on Bing or whether it planned to increase its staff to shore up its defenses. But they then tried to object to that line of reasoning, saying, “I sort of get the sense that you’re saying we totally screwed up here and we’ve always been bad, and that’s clearly not the case in the historic context.” The truth is that it did totally screw up here, and the fact that it pioneered illegal imagery detection technology PhotoDNA that’s used by other tech companies doesn’t change that.

The circulation of images of children being abused is a plague on many platforms. It’s been an issue for Facebook, it was a problem on Vine, and it reportedly continues to be a problem for WhatsApp. How Microsoft responds to this report in both the short and long term is crucial. And simply blocking the obvious list of search terms AntiToxin’s team was able to conjure up is in no way sufficient. In fact, it may only be burying the problem.
