
Microsoft took an ethical stand on facial recognition just days after being blasted for a sinister AI project in China

Microsoft President Brad Smith. Pedro Fiúza/NurPhoto via Getty Images

  • Microsoft announced on Tuesday that it rebuffed a request from a US police agency to install its facial recognition software on officers' car and body cameras.
  • President Brad Smith cited human rights concerns, saying that AI bias could mean a disproportionate number of women and ethnic minorities would end up being held for questioning.
  • While Microsoft is taking an ethical stand on AI, less than a week ago it was accused of being complicit in helping China develop facial analysis AI, which could be used to oppress its Uighur Muslim population.

Microsoft President Brad Smith announced on Tuesday that the company refused a request from a US police department to install its facial recognition software, citing human rights concerns, Reuters reports.

Speaking at a Stanford University conference on ethical AI, Smith said Microsoft had received the request from a California law enforcement agency to install the technology in officers' cars and body cameras.

"Anytime they pulled anyone over, they wanted to run a face scan," Smith said, adding the officer would check the person's face against a database.

Read more: Artificial intelligence experts from Facebook, Google, and Microsoft called on Amazon not to sell its facial recognition software to police


He said the company concluded that the inherent bias in facial recognition — which is largely trained on white male faces — meant it would be less accurate at identifying women and people from ethnic minorities, who would therefore end up being held for questioning more frequently.

Smith called for tighter regulation of facial recognition and AI in general, warning that data-hungry companies could end up in a "race to the bottom." His comments come as pressure builds on Amazon to stop selling its Rekognition facial recognition software to law enforcement. Amazon shareholders are due to hold a vote on the issue on May 22.

However, Smith said Microsoft had provided the software to a US prison. Smith also told Business Insider in February that an all-out ban on selling facial recognition software to government agencies would be "cruel in its humanitarian effect."

Less than a week ago, the company's reputation took a bruising when it was accused of being complicit in helping a Chinese military-run university develop AI facial analysis, which critics said China could then use to oppress its citizens — specifically the country's Uighur Muslim minority.


Sen. Marco Rubio of Florida, one of the US government's most vocal China critics, described Microsoft's partnership with the Chinese military as "deeply disturbing" and "an act that makes them complicit" in China's human rights abuses.

Microsoft did not immediately respond to Business Insider's request for comment. 
