In a letter to Congress today, IBM CEO Arvind Krishna said the company will no longer offer general-purpose facial recognition technology or analysis software, nor will it research or develop the technology. The letter was addressed to Sens. Cory Booker (D-NJ) and Kamala Harris (D-CA) and Reps. Karen Bass (D-CA), Hakeem Jeffries (D-NY), and Jerrold Nadler (D-NY).
In the letter, Krishna proposed new reforms for the responsible use of technology and the fight against systemic racial injustice and police misconduct, and voiced support for the Justice in Policing Act of 2020.
“IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency,” Krishna said in the letter. “We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies.”
Krishna, who took over as chief executive in April of this year, suggested that it is time for Congress to launch a national discussion on the effects of facial recognition technology and on whether and how it should be used by domestic law enforcement agencies.
The CEO also voiced concern about the racial bias often found in today’s artificial intelligence systems. Krishna called for greater oversight and auditing of AI technologies, particularly when used in law enforcement, and for national policies that “bring greater transparency and accountability to policing, such as body cameras and modern data analytics.”
Thanks to developments in artificial intelligence, facial recognition software has advanced dramatically over the last decade. At the same time, the technology — often controlled by private companies with little regulation or federal oversight — has been shown to suffer from bias along lines of age, race, and ethnicity, which can make it unreliable for law enforcement and security uses and ripe for potential civil rights abuses.
For example, a December 2019 study by the National Institute of Standards and Technology found “empirical evidence for the existence of a wide range of accuracy across demographic differences in the majority of the current face recognition algorithms that were evaluated.” The technology has also come under fire for its role in privacy infringements.
Notably, the NIST report did not include technology from Amazon, one of the few giant technology companies that sells facial recognition tools to law enforcement. Amazon’s program, Rekognition, has likewise been criticized for its accuracy: in 2018, the American Civil Liberties Union found that Rekognition incorrectly matched 28 members of Congress to faces picked from 25,000 public mugshots.
IBM has sought to address the problem of facial recognition bias, releasing a public data set in 2018 intended to help reduce bias in the training data used for facial recognition systems. But in January 2019, IBM was found to have shared a separate training data set of nearly one million images taken from Flickr without the subjects’ permission — though the pictures had been shared under a Creative Commons license.
IBM said in a statement at the time that the data set would be accessible only to verified researchers and included only images that were publicly available. The company also said that people could opt out of the data set.
People familiar with the situation told CNBC that George Floyd’s death, and the renewed focus on police reform and racial inequity that followed, prompted IBM’s decision to shut down its facial recognition products.