Should Research into Facial Recognition Continue? Expert View

Three major companies have dropped their research into advanced facial recognition software, a decision largely influenced by the political situation in America. Let's look at how facial recognition software is connected to the Black Lives Matter protests.

Facial recognition has been used extensively by companies to advance their businesses in various ways, and by governments and law enforcement around the world to detect and suppress criminal activity, sometimes pre-emptively. But, as always, with great power comes great responsibility and an easy potential for misuse. In the past year, two major civil protests have been subdued by their respective governments: in Hong Kong, against China, and in Ecuador. The Chinese government has used facial recognition to suppress civil unrest for years, but the practice came to a head during last year's Hong Kong protests, when violent suppression was assisted by the city-wide monitoring system. In Ecuador, a similar system was used to suppress protests against actions taken by the country's President. This is clearly only the beginning of the potential dangers of this technology, and it hasn't even matured.

In America, there is huge political unrest over the near-unchecked authority of police officers and their treatment of minorities, African Americans in particular. The killing of a defenseless George Floyd by a police officer has touched the hearts and minds of people everywhere. American law enforcement also uses facial recognition in its regular operations; we have all seen over-the-top movie scenes where investigators match faces with sci-fi-like technology and accurately identify the suspect. In reality, the technology is surprisingly effective if you are white, but the data show it has misidentified people of color time and time again. An article published in December 2019 in the Washington Post reported that Asian and African American people were up to a hundred times more likely to be misidentified, depending on the algorithm used. The article was based on a federal study and is largely accurate.

Amazon, IBM, and Microsoft decided to step away from this technology because of the politics associated with it, but the result is ultimately good: the potential for misuse is extremely high, and the world and its governments are not ready for further advances in this area. Let us hope that other companies follow suit. We will no doubt revisit this technology once it has more clearly beneficial applications.