By Aaron Kesel
Amazon has released an update to its controversial Amazon Rekognition software which “improves the ability to detect more faces, increases the accuracy of facial matches and decreases the potential for false matches.”
The update is available in all AWS Regions supported by Amazon Rekognition – regions whose data center locations were previously leaked by WikiLeaks – U.S. East (N. Virginia), U.S. East (Ohio), U.S. West (Oregon), AWS GovCloud (U.S.), EU West (Ireland), Asia Pacific (Tokyo), Asia Pacific (Mumbai), Asia Pacific (Seoul), and Asia Pacific (Sydney). Amazon wrote in a blog post:
Today we are announcing updates to our face detection, analysis, and recognition features. These updates provide customers with improvements in the ability to detect more faces from images, perform higher accuracy face matches, and obtain improved age, gender, and emotion attributes for faces in images. Amazon Rekognition customers can use each of these enhancements starting today, at no additional cost. No machine learning experience is required.
Amazon has also added better “face detection” for those not-so-clear images marred by low contrast, shadows, low quality, or bright lighting.
“‘Face detection’ tries to answer the question: Is there a face in this picture? In real-world images, various aspects can have an impact on a system’s ability to detect faces with high accuracy. These aspects might include pose variations caused by head movement and/or camera movements, occlusion due to foreground or background objects (such as faces covered by hats, hair, or hands of another person in the foreground), illumination variations (such as low contrast and shadows), bright lighting that leads to washed out faces, low quality and resolution that leads to noisy and blurry faces, and distortion from cameras and lenses themselves. These issues manifest as missed detections (a face not detected) or false detections (an image region detected as a face even when there is no face). For example, on social media different poses, camera filters, lighting, and occlusions (such as a photobomb) are common. For financial services customers, verification of customer identity as a part of multi-factor authentication and fraud prevention workflows involves matching a high-resolution selfie (a face image) with a lower resolution, small, and often blurry image of a face on a photo identity document (such as a passport or driving license). Also, many customers have to detect and recognize faces of low contrast from images where the camera is pointing at a bright light.”
Amazon also states it has addressed the problem the ACLU found with its software: false positives, i.e., mismatched faces.
With the latest updates, Amazon Rekognition can now detect 40 percent more faces – that would have been previously missed – in images that have some of the most challenging conditions described earlier. At the same time, the rate of false detections is reduced by 50 percent. This means that customers such as social media apps can get consistent and reliable detections (fewer misses, fewer false detections) with higher confidence, allowing them to deliver better customer experiences in use cases like automated profile photo review. In addition, face recognition now returns 30 percent more correct ‘best’ matches (the most similar face) compared to our previous model when searching against a large collection of faces. This enables customers to obtain better search results in applications like fraud prevention. Face matches now also have more consistent similarity scores across varying lighting, pose, and appearance, allowing customers to use higher confidence thresholds, avoid false matches, and reduce human review in applications such as identity verification. As always, for use cases involving civil liberties or customer sentiments, where the veracity of the match is critical, we recommend that customers use best practices, higher confidence level (at least 99%), and always include human review.
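Amazon’s recommended practice above – a high similarity threshold (at least 99%) plus human review – amounts to a simple filtering step on candidate matches. The sketch below illustrates that idea only; the function name, data shapes, and scores are hypothetical and are not part of the Rekognition API.

```python
# Hypothetical illustration of the "high confidence threshold + human review"
# practice Amazon recommends for sensitive use cases. The match data and
# function name are made up for the example, not taken from Rekognition.

def triage_matches(matches, threshold=99.0):
    """Split candidate face matches into auto-accepted and human-review piles.

    `matches` is a list of (subject_id, similarity_percent) tuples, such as
    a face-search call might return. Only matches at or above `threshold`
    are accepted automatically; everything below it is flagged for human
    review instead of being treated as an identification.
    """
    accepted = [m for m in matches if m[1] >= threshold]
    needs_review = [m for m in matches if m[1] < threshold]
    return accepted, needs_review

# Example: three candidate matches from a face search (scores are invented).
candidates = [("subject-17", 99.4), ("subject-02", 87.1), ("subject-88", 99.9)]
auto, review = triage_matches(candidates)
# auto   -> [("subject-17", 99.4), ("subject-88", 99.9)]
# review -> [("subject-02", 87.1)]
```

Note that the ACLU’s Congress test reportedly used the default 80% threshold; at 99%, the 87.1 match above would go to a human instead of being reported as a hit.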
Earlier this year, the American Civil Liberties Union of Northern California tested Amazon’s Rekognition facial recognition software, and the program erroneously – and hilariously – identified 28 members of Congress as people who have been arrested for crimes.
According to Jake Snow, an ACLU attorney, the ACLU downloaded 25,000 mugshots from a “public source.”
The ACLU then ran the official photos of all 535 members of Congress through Rekognition, asking it to match them up with any of the mugshots—and it ended up mismatching 28 members to mug shots.
Out of those 28, the ACLU’s test flagged six members of the Congressional Black Caucus, including Rep. John Lewis (D-Georgia).
Facial recognition historically has resulted in more false positives for African-Americans.
The test came just two months after the Congressional Black Caucus wrote to Amazon CEO Jeff Bezos expressing concern over the “profound negative consequences” of the use of such technology.
The ACLU is rightfully concerned that faulty facial recognition scans, particularly against citizens of color, could result in a fatal interaction with law enforcement. Amazon’s Rekognition has already been used by a handful of law enforcement agencies nationwide.
Because of these substantive errors, Snow said the ACLU as a whole is again calling on Congress to “enact a moratorium on law enforcement’s use of facial recognition.”
Activist Post has previously reported on another test of facial recognition technology in Britain which resulted in 35 false matches and one erroneous arrest. The technology has thus been demonstrated to be far from foolproof.
Numerous civil rights organizations have also co-signed a letter demanding Amazon stop assisting government surveillance; and several members of Congress have expressed concerns about the partnerships.
Amazon essentially shrugged off employee and shareholder concerns; the head of the company’s public sector cloud computing business responded that the team is “unwaveringly” committed to the U.S. government.
“We are unwaveringly in support of our law enforcement, defense and intelligence community,” Teresa Carlson, vice president of the worldwide public sector for Amazon Web Services, said July 20th at the Aspen Security Forum in Colorado, FedScoop reported.
Amazon has publicly promoted how police have used its face recognition software to identify people of interest to law enforcement. On Amazon’s website, a systems analyst with Oregon’s Washington County explained how Rekognition was fed a database of 300,000 arrest photos to match against faces seen in surveillance images. It’s significant to note that when a person is arrested they are typically put into a database, whether they are convicted of a crime or not.
In May, the ACLU released troubling internal documents, including an email from a Washington County official, telling Amazon they were using Rekognition to identify “unconscious or deceased individuals” as well as “possible witnesses.”
The privacy concerns are obvious and growing as the U.S. deploys face recognition systems in airports and at borders; and even schools are installing cameras which soon could be equipped with facial recognition.
In June, 20 groups of Amazon shareholders sent CEO Jeff Bezos a letter urging him to stop selling the company’s face recognition software to law enforcement.
“We are concerned the technology would be used to unfairly and disproportionately target and surveil people of color, immigrants, and civil society organizations,” the shareholders, which reportedly include Social Equity Group and Northwest Coalition for Responsible Investment, wrote. “We are concerned sales may be expanded to foreign governments, including authoritarian regimes.”
Amazon’s Rekognition software can analyze images from all types of sources—images or videos from any police surveillance tool—including CCTV, body cameras, and drones all matched against databases.
Activist Post reported in September that Amazon is considering opening 3,000 cashierless Amazon Go stores by 2021. These stores can be assumed to use Amazon’s own Rekognition technology for in-store security, given that hundreds of retail stores – and soon thousands – are looking into using the biometric facial recognition software FaceFirst to build a database of shoplifters.
Amazon is planning to open 10 Amazon Go locations by the end of this year, according to the report, while Amazon aims to have at least 50 shops in “major metro areas” such as San Francisco and New York by next year.
Privacy advocacy groups, attorneys, and recently even Microsoft, which markets its own facial recognition system, have all raised concerns over the technology, pointing to issues of consent, racial profiling, and the potential for law enforcement to use images gathered through facial recognition cameras as evidence of criminal guilt.
“We don’t want to live in a world where government bureaucrats can enter in your name into a database and get a record of where you’ve been and what your financial, political, sexual, and medical associations and activities are,” Jay Stanley, an attorney with ACLU, told BuzzFeed News about the use of facial recognition cameras in retail stores. “And we don’t want a world in which people are being stopped and hassled by authorities because they bear resemblance to some scary character.”
This is extremely worrying: it would enable a “pre-crime” policing environment in establishment stores, giving them the power not only to keep track of customers but to store digital profiles on them without their consent.
This is conditioning for the Trump administration’s push for a Biometric Exit database at the nation’s borders.
Trump’s executive immigration order on January 27th — best known for suspending visitors to the U.S. from seven majority-Muslim countries — also included an article expediting the biometric exit program, and stated that three progress reports on the program would be made over the following year. Trump’s executive order in March built on that, specifically limiting biometric scans at the border to “in-scope travelers,” or those who aren’t U.S. or Canadian citizens.
Six major U.S. airports, including Boston, Atlanta, and New York’s Kennedy Airport, have completed trials started under the previous Obama administration, with plans to roll out the program next year so that biometric surveillance can expand to border crossings.
Customs and Border Protection began testing facial recognition systems at Dulles Airport in 2015, then expanded tests to New York’s JFK Airport last year.
In 1996, Congress authorized automated tracking of foreign citizens as they enter and exit the U.S. In 2004, DHS began biometric screening of foreign citizens upon arrival.
Now, we can add retail stores to the mix. Our rights are steadily being eroded with the help of big corporations like Amazon. But the more worrying problem is that these systems are flawed and inaccurate, as has been proven time and time again. Now, as the technology advances, Amazon claims to have fixed these problems; however, is that a good thing? Probably not. Echoing the words of Jay Stanley of the ACLU: “We don’t want to live in a world where government bureaucrats can enter in your name into a database and get a record of where you’ve been and what your financial, political, sexual, and medical associations and activities are.”
But that’s exactly where we are headed, like it or not, with zero opposition to the surveillance state agenda.
Aaron Kesel writes for Activist Post. Support us at Patreon. Follow us on Minds, Steemit, SoMee, BitChute, Facebook and Twitter. Ready for solutions? Subscribe to our premium newsletter Counter Markets.