
The fight for your face: The battles deciding the fate of facial recognition
2/12/2019
With facial recognition technology becoming increasingly pervasive in public spaces, San Francisco's decision to ban its use by government agencies is now prompting others to follow in its footsteps.

The state of surveillance
In a few short decades, CCTV surveillance cameras have become commonplace in our cities. The UK boasts up to six million cameras, including half a million in London, and the U.S. has an estimated 30 million capturing four billion hours of footage every week.

Facilitating the millions of cameras is the fast-evolving facial recognition technology that can record your face and attempt to identify you in real time. The technology works by mapping individual faces in a crowd, and automatically comparing them to a database that can include missing people, suspects, and persons of interest.
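To make that matching step concrete, here is a minimal sketch in Python using the open-source face_recognition library; the image filenames and the single-entry watchlist are hypothetical placeholders for illustration only, not details taken from any real deployment.

import face_recognition

# Encode a face already held in a database (e.g. a watchlist entry). Hypothetical file.
watchlist_image = face_recognition.load_image_file("watchlist_entry.jpg")
watchlist_encoding = face_recognition.face_encodings(watchlist_image)[0]

# Encode every face detected in a frame from a camera feed. Hypothetical file.
frame = face_recognition.load_image_file("camera_frame.jpg")
frame_encodings = face_recognition.face_encodings(frame)

# Compare each detected face against the database entry.
for encoding in frame_encodings:
    match = face_recognition.compare_faces([watchlist_encoding], encoding, tolerance=0.6)[0]
    distance = face_recognition.face_distance([watchlist_encoding], encoding)[0]
    print("possible match" if match else "no match", f"(distance {distance:.2f})")

Real systems run this kind of comparison continuously over live video and against databases of thousands of faces, which is where the error rates discussed below start to matter.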

Facial recognition technology is silently entrenching itself into our public spaces, from doorbells and convenience stores to actual surveillance vans and body cameras.

New, biased, and error-prone
The apparent eagerness of countries and companies to use facial recognition while it’s still biased and error-prone is grim. The ACLU tested Amazon’s new facial recognition technology, Rekognition, on members of U.S. Congress and found 28 of them were falsely matched with mugshots—most of whom were women and people of color. Across the pond, London’s Metropolitan police misidentified members of the public as possible criminals in 96% of scans performed.

The dangerous consequences of this bias and inaccuracy in facial recognition technology justify the scrutiny surrounding it, and its intrusive nature has pushed more people to call for limits on its use.

San Francisco’s ban sets the bar
It’s no wonder, then, that advocacy groups welcomed San Francisco’s preemptive ban on facial recognition technology in government agencies as a good first step. With the tech hub taking the lead, other cities and states are taking note and drafting legislation of their own. Oakland and Berkeley, California, and Somerville, Massachusetts, are considering bans before government agencies can further implement the technology, and California itself is weighing a statewide ban on facial recognition in police body cameras.

But the fight is facing new challenges. Despite protests from its workers, Amazon’s shareholders overwhelmingly rejected proposals to regulate its facial recognition software, Rekognition, and the company continues to market the system heavily to police departments and government agencies in the U.S.

Can the Brits do better? Maybe.
Across the Atlantic, the UK’s police forces have found themselves defending their use of facial recognition in court for the first time.

Ed Bridges, an office worker from Cardiff, is taking South Wales Police to court for using facial recognition to capture his image several times in public spaces. Bridges alleges an unlawful invasion of privacy, and his case is the first of its kind in the UK to challenge the police’s use of automated facial recognition.

This court case could not have been more timely: in the same week, the Metropolitan police in London fined a man £90 for disorderly behavior after he refused to have his face scanned during a public trial of the technology.
When it comes to online privacy and security, one tool worth considering is a VPN that can unblock websites. RitaVPN is a relatively new VPN service, but it’s already making a name for itself. Its features include split tunneling, double encryption, and kill switch protection, making it one of the best VPNs of 2019.
