Found 45 bookmarks
Silkie Carlo on Twitter
🚨NEWS: after the Met spent massive resources in central London on Saturday using live facial recognition cameras all day, scanning *36,420* people’s faces, guess what the outcome was? 0 correct matches, 1 wrong match, requiring a member of the public to prove their innocence 🥴 https://t.co/gKDtT0Jjp5 — Silkie Carlo (@silkiecarlo) July 18, 2022
tgyateng69 · twitter.com
Police want travel card data to track suspicious rail passengers | The Times
Police should be able to monitor passengers who spend hours on the railway network in case they are pickpockets or sex offenders — or are in need of help — a chief constable has said. Lucy D’Orsi, the head of the British Transport Police,...
tgyateng69 · thetimes.pressreader.com
Monish Bhatia on Twitter
(1/7) Home Office is now introducing facial recognition smart watches to monitor foreign nationals who have completed their sentences and released back into the community. Lucy Audibert (@privacyint) and I clearly explain as to why this is problematic and must be challenged. pic.twitter.com/1h5KrwLa38 — Monish Bhatia (@DrMonishBhatia) August 5, 2022
tgyateng69 · twitter.com
Clearview AI's plan for invasive facial recognition is worse than you think
Clearview AI's latest patent application reveals the firm's ongoing plans to use surveillance against vulnerable individuals. According to BuzzFeed News, a patent was filed in August which describes in detail how the applications of facial recognition can range from governmental to social — like dating and professional networking. Clearview AI's patent claims that people will be able to identify individuals who are unhoused and are drug users by simply accessing the company's face-matching system.
inputmag.com
Training data that is meant to make predictive policing less biased is still racist
Arrest data biases predictive tools because police are known to arrest more people in Black and other minority neighborhoods, which leads algorithms to direct more policing to those areas, which leads to more arrests. The result is that predictive tools misallocate police patrols: some neighborhoods are unfairly designated crime hot spots while others are underpoliced.
technologyreview.com
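The excerpt above describes a feedback loop: skewed arrest data pulls patrols toward certain neighborhoods, the extra patrols record more arrests there, and those arrests feed back into the training data. A minimal Python sketch of that loop is below. It is not code from the article or from any predictive policing product; the neighborhoods, numbers, and allocation rule are all hypothetical, chosen only to show that equal true crime rates do not correct an initial bias once patrols follow recorded arrests.

```python
# Toy simulation (illustrative only, not taken from the article): two
# neighborhoods, "A" and "B", have the same underlying crime rate, but the
# historical arrest record is skewed towards A. A simple "predictive"
# allocator sends patrols in proportion to recorded arrests, and arrests can
# only be recorded where patrols are present, so the skewed allocation keeps
# reproducing itself even though the true crime rates never differ.
import random

random.seed(42)

TRUE_CRIME_RATE = 0.10                    # identical in both neighborhoods
recorded_arrests = {"A": 600, "B": 400}   # hypothetical biased arrest history
TOTAL_PATROL_HOURS = 10_000

for year in range(1, 6):
    total = sum(recorded_arrests.values())
    for area in ("A", "B"):
        # Allocation is driven by recorded arrests, not by true crime.
        patrol_hours = round(TOTAL_PATROL_HOURS * recorded_arrests[area] / total)
        # Only incidents that a patrol happens to observe become new arrests.
        new = sum(random.random() < TRUE_CRIME_RATE for _ in range(patrol_hours))
        recorded_arrests[area] += new
    share_a = recorded_arrests["A"] / sum(recorded_arrests.values())
    print(f"Year {year}: A's share of recorded arrests (and of next year's "
          f"patrols) stays near {share_a:.0%}")
```

Running this toy model, neighborhood A keeps roughly its initial 60% share of patrols year after year, which is the misallocation the article describes: one area is treated as a hot spot while the other is underpoliced, with no difference in actual crime.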
How Data Can Map and Make Racial Inequality More Visible (If Done Responsibly) | by The GovLab | Data Stewards Network | Medium
Racism is a systemic issue that pervades every aspect of life in the United States and around the world. In recent months, its corrosive…
medium.com