“BREAKING: We're filing a complaint against Detroit police for wrongfully arresting Robert Williams, an innocent Black man — all because face recognition technology can't tell Black people apart. Officers hauled him away in front of his kids and locked him up for 30 hours. https://t.co/84XJs0XWq...
'The Computer Got It Wrong': How Facial Recognition Led To False Arrest Of Black Man
Robert Williams says his driver's license photo was incorrectly matched with a wanted suspect. He says he was arrested and detained. Though the case was dropped, Williams says its effect is lasting.
AI researchers say scientific publishers help perpetuate racist algorithms
The news: An open letter from a growing coalition of AI researchers is calling out scientific publisher Springer Nature for a conference paper it reportedly planned to include in its forthcoming book Transactions on Computational Science & Computational Intelligence. The paper, titled “A Deep Neural Network Model to Predict Criminality Using Image Processing,” presents a…
An image of @BarackObama getting upsampled into a white guy is floating around because it illustrates racial bias in #MachineLearning. Just in case you think it isn't real, it is, I got the code working locally. Here is me, and here is @AOC. pic.twitter.com/kvL3pwwWe1— Robert Osazuwa Ness (@osazuwa) June 20, 2020
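Context on why the upsampler does this: the model behind that image, PULSE, searches a pretrained face GAN's latent space for any high-resolution face that downscales back to the low-res input, so every pixel of invented detail comes from the GAN's training prior. Below is a minimal sketch of that optimization loop; the tiny random-weight "generator" is a hypothetical stand-in for the pretrained StyleGAN the real method uses, and exists only to show the mechanics.

```python
# A minimal sketch of PULSE-style upsampling: optimize a latent vector so
# the generator's output, once downscaled, matches the low-res photo. The
# random-weight "generator" here is a stand-in for the pretrained face GAN
# (StyleGAN) the real method uses.
import torch
import torch.nn as nn
import torch.nn.functional as F

generator = nn.Sequential(           # stand-in for a pretrained face GAN
    nn.Linear(64, 3 * 64 * 64),
    nn.Tanh(),                       # image values in [-1, 1]
)

low_res = torch.rand(1, 3, 16, 16) * 2 - 1   # the blurry input photo
z = torch.randn(1, 64, requires_grad=True)   # latent vector to optimize
opt = torch.optim.Adam([z], lr=0.05)

for step in range(200):
    opt.zero_grad()
    high_res = generator(z).view(1, 3, 64, 64)
    # The only constraint: the candidate must downscale back to the input.
    loss = F.mse_loss(
        F.interpolate(high_res, size=(16, 16), mode="bilinear",
                      align_corners=False),
        low_res,
    )
    loss.backward()
    opt.step()

# Every detail beyond 16x16 is supplied by the generator's prior, so a
# prior trained mostly on white faces hallucinates white features.
```

Because "downscales to the input" is the only constraint, the bias in the tweet isn't a glitch: it's the training distribution of the prior showing through.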
High-tech surveillance amplifies police bias and overreach
Police forces across the country now have access to surveillance technologies that were recently available only to national intelligence services. The digitization of bias and abuse of power followed.
A few weeks ago, Zoom admitted that they literally NEVER CONSIDERED how their product would be used to harm vulnerable populations, but now they see us, so that’s progress ¯\_(ツ)_/¯ https://t.co/2bRk3IFFi4— All I don't wanna do is zoom-a-zoom-zoom-zoom (@hypervisible) June 1, 2020
You want to replace books with laptops? You might be doing more harm than good unless you have these contingent issues covered in advance. #edtech #edtechknow #Broussard pic.twitter.com/3bdyNZESfG— Abeba Birhane (@Abebab) April 11, 2020
HU facial recognition software predicts criminality
A group of Harrisburg University professors and a Ph.D. student have developed automated facial recognition software capable of predicting whether someone is likely to be a criminal. With 80 percent accuracy and with no racial bias, the software can predict whether someone is a criminal based solely on a picture of their face. The software is intended to help law enforcement prevent crime. Ph.D. student and NYPD veteran Jonathan W. Korn, Prof. Nathaniel J.S. Ashby, and Prof. Roozbeh Sadeghian titled their research "A Deep Neural Network Model to Predict Criminality Using Image Pro...
Over the past few weeks, Apple & Google have floated the idea of developing and distributing a digital contact-tracing app that will notify you when you’ve been exposed to someone who’s contracted COVID-19, and notify others of their exposure if you later test positive yourself (edit: since writing this, Apple has released a beta of iOS 13 that includes the SDK necessary to begin using this system). Writing this in late April and early May, it feels like we’re desperate for information and weary from not knowing who’s caught COVID-19, who’s still vulnerable, who g...
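For context on how such a system can avoid a central log of who met whom: the published Apple/Google Exposure Notification cryptography spec has each phone broadcast short-lived random identifiers derived from a daily key, and only the daily keys of users who test positive are ever published. A minimal Python sketch of that key schedule follows; it is simplified (the real SDK also rotates keys daily, encrypts Bluetooth metadata, and weighs signal attenuation when scoring an exposure).

```python
# A minimal sketch of the rolling-identifier scheme from the published
# Apple/Google Exposure Notification crypto spec. Simplified: key rotation,
# metadata encryption, and exposure scoring are omitted.
import os
import struct
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

tek = os.urandom(16)  # Temporary Exposure Key, regenerated each day

# Derive the Rolling Proximity Identifier Key from the day's TEK.
rpik = HKDF(algorithm=hashes.SHA256(), length=16, salt=None,
            info=b"EN-RPIK").derive(tek)

def rolling_proximity_identifier(rpik: bytes, interval: int) -> bytes:
    """The anonymous ID a phone broadcasts; it changes every ~10 minutes."""
    padded = b"EN-RPI" + bytes(6) + struct.pack("<I", interval)  # 16 bytes
    aes = Cipher(algorithms.AES(rpik), modes.ECB()).encryptor()
    return aes.update(padded) + aes.finalize()

# Phones broadcast these identifiers and log the ones they hear nearby.
# If a user tests positive, only their daily TEKs are published; everyone
# else re-derives the identifiers locally and checks for matches, so no
# central server learns who met whom.
interval = int(time.time() // 600)  # ENIntervalNumber
print(rolling_proximity_identifier(rpik, interval).hex())
```

The privacy argument rests on that last step happening on-device: the server only ever sees anonymous daily keys, never a contact graph.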
Making the user agree to legalese they haven't yet probably isn't great #UX, @Ferrari. #DarkPatterns #design pic.twitter.com/0SrmPr6biN— Doug Collins (@DougCollinsUX) May 1, 2020
Security Isn't Enough. Silicon Valley Needs 'Abusability' Testing
Former FTC chief technologist Ashkan Soltani argues it's time for Silicon Valley companies to formalize and test not just their products' security, but their "abusability."