
Digital Ethics
Germany to restrict Facebook's data gathering activities
Facebook has been ordered to curb its data collection practices in Germany after a landmark ruling on Thursday found that the world's largest social network had abused its market dominance to gather information about users without their consent.
ACLU on Twitter
“BREAKING: We're filing a complaint against Detroit police for wrongfully arresting Robert Williams, an innocent Black man — all because face recognition technology can't tell Black people apart. Officers hauled him away in front of his kids and locked him up for 30 hours. https://t.co/84XJs0XWq...
AI researchers say scientific publishers help perpetuate racist algorithms
The news: An open letter from a growing coalition of AI researchers is calling out scientific publisher Springer Nature for a conference paper it reportedly planned to include in its forthcoming book Transactions on Computational Science & Computational Intelligence. The paper, titled “A Deep Neural Network Model to Predict Criminality Using Image Processing,” presents a…
Robert Osazuwa Ness on Twitter
An image of @BarackObama getting upsampled into a white guy is floating around because it illustrates racial bias in #MachineLearning. Just in case you think it isn't real, it is, I got the code working locally. Here is me, and here is @AOC. pic.twitter.com/kvL3pwwWe1 — Robert Osazuwa Ness (@osazuwa) June 20, 2020
All I don't wanna do is zoom-a-zoom-zoom-zoom on Twitter
A few weeks ago, Zoom admitted that they literally NEVER CONSIDERED how their product would be used to harm vulnerable populations, but now they see us, so that’s progress ¯\_(ツ)_/¯ https://t.co/2bRk3IFFi4 — All I don't wanna do is zoom-a-zoom-zoom-zoom (@hypervisible) June 1, 2020
HU facial recognition software predicts criminality
A group of Harrisburg University professors and a Ph.D. student have developed automated facial recognition software that they claim can predict whether someone is likely to be a criminal. According to the announcement, the software can predict with 80 percent accuracy, and with no racial bias, whether someone is a criminal based solely on a picture of their face. The software is intended to help law enforcement prevent crime. Ph.D. student and NYPD veteran Jonathan W. Korn, Prof. Nathaniel J.S. Ashby, and Prof. Roozbeh Sadeghian titled their research "A Deep Neural Network Model to Predict Criminality Using Image Pro...