Digital Ethics

Data, Compute, Labour
The media and the academic world are filled with stories and analyses of how AI will impact our economy. Yet with few exceptions, this work is focused on the impacts that might occur through...
·adalovelaceinstitute.org·
Bloomberg QuickTake on Twitter
“"Algorithms are racist because we are racist." Data scientist @mathbabedotorg warns that, despite the appearance of fairness and objectivity, most algorithms reinforce the old biases we hoped they'd eliminate https://t.co/gZKvkLFNSN”
·twitter.com·
Germany to restrict Facebook's data gathering activities
Facebook has been ordered to curb its data collection practices in Germany after a landmark ruling on Thursday that the world's largest social network abused its market dominance to gather information about users without their consent.
·reuters.com·
ACLU on Twitter
“BREAKING: We're filing a complaint against Detroit police for wrongfully arresting Robert Williams, an innocent Black man — all because face recognition technology can't tell Black people apart. Officers hauled him away in front of his kids and locked him up for 30 hours. https://t.co/84XJs0XWq...
·twitter.com·
AI researchers say scientific publishers help perpetuate racist algorithms
The news: An open letter from a growing coalition of AI researchers is calling out scientific publisher Springer Nature for a conference paper it reportedly planned to include in its forthcoming book Transactions on Computational Science & Computational Intelligence. The paper, titled “A Deep Neural Network Model to Predict Criminality Using Image Processing,” presents a…
·technologyreview.com·
Robert Osazuwa Ness on Twitter
An image of @BarackObama getting upsampled into a white guy is floating around because it illustrates racial bias in #MachineLearning. Just in case you think it isn't real, it is, I got the code working locally. Here is me, and here is @AOC. pic.twitter.com/kvL3pwwWe1
— Robert Osazuwa Ness (@osazuwa) June 20, 2020
·twitter.com·
High-tech surveillance amplifies police bias and overreach
Police forces across the country now have access to surveillance technologies that were recently available only to national intelligence services. The digitization of bias and abuse of power followed.
·theconversation.com·
All I don't wanna do is zoom-a-zoom-zoom-zoom on Twitter
A few weeks ago, Zoom admitted that they literally NEVER CONSIDERED how their product would be used to harm vulnerable populations, but now they see us, so that’s progress ¯\_(ツ)_/¯ https://t.co/2bRk3IFFi4
— All I don't wanna do is zoom-a-zoom-zoom-zoom (@hypervisible) June 1, 2020
·twitter.com·
Fever-Detecting Drones Don’t Work
You’d get basically the same results if you mounted a thermal camera on a pole next to the grocery store.
·slate.com·
Abeba Birhane on Twitter
You want to replace books with laptops? You might be doing more harm than good unless you have these contingent issues covered in advance. #edtech #edtechknow #Broussard pic.twitter.com/3bdyNZESfG
— Abeba Birhane (@Abebab) April 11, 2020
·twitter.com·
HU facial recognition software predicts criminality
A group of Harrisburg University professors and a Ph.D. student have developed automated computer facial recognition software capable of predicting whether someone is likely to be a criminal. With 80 percent accuracy and with no racial bias, the software can predict if someone is a criminal based solely on a picture of their face. The software is intended to help law enforcement prevent crime. Ph.D. student and NYPD veteran Jonathan W. Korn, Prof. Nathaniel J.S. Ashby, and Prof. Roozbeh Sadeghian titled their research "A Deep Neural Network Model to Predict Criminality Using Image Processing."
·harrisburgu.edu·