A few weeks ago, Zoom admitted that they literally NEVER CONSIDERED how their product would be used to harm vulnerable populations, but now they see us, so that’s progress ¯\_(ツ)_/¯ https://t.co/2bRk3IFFi4— All I don't wanna do is zoom-a-zoom-zoom-zoom (@hypervisible) June 1, 2020
You want to replace books with laptops? You might be doing more harm than good unless you have these contingent issues covered in advance. #edtech #edtechknow #Broussard pic.twitter.com/3bdyNZESfG— Abeba Birhane (@Abebab) April 11, 2020
HU facial recognition software predicts criminality
A group of Harrisburg University professors and a Ph.D. student have developed automated facial recognition software that they claim can predict whether someone is likely to become a criminal. With 80 percent accuracy and with no racial bias, the software can predict if someone is a criminal based solely on a picture of their face. The software is intended to help law enforcement prevent crime. Ph.D. student and NYPD veteran Jonathan W. Korn, Prof. Nathaniel J.S. Ashby, and Prof. Roozbeh Sadeghian titled their research "A Deep Neural Network Model to Predict Criminality Using Image Pro...
Over the past few weeks, Apple & Google have floated the idea of developing and distributing a digital contact-tracing app that will inform people when they’ve been exposed to someone who’s contracted COVID-19, and communicate to people that they’ve been exposed to you if you later test positive yourself (edit: since writing this, Apple has released a beta of iOS 13 that includes the SDK necessary to begin using this system). Writing this in late April and early May, it feels like we’re desperate for information and weary from not knowing who’s caught COVID-19, who’s still vulnerable, who g...
Making the user agree to legalese they haven't yet probably isn't great #UX, @Ferrari. #DarkPatterns #design pic.twitter.com/0SrmPr6biN— Doug Collins (@DougCollinsUX) May 1, 2020
Security Isn't Enough. Silicon Valley Needs 'Abusability' Testing
Former FTC chief technologist Ashkan Soltani argues it's time for Silicon Valley companies to formalize and test not just their products' security, but their "abusability."
One of our classes has been the victim of some really intense zoombombing, and all I can think about is that this is exactly why ethical speculation around unintended consequences and bad actors is a CRITICAL part of the design process for any new technology. [Thread]— Casey Fiesler, PhD, JD, geekD (@cfiesler) April 8, 2020
The Video-conference service Zoom and its Data Security issues
Amidst the coronavirus crisis, the video communications service Zoom gained enormous popularity. The number of daily Zoom users skyrocketed from 10 million in December 2019 to 200 million in March 2020. Having outshone many of its competitors, Zoom labels itself as “the leader in modern enterprise video communications”. However, the company has been facing […]
AI can’t predict how a child’s life will turn out even with a ton of data
Hundreds of researchers attempted to predict children’s and families’ outcomes, using 15 years of data. None were able to do so with meaningful accuracy.
This is why UX writing matters! @PichinteKevin fixed the original message from @seattletimes so it wouldn’t ask its users if they “wanted coronavirus” 🙄🤦🏻♀️#UXwriting #microcopy #UX pic.twitter.com/8L9xgFv0rg— Mar (@brightspaceux) March 7, 2020