Medical chatbot using OpenAI’s GPT-3 told a fake patient to kill themselves
We’re used to medical chatbots giving dangerous advice, but one based on OpenAI’s GPT-3 took it much further. Researchers experimenting with GPT-3, the AI text-generation model, found that it is not ready to replace human respondents in the chat box.
·artificialintelligence-news.com·
The Shuri Network Achievements Summary 2020
How many times have you seen an all-female, black and ethnic minority (BME) panel talking about technology? For many people, their first time would have been the Shuri Network launch last July. The Shuri Network was launched in 2019 to support women of colour in NHS digital health as they develop the skills and confidence to progress into senior leadership positions, and to help NHS leadership teams more closely represent the diversity of their workforce.
·up.raindrop.io·
How Data Can Map and Make Racial Inequality More Visible (If Done Responsibly) | by The GovLab | Data Stewards Network | Medium
Racism is a systemic issue that pervades every aspect of life in the United States and around the world. In recent months, its corrosive…
·medium.com·
Algorithmic Colonisation of Africa - Abeba Birhane
Colonialism in the age of Artificial Intelligence takes the form of “state-of-the-art algorithms” and “AI-driven solutions” unsuited to African problems. It hinders the development of local products and leaves the continent dependent on Western software and infrastructure.
·theelephant.info·
Between Antidiscrimination and Data: Understanding Human Rights Discourse on Automated Discrimination in Europe
Automated decision making threatens to disproportionately impact society’s most vulnerable communities living at the intersection of economic and social marginalization. The report discusses…
·eprints.lse.ac.uk·
How to make a chatbot that isn’t racist or sexist | MIT Technology Review
Tools like GPT-3 are stunningly good, but they feed on the cesspits of the internet. How can we make them safe for the public to actually use? § Sometimes, to reckon with the effects of biased training data is to realize that the app shouldn’t be built: without human supervision, there is no way to stop the app from saying problematic things to its users, and it’s unacceptable to let it do so.
·technologyreview.com·
OPEN-ODI-2020-01_Monitoring-Equality-in-Digital-Public-Services-report-1.pdf
Many of the public and private services we use are now digital. The move to digital is likely to increase as technology becomes more embedded in our lives. But what does this mean for how essential public services understand who is using, or indeed not using, them and why? § Data about the protected characteristics of people using these services isn’t currently collected, and statistics aren’t published in a consistent or collective way. This means it is harder to find out who is excluded from using these services and why.
·up.raindrop.io·