Tag: racist-technology

105 links

hybridpedagogy.org > Shea Swauger
Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education
2 apr. 2020 - Cheating is not a technological problem, but a social and pedagogical problem. Technology is often blamed for creating the conditions in which cheating proliferates and is then offered as the solution to the problem it created; both claims are false.
 · algorithmic-bias · black-struggle · calls-for-papers · education · educational-surveillance · educational-technology · eugenic-gaze · eugenics · facial-recognition · feminism · gender · plagiarism · proctoring · proctorio · racist-technology · sexual-violence · solutionism · surveillance · trans-rights · turnitin

www.euractiv.com > Sarah Chander
Technology has codified structural racism – will the EU tackle racist tech?
3 sep. 2020 - The EU is preparing its ‘Action Plan’ to address structural racism in Europe. With digital high on the EU’s legislative agenda, it’s time we tackle racism perpetuated by technology, writes Sarah Chander.
 · biometrics · black-struggle · digital-services-act · eu · facial-recognition · law-enforcement · predictive-policing · racist-technology · surveillance

journals.sagepub.com > André Brock
Beyond the pale: The Blackbird web browser’s critical reception
18 apr. 2011 - The browser has become part of our communicative infrastructure, invisible to our information literacy practices until a rupture occurs. In December 2008, the Mozilla-variant ‘niche’ browser, Blackbird, was released. Blackbird’s cultural affiliation with African American users became the rupture for pundits and early adopters. It was derided as racist, unnecessary, and pejorative to the actual needs of Black internet users. This article examines the racial and technological discourses surrounding Blackbird’s release on technology and cultural blogs. Findings indicate that racial ideologies play a factor in the reception of this culturally themed ICT artifact.
 · blackbird · platforms · racist-technology

points.datasociety.net > Emanuel Moss and Jacob Metcalf
Looking for Race in Tech Company Ethics
22 sep. 2020 - This blog post expands on Metcalf and Moss’s report Ethics Owners: A New Model of Organizational Responsibility in Data-Driven Technology Companies. At the bottom, you’ll find a resource list for those interested in diving more into the intersection of race, ethics, and technology.
 · amazon · black-struggle · diversity · ethics · facial-recognition · law-enforcement · organizational-ethics · platforms · racist-technology · rekognition · silicon-valley

datasociety.net
The Hustle Economy: Race, Gender and Digital Entrepreneurship
26 jan. 2021 - Apply to participate in Data & Society’s academic workshop, The Hustle Economy: Race, Gender and Digital Entrepreneurship. This online collaborative program on May 20, 2021 will have space for both deep dives into academic works-in-progress as well as multidisciplinary discussions of alternative practitioner projects that contribute to the understanding of hustle economies and their embodiments. Data & Society’s Director of Research and Associate Professor of Anthropology at the University of Washington Sareeta Amrute, Associate Professor at the University of North Carolina at Chapel Hill School of Information and Library Science Tressie McMillan Cottom, and Assistant Professor of Media Studies at the University of Virginia Lana Swartz invite applications from project leads to workshop their academic papers, podcasts, chapters, data mappings, and so on, and from collaborators to prepare interdisciplinary feedback on the selected works-in-progress. Together, we’ll help develop this emerging field centered on the lived experience, blunders, and promises of the digital economy.
 · black-struggle · gig-economy · platforms · racist-technology

joanna-bryson.blogspot.com > Joanna Bryson
Three very different sources of bias in AI, and how to fix them
13 jul. 2017 - Since our Science paper came out it's been evident that people are surprised that machines can be biased. They assume machines are necessarily neutral and objective, which is in some sense true -- in the sense that there is no machine perspective or ethics. But to the extent an artefact is an element of our culture, it will always reflect bias.
 · accountability · algorithmic-bias · artificial-intelligence · black-struggle · machine-learning · racist-technology · tools-for-justice

www.oneworld.nl > Florentijn van Rootselaar
How the Netherlands uses AI for ethnic profiling
14 jan. 2021 - China using artificial intelligence to oppress the Uyghurs: sounds like a faraway problem? The Netherlands also tracks (and prosecutes) specific population groups with algorithms. Take Roermond, where cameras sound the alarm for cars with Eastern European licence plates.
 · algorithmic-bias · artificial-intelligence · china · dna · ethnic-profiling · facial-recognition · false-positives · netherlands · predictive-policing · racist-technology

open.spotify.com
Programmed Racism - Global Digital Cultures
24 nov. 2020 - This episode is part of the GDC Webinar series that took place in September 2020. How do digital technologies mediate racism? It is increasingly clear that digital technologies, including auto-complete functions, facial recognition, and profiling tools, are not neutral but racialized in specific ways. This webinar focuses on the different modes of programmed racism. We present historical and contemporary examples of racial bias in computational systems and learn about the potential of Civic AI. We discuss the need for a global perspective and postcolonial approaches to computation and discrimination. What research agenda is needed to address current problems and inequalities? Chair: Lonneke van der Velden, University of Amsterdam. Speakers: Sennay Ghebreab, Associate Professor of Informatics, University of Amsterdam, and Scientific Director of the Civic AI Lab for civic-centered and community-minded design, development and deployment of AI; Linnet Taylor, Associate Professor at the Tilburg Institute for Law, Technology, and Society (TILT), PI of the ERC-funded Global Data Justice Project; Payal Arora, Professor and Chair in Technology, Values, and Global Media Cultures at the Erasmus School of Philosophy, Erasmus University Rotterdam, and author of 'The Next Billion Users' with Harvard University Press.
 · algorithmic-bias · facial-recognition · not-read · racist-technology

www.youtube.com > Lisa Nakamura and Philip Howard
Understanding Digital Racism After COVID-19
12 nov. 2020 - The Oxford Internet Institute hosts Lisa Nakamura, Gwendolyn Calvert Baker Collegiate Professor in the Department of American Culture at the University of Michigan, Ann Arbor, founding Director of its Digital Studies Institute, and a writer focusing on digital media, race, and gender. 'We are living in an open-ended crisis with two faces: unexpected accelerated digital adoption and an impassioned and invigorated racial justice movement. These two vast and overlapping cultural transitions require new inquiry into the entangled and intensified dialogue between race and digital technology after COVID. My project analyzes digital racial practices on Facebook, Twitter, Zoom, and TikTok while we are in the midst of a technological and racialized cultural breaking point, both to speak from within the crisis and to leave a record for those who come after us. How to Understand Digital Racism After COVID-19 contains three parts: Methods, Objects, and Making, designed to provide humanists and critical social scientists from diverse disciplines or experience levels with pragmatic and easy to use tools and methods for accelerated critical analyses of the digital racial pandemic.'
 · black-struggle · covid-19 · not-read · racist-technology · social-media

www.dukeupress.edu > Louise Amoore
Cloud Ethics
1 may. 2020 - In Cloud Ethics Louise Amoore examines how machine learning algorithms are transforming the ethics and politics of contemporary society. Conceptualizing algorithms as ethicopolitical entities that are entangled with the data attributes of people, Amoore outlines how algorithms give incomplete accounts of themselves, learn through relationships with human practices, and exist in the world in ways that exceed their source code. In these ways, algorithms and their relations to people cannot be understood by simply examining their code, nor can ethics be encoded into algorithms. Instead, Amoore locates the ethical responsibility of algorithms in the conditions of partiality and opacity that haunt both human and algorithmic decisions. To this end, she proposes what she calls cloud ethics—an approach to holding algorithms accountable by engaging with the social and technical conditions under which they emerge and operate.
 · algorithmic-bias · data-ethics · not-read · project-hva · racist-technology

ainowinstitute.org > Kate Crawford, Meredith Whittaker and Sarah Myers West
Discriminating Systems: Gender, Race, and Power in AI
1 apr. 2019 - The diversity crisis in AI is well-documented and wide-reaching. It can be seen in unequal workplaces throughout industry and in academia, in the disparities in hiring and promotion, in the AI technologies that reflect and amplify biased stereotypes, and in the resurfacing of biological determinism in automated systems.
 · artificial-intelligence · not-read · racist-technology

www.nytimes.com > Kashmir Hill
Designed to Deceive: Do These People Look Real to You?
21 nov. 2020 - The people in this story may look familiar, like ones you’ve seen on Facebook or Twitter or Tinder. But they don’t exist. They were born from the mind of a computer, and the technology behind them is improving at a startling pace.
 · artificial-intelligence · black-struggle · deepfakes · diversity · facial-recognition · gender · generative-adversarial-networks · project-hva · racist-technology

www.newyorker.com > Andrew Marantz
Why Facebook Can’t Fix Itself
12 oct. 2020 - The platform is overrun with hate speech and disinformation. Does it actually want to solve the problem?
 · advertising · attention · brazil · censorship · content-moderation · donald-trump · facebook · freedom-of-speech · hate-speech · islamophobia · mark-zuckerberg · peter-thiel · politics · qanon · racist-technology · social-media · surveillance-capitalism · tech-worker-movement · technology · unions

www.californialawreview.org > Andrew D. Selbst and Solon Barocas
Big Data’s Disparate Impact
1 jun. 2016 - Advocates of algorithmic techniques like data mining argue that these techniques eliminate human biases from the decision-making process. But an algorithm is only as good as the data it works with. Data is frequently imperfect in ways that allow these algorithms to inherit the prejudices of prior decision makers. In other cases, data may simply […]
 · algorithmic-bias · black-struggle · data-mining · fairness · project-hva · racist-technology · recruitment

www.vn.nl > Sennay Ghebreab
Yes, facial recognition technology discriminates, but a ban is not the solution
5 oct. 2020 - Just as the death of George Floyd led to worldwide protests, the biased image-processing technology PULSE did the same in the scientific world. There were calls for a ban, but neuroinformatics researcher Sennay Ghebreab asks whether a digital iconoclasm would actually solve the problem.
 · abolition · algorithmic-bias · amazon · artificial-intelligence · black-struggle · brain-reading · facial-recognition · false-positives · ibm · pulse · racist-technology · rekognition

www.kevindorst.com > Brian Hedden
How (Not) to Test for Algorithmic Bias
Predictive and decision-making algorithms are playing an increasingly prominent role in our lives. They help determine what ads we see on social media, where police are deployed, who will be given a loan or a job, and whether someone will be released on bail or granted parole. Part of this is due to the recent rise of machine learning. But some algorithms are relatively simple and don’t involve any AI or ‘deep learning.’
 · algorithmic-bias · black-struggle · compas · fairness · justice · philosophy · polarization · project-hva · racist-technology

digitalfreedomfund.org > Sarah Chander
Our First Steps to Decolonise Digital Rights
24 sep. 2020 - In early 2020, DFF and its project partner EDRi started their joint work of initiating a decolonising process for the digital rights field in Europe. How does this fit into the current landscape of digital rights and recent developments in the movement for racial and social justice? And what have we been up to these past months?
 · black-struggle · colonialism · decolonization · digital-rights · racist-technology · surveillance-capitalism

decorrespondent.nl > Hans de Zwart
'In the Second World War we really did have something to hide'
8 may. 2014 - What lessons about privacy can we draw today from the 1943 attack on the Amsterdam civil registry? 'From a lack of freedom you gain a clearer perspective on what freedom means.'
 · algorithmic-regulation · anti-fragility · bijlmer · black-struggle · data-retention · ethnic-profiling · freedom · godwins-law · ibm · identification · lichtbildausweis · nothing-to-hide · racist-technology · resistance · second-world-war · smart-cities · social-engineering

www.bitsoffreedom.nl > Hans de Zwart
Facebook, the biggest country in the world, is built to profile (ethnically too)
23 jun. 2016 - Typhoon, a Black rapper, was pulled over while driving a nice car. Since then a debate about ethnic profiling has rightly erupted. What that debate rarely addresses is that the business models of Silicon Valley services are largely based on profiling, and that ethnic profiling is touted there as an innovative marketing instrument.
 · black-struggle · ethnic-profiling · facebook · racist-technology · social-media · universal-pictures

hackeducation.com > Audrey Watters
Robot Teachers, Racist Algorithms, and Disaster Pedagogy
3 sep. 2020 - I have volunteered to be a guest speaker in classes this Fall. It's really the least I can do to help teachers and students through another tough term. I spoke tonight in Dorothy Kim's class "Race Before Race: Premodern Critical Race Studies." Here's a bit of what I said...
 · algorithmic-bias · black-struggle · educational-surveillance · educational-technology · plagiarism · proctoring · project-iis · racist-technology · turnitin · united-kingdom

www.eff.org > Matthew Guariglia
Technology Can’t Predict Crime, It Can Only Weaponize Proximity to Policing
3 sep. 2020 - In June 2020, Santa Cruz, California became the first city in the United States to ban municipal use of predictive policing, a method of deploying law enforcement resources according to data-driven analytics that supposedly are able to predict perpetrators, victims, or locations of future crimes. Especially interesting is that Santa Cruz was one of the first cities in the country to experiment with the technology when it piloted, and then adopted, a predictive policing program in 2011. That program used historic and current crime data to break down some areas of the city into 500 foot by 500 foot blocks in order to pinpoint locations that were likely to be the scene of future crimes. However, after nine years, the city council voted unanimously to ban it over fears of how it perpetuated racial inequality.
 · black-struggle · predictive-policing · racist-technology · united-states

www.facebook.com
Simplifying Targeting Categories
11 aug. 2020 - Over the past few years, we’ve routinely reviewed and refined our targeting options to make it easier for advertisers to find and use targeting that will deliver the most value for businesses and people. Today, we’re sharing an update on our ongoing review and streamlining the options we provide by removing options that are not widely used by advertisers.
 · advertising · facebook · racist-technology · social-media

www.villamedia.nl > Mark Koster
Facebook refuses advertisement with OPZIJ cover featuring a Black woman
17 aug. 2020 - Facebook took down an advertisement featuring the cover of the feminist monthly OPZIJ because it supposedly resembled a blackface image. The cover shows Dr. Abbie Vandivere, the scientist who made headlines around the world with her discoveries during the restoration of Vermeer's Girl with a Pearl Earring for the Mauritshuis. Vandivere is Black and wears red-painted lips in the photo.
 · black-struggle · censorship · facebook · racist-technology · social-media

dailynous.com > Amanda Askell, Annette Zimmermann, C. Thi Nguyen, Carlos Montemayor, David Chalmers, GPT-3, Henry Shevlin, Justin Khoo, Regina Rini and Shannon Vallor
Philosophers On GPT-3 (updated with replies by GPT-3)
30 jul. 2020 - Nine philosophers explore the various issues and questions raised by the newly released language model, GPT-3, in this edition of Philosophers On.
 · algorithmic-art · algorithmic-bias · art · artificial-intelligence · chat-bots · consciousness · disinformation · freedom-of-speech · gpt-3 · justice · language · philosophy · plagiarism · racist-technology

science.sciencemag.org > Brian Powers, Christine Vogeli, Sendhil Mullainathan and Ziad Obermeyer
Dissecting racial bias in an algorithm used to manage the health of populations
25 oct. 2019 - The U.S. health care system uses commercial algorithms to guide health decisions. Obermeyer et al. find evidence of racial bias in one widely used algorithm, such that Black patients assigned the same level of risk by the algorithm are sicker than White patients (see the Perspective by Benjamin). The authors estimated that this racial bias reduces the number of Black patients identified for extra care by more than half. Bias occurs because the algorithm uses health costs as a proxy for health needs. Less money is spent on Black patients who have the same level of need, and the algorithm thus falsely concludes that Black patients are healthier than equally sick White patients. Reformulating the algorithm so that it no longer uses costs as a proxy for needs eliminates the racial bias in predicting who needs extra care. (A minimal synthetic sketch of this cost-as-proxy mechanism follows this entry.)
 · algorithmic-bias · healthcare · not-read · racist-technology · united-states
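The cost-as-proxy mechanism described in the abstract above lends itself to a tiny simulation. The following is a minimal, hypothetical Python sketch with synthetic data and invented numbers (not the study's dataset, model, or enrollment thresholds); it only illustrates why a risk score ranked by observed spending refers fewer Black patients for extra care than a score based on need itself, mirroring the label reformulation the authors report.

```python
# Minimal synthetic sketch (made-up numbers, not the paper's data or model) of the
# mechanism described above: ranking patients by observed *cost* under-selects a
# group on which less money is spent at equal health need, while ranking by a
# direct measure of need does not.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
group = rng.integers(0, 2, n)                      # 1 = Black, 0 = White (synthetic labels)
need = rng.gamma(shape=2.0, scale=1.0, size=n)     # latent health need, identical across groups
# Assumed disparity: at equal need, roughly 30% less is spent on the Black group.
spending_gap = np.where(group == 1, 0.7, 1.0)
cost = need * spending_gap * rng.lognormal(0.0, 0.2, n)

def selection_rate(score, frac=0.03):
    """Fraction of each group referred for extra care if the top `frac` of scores is selected."""
    selected = score >= np.quantile(score, 1 - frac)
    return selected[group == 1].mean(), selected[group == 0].mean()

for label, score in [("cost proxy", cost), ("need label", need)]:
    black_rate, white_rate = selection_rate(score)
    print(f"{label}: Black {black_rate:.2%} vs. White {white_rate:.2%} referred")
```

Under these assumptions the cost-ranked score refers Black patients at a markedly lower rate despite identical need distributions, and switching the ranking variable to need removes the gap.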

gizmodo.com > Sidney Fussell
Why Can't This Soap Dispenser Identify Dark Skin?
17 aug. 2017 - On Wednesday, a Facebook employee in Nigeria shared footage of a minor inconvenience that he says speaks to tech's larger diversity problem. In the video, a white man and a dark-skinned black man both try to get soap from a soap dispenser. The soap dispenses for the white man, but not for the darker-skinned man. After a bit of laughter, a person can be overheard chuckling, "too black!"
 · biometrics · black-struggle · racist-technology