Source: Who Writes The Rules

6 links

www.whowritestherules.online > Nakeema Stefflbauer
Dr Nakeema Stefflbauer: ‘#defundbias in online hiring and listen to the people in Europe whom AI algorithms harm’
23 aug. 2021 - The first time I applied to work at a European company, my interviewer verbally grilled me about my ethnic origin. “Is your family from Egypt? Morocco? Are you Muslim?” asked a white Belgian man looking for a project manager. He was the CEO. My CV at the time was US-style, without a photograph, but with descriptions of research I had conducted at various Middle East and North African universities. I'd listed my nationality and my BA, MA, and PhD degrees, which confirmed my Ivy League graduate status several times over. “Are either of your parents Middle Eastern?” the CEO persisted.
 · algorithmic-bias · belgium · black-struggle · eu · racist-technology · recruitment

www.whowritestherules.online > Hera Hussain
Hera Hussain: 'Decolonising digital rights'
23 aug. 2021 - For as long as I can remember, I’ve felt the duty of being that woman who sits in a meeting room in London, Geneva, New York, Berlin and Paris and talks about what digital rights mean not just for people of colour in Europe and North America, but for those across the rest of the world. Approximately 84% of the world’s poor live in South Asia and sub-Saharan Africa, and the digital divide remains steep, but that’s only part of the story. These are not passive consumers of the web; they’re active prosumers. TikTok has been downloaded over 360 million times in South East Asia, a region of 658 million people. With social platforms, anyone with a phone can become a star, make money, connect with others, build a family of choice and acceptance, fall in love, and live a life they may not be allowed otherwise.
 · digital-rights · facebook · feminism · harassment · platforms · racist-technology · tiktok · trauma · youtube

www.whowritestherules.online > Asha Allen
Asha Allen: 'The Brussels bubble: Advocating for the rights of marginalised women and girls in EU tech policy'
23 aug. 2021 - Since 2017, the issue of online violence against women and girls has increasingly crept up the EU political agenda. Thanks to the collective work of inspirational activists, whom I have the honour to work side-by-side with, the recognition of the persistent harms that racialised and marginalised women face stands as a marked win. This has not been without its challenges, particularly speaking as a young Black woman advocate in the Brussels political bubble.
 · black-struggle · diversity · equality · eu · feminism · harassment · inclusion · intersectionality · platforms · racist-technology

www.whowritestherules.online > Aina Abiodun
Aina Abiodun: ‘The cost of their enrichment is my continued oppression’
23 aug. 2021 - I was once such a passionate advocate of the web that I made it my business to preach the gospel to my then skeptical friends, that technology would deliver a democratized, equitable and creatively limitless future to us all. After all, it was for everyone. And it was free. And there were no rules.
 · advertising · beauty-ideal · black-struggle · feminism · harassment · racist-technology · social-media

www.whowritestherules.online > Raziye Buse Çetin
Raziye Buse Çetin: 'The absence of marginalised people in AI policymaking'
11 mar. 2019 - Creating welcoming and safe spaces for racialised people in policymaking is essential for addressing AI harms. Since the beginning of my career as an AI policy researcher, I’ve witnessed many important instances where people of colour were almost totally absent from AI policy conversations. I remember very well the discomfort I felt when I was stopped at the entrance of a launch event for a report on algorithmic bias. The person tasked with ushering people into the meeting room was convinced that I was not “in the right place”. Following a completely avoidable policing situation, I was in the room, but the room didn’t seem right to me. Although the topic was algorithmic bias and discrimination, I couldn’t spot one racialised person there, the people most likely to experience algorithmic harm.
 · ai-ethics · algorithmic-bias · artificial-intelligence · black-struggle · eu · racist-technology · regulation