May 12 2022 GM
Glitter Meetup is the weekly town hall of the Internet Freedom community at the IFF Square on the IFF Mattermost, at 9am EDT / 1pm UTC. Do you need an invite? Learn how to get one here.
Date: Thursday, May 12th
Time: 9am EDT / 1pm UTC
Who: Fenya Fischler
Where: On IFF Mattermost Square Channel.
- Don't have an account on the IFF Mattermost? You can request one by following the directions here.
Colour of Surveillance Europe Conference
EDRi, alongside Controle Alt Delete and Bits of Freedom, will host the Colour of Surveillance Conference on the 15th and 16th September 2022, in Amsterdam, the Netherlands. This conference will draw attention to the racialised nature of surveillance in this region as well as foster connections between racial justice and digital rights groups. I will share more information about the conference, key themes and how to get involved.
Fenya Fischler supports and coordinates EDRi members, builds capacity and knowledge-sharing programmes, contributes to events and builds connections with broader coalitions. She is an experienced organiser and researcher who has worked within the fields of human rights, feminism, drug policy and migration for over a decade. Outside of EDRi she is involved in queer Jewish community building, Palestine solidarity and anti-racist organising.
Fenya Fischler (@fenya.f on Mattermost, firstname.lastname@example.org via email) is our featured guest today, and will talk about the Colour of Surveillance Conference and her work as part of EDRi! She works on Community & Membership at EDRi (edri.org), coordinating and working with EDRi's membership. EDRi is a digital rights network in Europe with over 45 organisations working on digital rights. She is also working on organising the Colour of Surveillance Europe Conference and coordinating the Digital Dignity coalition (which, in brief, tries to bring broader social and racial justice groups into digital rights).
Can you tell us a little bit about this Colour of Surveillance Conference -- what it's about, who will be there, and how people can apply and attend?
- European Digital Rights, in partnership with Bits of Freedom, Controle Alt Delete and Digital Freedom Fund, is organising a European Colour of Surveillance Conference.
- The conference will explore the intersections between racism and surveillance and is aimed at making new connections between racial justice and digital rights groups to join efforts in resisting racism and surveillance. The Colour of Surveillance conference was initially developed by the Center on Privacy & Technology at Georgetown University (US).
- We are also working with a content committee to select sessions; our committee members come from organisations such as ESWA, PICUM and the Justice, Equity and Technology Table.
- Find all the info on what we're looking for + how to apply here
- It will take place on 15-16 Sep in Amsterdam. You can apply to host a session or just attend as a participant!
To get a better idea of the work that this conference will hopefully discuss, can you tell us: what are some of the specific systemic issues facing racialised people in Europe, especially the broader trends of discrimination and over-surveillance?
- To start... Racialised people in Europe are systematically discriminated against, over-surveilled, and underprotected in all areas of public life. The increasing use of security, surveillance and digital ‘solutions’ is likely to worsen these trends, further monitoring, judging and harming racialised and marginalised groups.
- Some specific examples:
- We see more and more use of AI and other tech to increase policing, migration control and discrimination in access to public and private services
- There are mass surveillance concerns, especially as it's used and tested in areas where racialised people, queer people and migrants are known to be
- There's more and more law enforcement access to databases for immigration control or other purposes, and an expanding reach of the criminal law and risk of discrimination
- More and more content moderation is being carried out by big tech. There's a risk of over-moderation specifically of marginalised groups, e.g. shadow banning of racialised people, women and sex workers
- Attacks on online anonymity, which has particular risks for queer people and sex workers.
Can you share with us how further security and digital "solutions" (and what are some examples of this?) worsen these trends?
- Very briefly, the argument in the AI space is that using AI systems is less discriminatory. This leads to the proliferation of more tech being used in public areas, and in employment for example, perpetuating systemic discrimination.
- A lot of police forces are experimenting with it... (even when not legal)
- You can read more here or here.
To get folks more excited about the conference you'll be hosting, can you tell us how this conference, and other future initiatives, help with creating tools of resistance?
- So for the conference, we want not only to explore the current context and what is happening across Europe, but also to center practical strategies and tools of resistance that are already being practiced or are in development.
- In short, our hope is that organisers, collectives, people impacted by surveillance and racism will have a space to connect, share experiences and practical strategies and inspire and support each other to find ways to resist surveillance and racism as well as visioning alternatives to the current systems.
- We really want to bring out all the various ways people are resisting across Europe: to build connections, inspire each other and share strategies.
What strategies did you find for resisting surveillance and racism?
- Some examples:
- Racialised communities have challenged social scoring AI systems in the Netherlands (e.g. SyRI, the child welfare cases)
- App drivers have contested various forms of algorithmic oppression
Are there any parallels or contrasts between racialized communities in Europe, the US and/or other parts of the world?
- There are definitely different conceptions of race. For example, some minorities that are not as present in the US are more prevalent, or present in different ways, in Europe (e.g. Roma); minorities in general just have different histories and contexts.
- Also, as part of these different racial histories, we have to keep in mind more varied forms of colonialism.
Can you share some cases related to these differences, that is, these different forms of colonialism or conceptions of race?
- For example, Black communities in Europe have very particular histories and origin stories that are different from those of Black communities in the US
- I'm thinking also of Jewish communities that have very different experiences in Europe or in North America
- Some examples on the use of border technologies:
- Technological Testing Grounds
- Data Protection and Digital Technologies
- Surveillance is at the Heart of the EU’s Migration Control Agenda: https://www.euractiv.com/section/justice-home-affairs/opinion/surveillance-is-at-the-heart-of-the-eus-migration-control-agenda/
- All of these are also political choices that are very clearly informed by racialised ideas and not neutral
- But the use of technology is an attempt at cloaking this in "neutrality" (but we know technologies aren't neutral and exist in particular structures and contexts)
What have you seen, in the work of EDRi and more generally, about how digital rights activists can raise awareness of these issues more effectively? Any particular strategies?
- In Europe (and I'm sure in other places) there's clearly a need for more work at the intersection of digital rights and broader social, racial, economic justice (and other) struggles. We know that digital technologies have the potential to reproduce and exacerbate existing forms of oppression.
- We also need more recognition that the field itself often centers the experiences of dominant groups / with more power and perpetuates uneven power dynamics. This in turn shapes what our priorities are, how digital rights are perceived and who is centered in our work. There is some defensiveness around accepting this in some cases.
- Overall we need to better center the experiences of historically oppressed communities and also recognise that digital rights work can't take place in a vacuum. In short: approach digital rights from a place of understanding of our current structures and systems that privilege some over others.
- One example is a process working to address this, which my colleague Sarah is also involved in.
Are there things that have stuck out to you in your work or made you passionate about something in particular?
- I was working for a long time with feminist movements, and this is quite new to me. I think what especially struck me was the quite homogeneous nature of the people working in the digital rights field in Europe. Sometimes it can also be quite difficult to connect with digital rights because quite technical or inaccessible language is used.
- I'm definitely interested in looking at how digital rights intersect with other areas of social justice, racial justice, migrant rights, labor rights etc... and much more work is needed there.
- Digital rights work really touches on all of the aspects of social justice organizing and can't/shouldn't happen in a vacuum
Is it in-person or virtual? If it is in-person, has the EDRi team considered how researchers from the Global South can participate? Especially given the topic here of surveillance and algorithmic biases regarding immigration?
- It's in person, but we're looking at livestreaming the first day.
- It's open for anyone to participate, but there needs to be a connection to the European context. We have funding available to support travel and accommodation, as well as stipends where relevant.