New Freedom House report on data localization

Data localization requirements have been gaining traction around the world. While they are traditionally introduced as part of data privacy and cybersecurity bills, they can now also be found in bills addressing other pressing political and social issues, including disinformation and COVID-19. Although restrictions on data storage and cross-border transfers are frequently introduced under the guise of protecting user privacy, they raise significant human rights risks because they put user data under the legal purview of governments. I recently partnered with my friends at Freedom on the Net to discuss the grave human rights risks posed by data localization requirements. In the report, we also present some initial ideas for a human rights impact assessment that could be used to systematically assess these risks. You can find the report here.


What I Learned From 10 Years in the LGBTQ Movement

In life, we all encounter moments that completely transform our lives and redirect the course of our path ahead. Arriving in San Francisco for the first time ten years ago was one of those moments for me. I don’t think I’ll ever forget the first time I stepped into the Castro District. I felt both immense pride and crippling anxiety as I walked past the iconic rainbow flags and LGBTQ bars on Market Street – pride because I had finally arrived at a place that would allow me to be my authentic self for the first time in my life; anxiety because I knew very well that being yourself never comes easy and requires some sacrifices along the way.

Before I came to San Francisco, the idea of living as my true self seemed out of reach. I was born and raised in Traun, a small town in Austria of about 25,000 people. When it was time to apply for college, I played it safe and went to the University of Salzburg – while the 1.5-hour train ride seemed like a big deal at first, it was in Salzburg that I first realized that there was a world of opportunities outside of Traun. My time in Salzburg would also prepare me for a much bigger leap. During the last year of my undergraduate studies, I was presented with the opportunity to spend an exchange year abroad in yet another small town – Bowling Green, Ohio. BG was everything I imagined an American college town to be, including a main street, a Waffle House and big box supermarkets. While there was no LGBTQ community to speak of, being away from my small hometown for the first time in my life helped me take a step I’d been running from for 23 years. I finally came out. I’d never considered myself a brave person, but in that moment I found a strength I never knew I had. But this was 2008 in a small town in the Midwest, so just like in my hometown, I felt like there was no one else like me. I didn’t know where to turn to find people like me. While I struggled with my coming out, I also enjoyed significant privileges that so many young LGBTQ people don’t have – the unwavering support of family and friends, and the resources to travel to a distant place that would help me come to terms with my identity.

Only months after returning from my exchange year in Ohio, I had the opportunity to spend several months in San Francisco for research. Of course, I was aware that San Francisco was home to those who didn’t seem to fit in anywhere else, so the prospect of finding a space where I could fully be myself suddenly seemed within reach. My first days in San Francisco were a culture shock to say the least. This was the first time I ever saw people living out and proud. I couldn’t believe my eyes when I saw a same-sex couple walking hand in hand in Dolores Park. For the first time in my life, I felt truly alive.

When I first came to San Francisco, I was mostly excited about meeting queer people and visiting the bars I’d heard so much about. What I didn’t know when I first left for California was that I’d find something just as important – a sense of purpose and a community that made me realize I didn’t have to do this by myself. It was not only the feeling of belonging that I came to admire about the community, but also something else – it was here that I first met activists fighting relentlessly for equal rights. I came to understand the importance of grassroots activism and that we all have a role to play in bringing about change.

As I spent more time in San Francisco and later moved to the East Coast for graduate school, I had the privilege of witnessing significant moments in LGBTQ history. I met activists in California fighting an uphill battle against Proposition 8, witnessed the end of Don’t Ask, Don’t Tell, and celebrated in the streets of Baltimore as the Supreme Court legalized same-sex marriage. However, with each victory, it also became painfully clear that some of the greatest challenges were yet to be faced. Some of the most vulnerable members of the LGBTQ community – in particular bisexual and transgender people, and queer people of color – remain invisible. From the loss of critical protections for transgender people to recurring news of yet another Black transgender woman losing her life to violence, it seems like these victories were truly only the beginning in the fight for equality.

As we continue to fight on so many fronts, it is sometimes hard to know where to begin. One thing I’ve learned in almost a decade as a member of the LGBTQ community is that we have the power to bring about significant change if we stand together. Earlier this year, I had the privilege of joining LGBTQ leaders in D.C. to witness the House of Representatives take a historic step by passing the Equality Act, which would expand the Civil Rights Act and other non-discrimination laws to include sexual orientation and gender identity as protected characteristics, providing critical non-discrimination protections for queer people in all fifty states. While the Equality Act is unlikely to pass in the Senate anytime soon, it was a historic moment for LGBTQ rights. It was also a reminder of the battles ahead. Ten years after first arriving in San Francisco, I am no longer scared, and I am proud to fight alongside my community for a future free of hate and violence.

Addressing Hate & Bias in Search Algorithms

For the last two weeks, the city of Chemnitz in Germany has been rocked by right-wing demonstrations that have escalated into violence against migrants and other minority communities. Internet platforms have not only played a mobilizing role for Germany’s far-right movement, but have also served the right’s agenda by spreading misinformation and extremist viewpoints. This is in line with researcher and CUNY professor Jessie Daniels’ discussion of the “Algorithmic Rise of the ‘Alt-Right’,” with the Internet’s distributed design allowing supremacists to spread hate on an unprecedented scale. News coming out of Chemnitz also demonstrates the harmful, real-life consequences that biased search results can have. Research has shown that YouTube, in particular, has played a significant role in mainstreaming extremist viewpoints and conspiracy theories in search results. YouTube’s search algorithms and recommended pages have been shown to be significantly less likely to present users with mainstream news and balanced content.

The Chemnitz riots are not the only example of algorithms amplifying hate. Google has similarly come under scrutiny for promoting supremacist websites and Holocaust denial in response to the search query “did the Holocaust happen?” Researchers and journalists have also shown how search algorithms can undermine social justice by suppressing marginalized voices and feeding into societal biases. In the seminal book “Algorithms of Oppression,” Dr. Safiya Umoja Noble highlights the racist and sexist notions underlying Google’s search algorithms, with searches for women of color routinely returning sexualized images. On YouTube, algorithms have not only demonetized the videos of transgender creators, but continue to block and demote LGBTQ-related videos in search results.

Driving the search algorithms of Internet platforms are advertising-based business models that maximize revenue by presenting users with the kind of content that keeps their attention on the platform for as long as possible. These algorithms constantly evolve by learning from users’ engagement. However, since companies consider algorithms trade secrets, they continue to leave users in the dark about the logic behind the search results being presented to them.
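To make this dynamic concrete, here is a minimal, purely illustrative sketch of an engagement-driven ranking objective. Everything in it – the function name, the candidate items, the click probabilities and watch times – is hypothetical, not any platform’s actual system; it only shows how optimizing for expected time-on-platform can surface sensational content over balanced reporting.

```python
# Toy illustration (NOT any company's real algorithm): rank candidate
# items by expected engagement, i.e. the probability of a click times
# the average time a user then spends watching. Content that holds
# attention longest rises to the top, regardless of its accuracy.

def rank_by_engagement(items):
    """Sort items by expected watch time (p_click * avg_watch_seconds), highest first."""
    return sorted(
        items,
        key=lambda item: item["p_click"] * item["avg_watch_seconds"],
        reverse=True,
    )

candidates = [
    {"title": "Balanced news report",   "p_click": 0.10, "avg_watch_seconds": 120},
    {"title": "Sensational conspiracy", "p_click": 0.35, "avg_watch_seconds": 300},
]

ranked = rank_by_engagement(candidates)
# The conspiracy item scores 0.35 * 300 = 105 expected seconds,
# the balanced report only 0.10 * 120 = 12, so the former ranks first.
```

The hypothetical numbers are chosen to make the point: even a modest edge in click-through and watch time is enough for an engagement-only objective to consistently favor the more sensational item.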

Making algorithms accountable to vulnerable users

Given the secretive and non-transparent nature of these algorithms, researchers and civil society actors struggle with the question of how best to hold Internet companies accountable for their algorithmic designs. Public outcry and activist pressure have yielded some positive change. After Safiya Umoja Noble and other researchers brought attention to the racist notions underlying Google Search results, the company modified its algorithm to represent Black women in a less sexualized way. Likewise, the company demoted websites promoting Holocaust denial from top search results. However, companies continue to respond reactively rather than proactively to public concerns related to their automated decision-making.

Significant changes have to be made by tech companies to ensure that the algorithms they create do not reflect the societal hate and biases encountered by minority users. They also have a unique opportunity to challenge racism, antisemitism and other forms of prejudice through educational efforts. Journalist Yair Rosenberg argues that removal of hate speech “suppresses a symptom of hate, not the source.” Instead, he maintains that companies should work closely with groups like the Anti-Defamation League to develop counter-speech measures like disclaimers warning users that they are engaging with content promoting Holocaust denial. This could also be a helpful strategy for YouTube and other companies wanting to create educational opportunities for users searching for news on events like the Chemnitz protests.

In order to ensure that search algorithms do not promote content that undermines social justice, companies should also take a stand against hate in their terms of service. Enforcement of these terms should not rely solely on automated systems; it must also involve humans making decisions about the types of content that violate the company’s terms.

Calling for greater transparency regarding companies’ terms of service enforcement, initiatives like the Ranking Digital Rights project have prompted companies to improve their transparency reporting by informing users about the number of accounts and pieces of content taken down for violating their terms. With its recent Transparency Report, in which it publishes statistics on its responses to hate speech and other types of content prohibited by its Community Standards, Facebook has taken an important first step in this direction. Companies should also provide appeals mechanisms that allow users to submit a complaint if they think their content was taken down wrongfully.

These strategies will only have long-term impact if companies work closely with affected communities when making decisions that shape user rights. Discussing algorithmic bias against vulnerable users, Farhad Manjoo writes in the New York Times, “These people — women, minorities and others who lack economic, social and political clout — fall into the blind spots of companies run by wealthy men in California.” It is critical for anti-hate organizations and other groups representing minority users to have a seat at the table when companies make changes to their policies and technological designs.

The civil rights challenges that algorithmic designs pose for minority communities directly contradict recent claims that tech companies have a liberal bias. In her work, Safiya Umoja Noble argues that automated decision-making and algorithms will be among the defining human rights issues of this century. Creating algorithms with companies’ vulnerable users in mind will be critical in ensuring that an increasingly automated online environment protects the human rights of all users.