An Open Letter To Facebook From The Data For Black Lives Movement
Over the past several years, Cambridge Analytica harvested the data of 87 million active Facebook users. We may never know what would have happened had this data not landed in the hands of individuals seeking to steer the country in the direction it is headed now. What we do know is that it’s not too late to do the right thing. This data, which would not exist without the 2.1 billion people who use Facebook, can and should be used to achieve a level of progress that fulfills the true promise of technology. Mr. Zuckerberg, I believe the world is ready for Facebook to live up to its mission — as “a tool for building community and bringing the world closer together.”
In the world we live in, data is destiny. For Black people, who have been disproportionately harmed by data-driven decision-making, this is especially true.
We urge you to work with Data for Black Lives and make a commitment to the following:
1. Commit anonymized Facebook data to a Public Data Trust.
2. Work with technologists, advocates, and ethicists to establish a Data Code of Ethics.
3. Hire Black data scientists and research scientists.
I founded Data for Black Lives because very early on I realized that data becomes a tool of profound social change or a weapon of political warfare — depending on whose hands it is in. In the United States, racism has always been numerical. The decision to make every Black life count as three-fifths of a person was embedded in the electoral college, an algorithm that continues to be the basis of our current democracy. Histories of redlining, segregation, voter disenfranchisement and state-sanctioned violence have not disappeared, but have been codified and disguised through new big data regimes.
I grew up seeing hardworking families become homeless because of a three-digit FICO credit score. I saw people who would never have stolen but for a lack of economic opportunity over-sentenced for crimes that should carry no jail time, all because risk assessments have replaced judges and juries. I watched sharp and tenacious young people who saw education as a way out of poverty and gang life barred from access to college and denied any opportunity to pursue STEM degrees because of high-stakes testing.
I have also seen the power of data to build social movements, amplify the voices of those who have been disenfranchised, push forward policy and set forth new blueprints for the future. Behind the power of technology, and behind every massive data set — are people. When the Data for Black Lives inaugural conference sold out a week after I announced via email, it was clear that I had struck a powerful chord that resonated throughout the world.
Representing 23 states and five countries, people came from all over to join us at the MIT Media Lab. In the room were hundreds of advocates tirelessly leading decades-long campaigns to make access to living wages and safe and affordable housing a reality for all, activists and organizers who showed up determined to use the power of data to make sure no family has to bury their child because of gun violence, police brutality, or preventable health issues like asthma and maternal mortality. With us were mathematicians, data scientists, and engineers who could not sit by and watch as the work they loved so much also caused so much harm.
“We have collectively been too optimistic about what we build and our impact on the world,” wrote Alex Stamos — Facebook’s Chief Security Officer — on Twitter. “Believe it or not, a lot of the people at these companies, from the interns to the CEOs, agree.”
I learned to collect, analyze and use data as a high school student because early on I realized that alone we could be ignored — but there was power in a number. When students at a neighboring high school organized a peaceful protest after a school administrator put a 9th grader in a headlock, it made national news, but not the way it would today. I will never forget seeing the footage of SWAT team units flooding the school, of police shoving the small frames of students I grew up with against police cars. On CNN and local news, the headlines read, ‘Riot at Miami Edison Senior High.’ I knew that unless we found ways to be heard, to disrupt the narratives that facilitated this level of abuse, my future and the future of many other young people would be at stake.
When we were turned away from public hearings and ignored by school board members, we surveyed 600 students about their experiences and shared the findings in a comic book. Survey collection was a microphone, a way to provide data that reflected the disparate impact of suspensions and arrests happening in our schools.
Data was collective action and accountability for the horrific human and civil rights abuses that had been allowed to occur for far too long.
The first of its kind, the Public Data Trust will be a clearing house where students, community leaders, organizers, scientists and developers can access anonymized Facebook data for research in service of the public interest.
Facebook has given researchers access to developer tools for years to do research on a number of issues, but rarely does this research benefit Black people and Black communities, or other historically disenfranchised communities domestically and worldwide. Facebook researchers have used surveys to better understand housing prices on a local and national level, artificial intelligence to identify suicide attempts before they happen, and statistical methods that have given them access to more data on natural disasters than rescue groups and federal agencies could dream of. Year round, data scientists on Facebook’s Analytics Team use the data to sell products, aiding advertisers and other for-profits whose mission is not very different from Cambridge Analytica’s.
It is simply not enough for Facebook to make sure this never happens again. We see this as an opportunity for Facebook to use its data to defend the civil and human rights of everyone in this country, especially those whose lives will never be the same because of the actions of this current administration. Federal support for research on urgent public health issues like gun violence, gang prevention, maternal and infant health, and heart disease has all but disappeared. Facebook data made available to Black researchers and community-led organizations has the potential to fill the gaps in publicly available data that is outdated, full of errors, and often collected as a tactic of law enforcement, with the intent of criminalization and surveillance.
If Facebook is serious about positively impacting the world, it will not restrict access to data because of the missteps of Facebook staff and the deceit of individuals at Cambridge Analytica, but share the most valuable resource it has — its data — with the thousands of researchers, activists, organizers and community members who are doing good work with very little. What would it look like for Facebook to partner with Black women’s health organizations to use sentiment analysis to better understand how to provide support for a mother, before a baby dies? How can we use new applications of statistical methods to scale the work of activists working tirelessly to organize ceasefires and using Facebook as a way to intervene in a shooting before it happens? What new knowledge, insights, research questions and models can the vast amount of Facebook data we create daily lend to the achievement of racial justice and equity right now?
I don’t have the answer to these questions, but I know people who do. For every Aleksandr Kogan, Christopher Wylie and Robert Mercer there are countless Black scientists, researchers, students and people of all backgrounds with the technical skills, vision and empathy to change the world. They should not be punished because of the actions of others, and those who never had access to the developer tools and data resources at the disposal of researchers should not be further excluded. This Public Data Trust will create inroads for the public to receive training in data ethics and privacy, as well as empower community groups to harness the power of data to make real change in all of our lives.
We urge Facebook to work with organizers, activists, technologists and data ethicists to establish and implement a Data Code of Ethics.
For Black people, people of color, immigrants and poor people, who have been the subject of experiments without our consent for generations, what happened with Cambridge Analytica feels all too familiar.
When Aleksandr Kogan told Facebook that his personality test was a ‘very vanilla social network app,’ his words spoke volumes. Whether it is the story of Henrietta Lacks, Nazi experiments, or the Tuskegee syphilis experiment, data collection, surveillance, and experimentation on human subjects without their consent is part of a long history of science being used to enforce white supremacy. In the age of big data, unless we are conscious of this history, we risk repeating it.
Now is the time to address an issue that is long overdue: the absence of a robust, transparent, and accountable research review process within Facebook. In 2012, participants were never given the opportunity to opt out of an experiment in which their emotions were deliberately manipulated through Facebook algorithms for the purposes of researching the impact of social media on mood and emotions. Although Facebook has since developed its own internal review process, led by Public Policy research manager Molly Jackman, it is rarely if ever subject to external review. Virtually every university and research institution in the country is subject to rigorous external review. Certainly Facebook, a private company, should be no exception.
While we commend Facebook researchers for the steps they have taken, we believe that any and all research is rendered invalid without the trust, consent and collaboration of the community and the people directly impacted. In my training as a data scientist, the rigorous process of meeting the guidelines necessary to receive Institutional Review Board approval was crucial — through it I learned the violent history of human subject research and made a commitment to never repeat it.
With our network of over 2,000 scientists and activists, we are developing a Data Code of Ethics to be unveiled when we convene for the next Data for Black Lives Conference in January 2019. Ninety percent of the data that exists today was collected in the last ten years. It is time we work together to set standards for human subject research in the age of big data, and we believe it is in Facebook’s best interest to join us in this effort.
Commit to increase the number of Black people on Facebook’s research and analytics staff.
There is, however, no substitute for having Black researchers and data scientists within institutions like Facebook who understand how social media, big data and analytics impact Black people and communities.
Currently, Facebook’s diversity numbers are abysmal, with only 1% of all US technical employees identifying as Black, and far fewer as African-American. If Facebook is seriously committed to justice, it will work to increase the number of Black people on its research staff. If identifying qualified candidates is the issue, we know of many Black research scientists, data scientists and engineers who would jump at the opportunity to help shape the future of Facebook.
Many of the people who are a part of our network are employed at Google, Microsoft, Twitter, and even Facebook. We have partnered with Google as a presenting sponsor of our conference because we know that within these multinational corporations are individuals who have a vested interest, a stake in making sure that their everyday jobs are not harming but helping the majority of people who use their services. And every day, more and more of these people are choosing to boldly take a stand for racial justice and equity within their companies. They are not blindly optimistic, but they are hopeful.
The actions of Facebook and Cambridge Analytica are a shocking betrayal of public trust. And we believe that individuals within both organizations should be held accountable. But there is a larger principle at stake. In our increasingly digital and automated world, data is power. And Facebook is sitting on one of the largest data sets that has ever been amassed. With that power comes enormous responsibility. Facebook has entrusted its data to the wrong people. And the consequences have been catastrophic. We ask Facebook to entrust its data to us.
We invite Facebook to partner with the Data for Black Lives movement to develop new ethical standards and to become a role model among tech companies for how to use big data to promote justice and equity. To Facebook employees, I personally invite you to the second Data for Black Lives conference to be held in early January 2019 at the MIT Media Lab. We are a community of people from all races who have the technical skills, the vision and the empathy to unleash the power of social network data for the future of this country. This future can begin now.
Individuals and organizations who wish to sign on to our letter can do so here: d4bl.org/action
This article was originally posted on Medium. It is reposted here with the permission of the author, Yeshimabeit Milner.