
Lawyer Jamal Greene, Talib Kweli’s Brother, Joins Facebook’s New Elite Censorship Police Force

Lawyer Jamal Greene, brother of hip-hop artist Talib Kweli, has joined Facebook’s new elite censorship police force — the Oversight Board. (Photo: Twitter)

Global tech giant Facebook needs someone other than Mark Zuckerberg to keep it in check, so the platform has finally named the first 20 members of its independent, external Facebook Oversight Board.

The brother of hip-hop veteran Talib Kweli, lawyer Jamal Greene, has joined the board of Facebook’s new elite censorship cops. 

The board, dubbed by some as Facebook’s “Supreme Court,” will hear cases and decide whether individual pieces of content can remain on the site. It will begin hearing cases this summer and can also recommend changes to Facebook’s content policy.

Greene is a Columbia Law School professor whose scholarship focuses on constitutional rights adjudication and the structure of legal and constitutional arguments. 

The board includes four co-chairs who helped select the others. Greene is one of the four, along with Colombian attorney Catalina Botero-Marino, who has expertise in freedom of expression and human rights. The other co-chairs are Michael McConnell, a constitutional law professor at Stanford Law School and former U.S. federal circuit judge appointed by Republican President George W. Bush, and Helle Thorning-Schmidt, a Social Democrat who was the first woman prime minister of Denmark (2011-2015).

Other board members include human rights advocate Afia Asantewaa Asare-Kyei; University of Oklahoma College of Law professor Evelyn Aswad; Stanford law professor Pamela Karlan; human rights activist Tawakkol Karman, the first Arab woman to win a Nobel Peace Prize (2011); and Kenyan lawyer and human rights activist Maina Kiai.

“What we’ve looked for is people who have substantial expertise and independence of mind,” Greene told Wired.

The process of selecting a board has taken about a year.

In 2019, Facebook set up an independent trust, funded by a $130-million grant, to manage the board, hire the staff, and pay members, Wired reported. The board is composed of an even number of men and women from 27 countries. Collectively, they speak 29 languages. The board will eventually increase to about 40 members.

Initially, the board will review posts, videos, photos, and comments that the company has decided to remove from Facebook or its photo-sharing site Instagram. It will decide whether the content can be posted or remain off the site. The content could range from issues such as nudity and violence to hate speech and harassment. The board will select the cases it will review.

“We will not be able to offer a ruling on every one of the many thousands of cases that we expect to be shared with us each year,” Greene and the other co-chairs wrote in an opinion piece for The New York Times. “We will focus on identifying cases that have a real-world impact, are important for public discourse, and raise questions about current Facebook policies. Cases that examine the line between satire and hate speech, the spread of graphic content after tragic events, and whether manipulated content posted by public figures should be treated differently from other content are just some of those that may come before the board.”

Here’s how it will work. Each case will be reviewed by a panel of five members, with at least one member from the geographic region where the case originated. The panel can call on subject-matter experts to help make its decision, which must then be finalized by the whole board, The First Post reported.

Decisions by the board will be binding, unless implementing one would violate the law, and cannot be overturned, not even by Facebook co-founder Zuckerberg. Decisions must be made and implemented within 90 days of the review.

The board could have its hands full. In 2019, users appealed more than 10 million pieces of content that Facebook removed or took action on, The First Post reported.

According to Facebook, the board will be making decisions on ads, groups, pages, profiles, and events.

However, it will not deal with Instagram direct messages, Facebook’s messaging platforms WhatsApp and Messenger, its dating service, or its Oculus virtual reality products.

“This is a historic moment,” Kate Klonick, a law professor who has been closely following the creation of the board, told Wired. “This is the first time a private transnational company has voluntarily assigned a part of its policies to an external body like this.”