How Did Facebook Get Away With Not Disclosing Political Ads When Other Media Have To?

Facebook CEO Zuckerberg said this month that Facebook will require political ads to be traceable to the Facebook page that paid for them.

Days after Facebook CEO Mark Zuckerberg dismissed as “crazy” the idea that fake news on Facebook had influenced the election, former President Barack Obama warned him to take the threat of political disinformation seriously, The Washington Post reported.

Obama warned that Facebook and the government must do more to address the threat, or it will be worse in the next presidential race.

The private exchange between Obama and Zuckerberg took place in a hotel room in Lima, Peru during a meeting of world leaders, according to The Verge. Zuckerberg acknowledged that there are problems with fake news, but told Obama they weren’t widespread and there was no easy solution, according to unnamed sources.

The sources said that Obama hoped the conversation would be a wake-up call for the CEO. Facebook detected elements of a Russian information operation in June 2016 and reported it to the FBI, but both struggled to work together to fight the issue, according to the report.

Since then, Facebook has blocked ads that spread fake news. It has also started showing users how to spot fake news. In the interest of transparency, Zuckerberg this month said Facebook will require political ads to be traceable to the Facebook page that paid for them.

Mainstream media such as TV, radio and newspapers are required to say who paid for political advertising.

So how did Facebook get away with not having to do that?

Government is complicit, and government can help solve the problem of political disinformation, said Zeynep Tufekci, an associate professor at the University of North Carolina–Chapel Hill School of Information and Library Science and author of “Twitter and Tear Gas: The Power and Fragility of Networked Protest.”

“(Facebook) got away with not disclosing political ads because the government was like ‘Oh, OK. Let’s just let you get away with this,’ even though if you do the same thing on a radio station, you’re forced to disclose. You do that on a TV station, you’re forced to disclose. You do this on the most important information conduit in the 21st century, you just get away with it,” Tufekci said in a Slate interview.

In 2011, Facebook asked the Federal Election Commission to exempt it from rules requiring political advertisers to disclose who’s paying for an ad, Wired reported. Political ads on TV and radio must include such disclosures. Facebook argued that its ads should be regulated as “small items” like bumper stickers, which don’t require disclosures. The FEC deadlocked on the issue and did nothing about it for six years.

Now, it’s blowing up again.

Facebook basically served as an in-house ad agency for the Trump campaign, Tufekci said:

“That’s Facebook’s business model… A lot of these things are a natural consequence of the way they want their business, how they make their money, how they automate a lot, and don’t hire too many humans. Humans are expensive, algorithms are cheap. Humans don’t scale, and algorithms scale. Cultural problems don’t scale, and having one set of rules for the whole world scales.

But then you have these consequences. You have a place that’s optimized to make you pliable to ads. Right? It’s a place optimized to sell you certain kinds of messages. Any kind of messages. And by not distinguishing between what those messages are, and also by making considerations that determine how to make us pliable to those ads, Facebook is now also controlling our political information flow, our personal interactions. One set of rules that’s supposed to be good for some things, but it’s used across the board in a very powerful way for politics, for interaction, social interaction. Everybody’s kind of in that space, and I don’t think it’s been good.

You cannot be that powerful … and have everybody try to use you to influence everyone else. But as soon as somebody does use you for exactly what you’re designed for according to exactly how your business model operates, just throw up your hands and say, ‘Oh, we’re powerless. It’s just the people.’”

Facebook plans to look into “foreign actors, including additional Russian groups and other former Soviet states,” Zuckerberg said, but also “organizations like the campaigns” to further its “understanding of how they used our tools,” Business Insider reported.

The Russian Facebook operation was amateurish, Tufekci said in the Slate interview:

It was not some super sophisticated thing. They just used Facebook the way it’s designed. It’s not like some deep understanding of U.S. politics and some very sophisticated spy thing … For me, the bigger point is what this exposes. … Look how pliable Facebook was to become this breeding ground of misinformation, how much its algorithms and business model helped it along, and how big an audience there also was.