The problems keep piling up for Facebook, and it’s unclear how long the internet giant will be able to brush them aside as it barrels toward acquiring its next billion users.
The world’s biggest social network has unwittingly allowed groups backed by the Russian government to target users with ads. That’s after it took months to acknowledge its outsized role in influencing the U.S. election by allowing the spread of fake news — though before news emerged that it let advertisers target messages to “Jew-haters.”
Now Facebook is under siege, facing questions from lawmakers and others seeking to rein in its enormous power. The company has turned over information on the Russia-backed ads to federal authorities investigating Russian interference in the U.S. presidential election. Critics say the company also needs to tell its users how they might have been influenced by outside meddlers.
Speculation is rife that Facebook executives, perhaps including CEO Mark Zuckerberg, could be called to testify before Congress. Hearings might lead to new regulations on the company.
“Facebook appears to have been used as an accomplice in a foreign government’s effort to undermine democratic self-governance in the United States,” writes Trevor Potter, former chairman of the Federal Election Commission and now head of a nonpartisan election-law group, in a letter to Zuckerberg.
“ERA OF ACCOUNTABILITY”
Potter’s group, the Campaign Legal Center, wants Facebook to make the Russian-sponsored ads public. The company has so far declined to do so, citing the ongoing investigations. It has provided the ads and other information to Robert Mueller, the special counsel in charge of the Russia investigation, Facebook said in a statement, although it declined to elaborate.
The company that nudges its users to reveal intimate details about their lives, it turns out, isn’t all that comfortable doing the same. That’s true for everything from the secret algorithms that recommend “people you might know” to data on its attempts to clamp down on the spread of false news shared across its network.
The company has justified its secrecy in many ways, variously citing legal restrictions, business secrets, and security and privacy protections. But Jonathan Albright, whose late 2016 research on the “fake news” propaganda ecosystem outlined how propaganda websites track and target users, thinks the current moment may be a turning point for online giants like Facebook.
“Now that it has run directly into something that possibly affected the outcome of the election — but they can’t determine how — this may be their era of accountability,” said Albright, the director of research at the Tow Center for Digital Journalism at Columbia University.
No other company on the planet, Albright added, can provide access to as many real people as Facebook.
POWER GAMES AND NEW RULES
Facebook prefers to think of itself as an online platform, but in many respects it’s also a modern sort of media company, if for no other reason than that so many people rely on it as a source of news and information. In its early years, Facebook even described itself as a “social utility.”
Now the question is whether it should be regulated as one — and if so, how. There aren’t many straightforward answers, even where political ads already subject to government rules are concerned.
It’s already illegal for foreign nationals to spend money in connection with a U.S. federal election, whether on Facebook or off. And campaign law requires people who spend money on another person’s website to disclose that fact in the ad itself.
Broadcast-era election law, however, can be a poor fit for the Internet Age. Attempts to sway political sentiment on Facebook can be targeted to small groups who share a common background or attitudes, making them difficult to track from the outside. And many such efforts might not resemble traditional advertisements at all. The goal of many Facebook marketing campaigns is to generate posts that regular people will spread widely for free; political persuasion campaigns can work the same way.
“As a practical matter, it is extremely difficult for the U.S. government to regulate content on the internet that may have an effect on the U.S. election,” said Nathaniel Persily, a professor at Stanford Law School. “If a teenager in his mother’s basement in Moscow wants to put up a YouTube video, it’s not clear what the U.S. will be able to do about that.”
Difficult doesn’t mean impossible. Persily, for instance, thinks that Facebook could use its AI technology to flag election-related ads that don’t bear the disclosures required by existing law.
Companies like Facebook could also be required to do some kind of due diligence on who is spending money on their platforms on behalf of candidates, he added. Keeping an online repository of all candidate-related ads run within six months of an election, each identified by its backers, could provide an additional check on illegal attempts to sway elections.