Ventured

Tech, Business, and Real Estate News

Inside Facebook’s Plan To Protect The U.S. Midterm Elections

Source: Recode, Kurt Wagner
Photo: Zach Gibson/Getty

Is it enough?

Two weeks ago, on a hastily scheduled conference call with journalists, Facebook executives announced what many felt was inevitable: Someone, perhaps Russia, was once again trying to use the social network to “sow division” among U.S. voters, this time before November’s midterm elections.

The “bad actors,” as Facebook called them, created bogus events; posted about race issues, fascism, and President Donald Trump; and paid Facebook to promote their messages. “Some of the activity is consistent with what we saw from the IRA before and after the 2016 elections,” Facebook’s head of cybersecurity policy wrote in a blog post, referring to the Internet Research Agency, a Kremlin-backed online troll farm.

That activity, of course, may have altered a U.S. election, and sent Facebook and CEO Mark Zuckerberg down a path of self-reflection that has changed Facebook’s strategy, as well as its mission.

There was one big difference, though, between the disinformation campaign Facebook announced in July and the Russian campaign from 2016. This time, Facebook caught the bad guys — at least some of them — before the election.

It was a conflicting revelation. On one hand, Facebook’s safeguards to prevent another election interference campaign appeared to work. On the other, it was a sign that Facebook will once again be a target — or perhaps a weapon — for people who want to divide American voters ahead of the 2018 midterms and destabilize support for government officials.

Harvard lecturer Eric Rosenbach is bracing for the latter. As the former assistant secretary of defense for Homeland Defense and Global Security, and the former chief of staff for Secretary of Defense Ash Carter, Rosenbach knows how foreign threats like to operate.

“My greatest fear, and I hope I’m wrong, is that the Russians, or maybe it’s the Iranians — they’ve already started working on these things, they’ve already conducted penetrations of campaigns, and they’re getting set to go to the next stage of conducting an infowar at the time that will most hurt the candidates that are in key states,” he said in an interview with Recode. “A week or a couple days before the actual election day and midterms, they’ll carpet bomb the internet using Facebook and Twitter.”

Will Facebook be ready? The company says it’s moving quickly on its plan — which includes a physical war room to monitor the elections from its corporate headquarters in Menlo Park, Calif. — and has promised to double the number of safety and security employees on staff to 20,000 people. Facebook says it’s spending so much money monitoring political ads that it will actually hurt profits.

But Facebook is also running out of time to execute its plan. With the midterms less than three months away, it’s almost go time.

“When over half of Americans get their news from Facebook, it’s pretty damn important,” said Senator Mark Warner, D-Va., who has been one of the country’s most outspoken critics of Facebook’s role in elections. “We’re starting to see the enormous success of the Trump campaign in using social media. I think it’s changing the paradigm.”

Facebook’s plan

You can boil Facebook’s election plan down into three main challenges:

It wants to find and delete “fake” or “inauthentic” accounts.

It wants to find and diminish the spread of so-called fake news.

It wants to make it harder for outsiders to buy ads that promote candidates or important election issues.

Facebook’s top priority is finding and deleting “fake accounts” — either automated bots, or Pages and profiles operated by a real person pretending to be someone else — which are usually responsible for Facebook’s other major problems, like disinformation campaigns and misleading ads.

“By far, the most important thing is going after fake accounts,” COO Sheryl Sandberg told a roomful of journalists back in June. “If you look at the things that happened in the Russian IRA ads on our platform in 2016, all of it was done through fake accounts.”

Fake accounts are easy for Facebook to quantify, and make for nice headlines. Facebook took down almost 1.3 billion fake accounts in the last six months alone, and has routinely highlighted the number of fake accounts it takes down ahead of foreign elections. (For context, Facebook has about 2.2 billion monthly active users, and the company has estimated in the past that 3-4 percent of those are “false accounts.”) Before France’s presidential election in early 2017, Facebook deleted 30,000 fake accounts. It took down “tens of thousands” of accounts before Germany’s national elections last fall.

But finding the kinds of sophisticated networks trying to influence elections is much tougher. Facebook execs say they’re getting better at spotting them, in part because the company knows the type of behavior those accounts exhibit. The Russian IRA accounts that used Facebook to try to influence the 2016 presidential election also provided a lead to other networks.

“Those kinds of investigations actually launch a whole bunch of other investigations,” said Samidh Chakrabarti, who leads product for all of Facebook’s election-related efforts. “What are all of the Pages that those accounts operated? And who are all the other admins of those Pages?”

“That’s what we call a ‘fan out.’ You basically start from a seed and you see who are potential co-conspirators.”

Chakrabarti won’t say how many investigations Facebook has in the works, but says several are running at the same time. He also won’t say whether Facebook knows of any other coordinated misinformation campaigns. “We’re always looking,” was all he would offer.

Facebook says these “bad actors” are getting more sophisticated now that the company knows what to look for. When Facebook announced it had found the coordinated disinformation campaign a few weeks back, it also confirmed that these “bad actors” had tried to hide their locations using VPNs and had paid for ads through third parties. Facebook wouldn’t say who was behind the campaign, only that it was working with law enforcement and Congress to try to find out.

Working with government agencies is something Chakrabarti says Facebook is also doing much more of today than it did in 2016. “There are numerous leads that come from a lot of different places,” Chakrabarti said, though he wouldn’t get into details. “We have a lot of different lines of communication open with different agencies on this.”

Sen. Warner, who sits on the Senate Intelligence Committee, says Facebook’s relationship with Congress has improved, though “grudgingly.”

“It was not until the summer of 2017, after trips out to the Valley, after sitting with Facebook leadership, that they started to come clean,” Warner said. “Have they gotten better since then? Yes. … But people just aren’t buying the ‘we do no evil’ and the self-righteousness of all the social media platforms.”

Facebook’s efforts to stop so-called fake news may be an even tougher obstacle for the company, given the challenges that come with separating black-and-white truths from personal opinions. The company’s efforts to flag false news and work with outside fact-checkers have been well documented, but deciding how to respond to bad information has still caused the company headaches. (Remember Mark Zuckerberg’s comments about Holocaust deniers? Or the company’s response to Alex Jones and Infowars?)

One effort that may be working — or at least creating hurdles for potential bad guys — is Facebook’s updates around election advertising. Following the 2016 election, in which Russian trolls bought thousands of dollars worth of ads that reached millions of people, Facebook created a dashboard where users can browse through all political ads that appear on Facebook’s services. The company also started requiring that political advertisers register with the company, a process that included responding to a physical mailer Facebook uses to verify an advertiser’s address.

While the move was intended to keep foreign actors from buying ads for U.S. candidates, it created a hiccup for legitimate campaigns as well. Brian Rose, who ran for Congress in a special election in Mississippi earlier this year, found out in late May, less than two weeks before his election, that his Page wasn’t approved to run ads. “This is a devastating blow,” he told The Verge. Rose lost in a landslide.

(Facebook, which first announced the new process in October of 2017, believes it gave people plenty of warning. The company has previously provided presidential campaigns with assistance in understanding how to use its products — the same kind of assistance Facebook says it would provide any major corporate advertiser — but it isn’t doing that for the midterms, a company spokesperson confirmed.)

https://www.recode.net/2018/8/17/17686252/facebook-2018-midterm-election-plan-russia