Facebook’s election ‘war room’ takes aim at fake info.
Menlo Park, Calif. – In an otherwise innocuous part of Facebook’s expansive Silicon Valley campus, a locked door bears a taped-on sign that reads “War Room.” Behind the door lies a nerve center the social network has set up to combat fake accounts and bogus news stories ahead of upcoming elections.
Inside the room are dozens of employees staring intently at their monitors while data streams across giant dashboards. On the walls are posters of the sort Facebook frequently uses to caution or exhort its employees. One reads, “Nothing at Facebook is somebody else’s problem.”
That motto might strike some as ironic, given that the war room was created to counter threats that almost no one at the company, least of all CEO Mark Zuckerberg, took seriously just two years ago – and which the company’s critics now believe pose a threat to democracy.
Days after President Donald Trump’s surprise victory, Zuckerberg brushed off assertions that the outcome had been influenced by fictional news stories on Facebook.
But Facebook’s blasé attitude shifted as criticism of the company mounted in Congress and elsewhere. Less than a year later, it acknowledged that Russian agents had placed thousands of ads on the platform promoting false information.
The war room is a major part of Facebook’s ongoing repairs. Its technology draws upon the artificial intelligence system Facebook has been using to help identify “inauthentic” posts and user behavior. Facebook provided a tightly controlled glimpse at its war room to media ahead of the second round of presidential elections in Brazil on Oct. 28 and the U.S. midterm elections on Nov. 6.
More than 20 teams now coordinate the efforts of more than 20,000 people devoted to blocking fake accounts, removing fictional news stories and stopping other abuses on Facebook and its other services.