Facebook Launches 'War Room' to Combat Election Manipulation

By Agence France-Presse | Updated: 18 October 2018 18:54 IST
Highlights
  • "War Room" is the nerve centre for the fight against misinformation
  • Inside, walls have clocks showing time in various regions of US, Brazil
  • It also has maps and TV screens showing CNN, Fox News, Twitter, etc.

In Facebook's "War Room," a nondescript space adorned with American and Brazilian flags, a team of 20 people monitors computer screens for signs of suspicious activity.

The freshly launched unit at Facebook's Menlo Park headquarters in California is the nerve centre for the fight against misinformation and manipulation of the largest social network by foreign actors trying to influence elections in the United States and elsewhere.

Inside, the walls have clocks showing the time in various regions of the US and Brazil, maps and TV screens showing CNN, Fox News and Twitter, and other monitors showing graphs of Facebook activity in real time.


Facebook, which has been blamed for doing too little to prevent misinformation efforts by Russia and others in the 2016 US election, now wants the world to know it is taking aggressive steps with initiatives like the war room.

"Our job is to detect ... anyone trying to manipulate the public debate," said Nathaniel Gleicher, a former White House cyber-security policy director for the National Security Council who is now heading Facebook's cyber-security policy.

"We work to find and remove these actors."

Facebook has been racing to get measures in place and began operating this nerve centre - with a hastily taped "WAR ROOM" sign on the glass door - for the first round of the presidential vote in Brazil on October 7.


It didn't take long to find false information and rumours being spread that could have influenced voters in Brazil.

"On election day, we saw a spike in voter suppression (messages) saying the election was delayed due to protests. That was not a true story," said Samidh Chakrabarti, Facebook's head of civic engagement.


Chakrabarti said Facebook was able to remove these posts in a couple of hours before they went viral.

"It could have taken days."

Humans and machines
At the unveiling of the war room for a small group of journalists including AFP this week, a man in a grey porkpie hat kept his eyes glued to his screen where a Brazilian flag was attached.


He said nothing but his mission was obvious - watching for any hints of interference with the second round of voting in Brazil on October 28.

The war room, which will ramp up activity for the November 6 midterm US elections, is the most concrete sign of Facebook's efforts to weed out misinformation.

Staffed with experts in computer science, cyber-security and law, the centre is currently operating during peak times for the US and Brazil, with plans to eventually work 24/7.

The war room adds a human dimension to the artificial intelligence tools Facebook has already deployed to detect inauthentic or manipulative activity.

"Humans can adapt quickly to new threats," Gleicher said of the latest effort.

Chakrabarti said the new centre is an important part of coordinating activity - even for a company built on remote communications among people in various parts of the world.

"There's no substitute to face to face interactions," he said.

The war room was activated just weeks ahead of the US vote, amid persistent fears of manipulation by Russia and other state entities, or efforts to polarize or inflame tensions.

The war room is part of the stepped-up security effort announced by Facebook, which is adding some 20,000 employees to work on security.

"With elections we need people to detect and remove (false information) as quickly as possible," Chakrabarti said.

The human and computerised efforts to weed out bad information complement each other, according to Chakrabarti.

"If an anomaly is detected in an automated way, then a data scientist will investigate, will see if there is really a problem," he said.

The efforts are also coordinated with Facebook's fact-checking partners around the world including media organizations such as AFP and university experts.

Gleicher said the team will remain on high alert for any effort that could lead to false information going viral and potentially impacting the result of an election.

"We need to stay ahead of bad actors," he said. "We keep shrinking the doorway. They keep trying to get in."

 
