Facebook has learned its lesson from Cambridge Analytica and the last US elections. Now it is preparing itself to handle misinformation and fake pages, and has set up a War Room in its California headquarters.

The War Room

The war room is an undistinguished space, home to a cluster of sitting and standing desks that can hold 20 to 30 people at a time. The room represents 20 different teams that manage everything from threat intelligence to moderation.

The war room is furnished with big computer screens that display everything from internal analytics and video conferences with other offices, to recent tweets surfaced via the TweetDeck app. Bringing these different people together produces a cohesive effort that pools their collective information and intelligence, allowing them to spot patterns and discern real from fake.

Every person who has access to the room is there for a specific reason, and they communicate from the room with other team members located elsewhere. According to Facebook’s head of cybersecurity policy, Nathaniel Gleicher, the war room is Facebook’s “nerve center of this much broader effort.”

The War Room is designed to collect Facebook’s policy, legal, and security teams into one cohesive group, dedicated to handling the upcoming November midterm elections.

Ever since the 2016 elections, Facebook has been placed under the microscope of Congress and federal investigators. The Russian fiasco, combined with the Cambridge Analytica leak, has made Facebook a targeted company.

The Concept

According to Facebook, it has hired thousands of moderators whose sole purpose is to work, together with the system’s AI and a team of former US intelligence officials, to review all suspicious pages.

Facebook’s director of elections and head of civic engagement, Samidh Chakrabarti, told the press that the war room is “really the culmination of two years of massive investments we’ve made both in people and technology to ensure that our platforms are safe and secure for elections. So it builds upon work that we’ve done to crack down on fake accounts, on combating the spread of fake news on our platforms.”

It seems that trolls are learning as the systems improve, and Chakrabarti added to his statement to the press that “We know that the bad actors out there who are looking to interfere in elections, they are definitely well funded. They’re committed, and they are getting increasingly sophisticated. So as one example, I think they’ve been getting better at being able to mask the location that they’re coming from.”

According to Chakrabarti, there are many instances where the new team leaders will make decisions that go against Facebook policy, and when very key issues arise, the decision will be bumped up to the company’s CEO Mark Zuckerberg and COO Sheryl Sandberg.

Chakrabarti explained that “We’ve established a chain of command all the way up to Mark and Sheryl to be able to weigh in on the most consequential things.”

According to Facebook statements, it was in August this year that the company removed a whole set of suspect Russian misinformation pages that posed as US activists. Facebook also told the press last Tuesday that it had removed over 1,700 fake pages from a Bangladesh-run operation posing as a Women’s March network.
