Facebook whistleblower Frances Haugen comes forward and alleges company lied to investors

The former Facebook employee responsible for leaking thousands of pages of internal company research has been identified as Frances Haugen.

Haugen, a 37-year-old data scientist from Iowa who was widely known as the ‘Facebook whistleblower,’ spoke out publicly for the first time since she anonymously filed at least eight complaints with federal law enforcement.

In a 60 Minutes interview that aired Sunday night, Haugen accused the social media network of prioritizing its own interests over the public good and allowing the spread of misinformation.

‘There were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money,’ she said in the interview.

She also claimed that Facebook played a role in the January 6 insurrection at the U.S. Capitol by removing measures designed to prevent the spread of misinformation that had been implemented ahead of the 2020 election.

‘As soon as the election was over they turned them back off, or they changed the settings back to what they were before to prioritize growth over safety. And that really feels like a betrayal of democracy to me,’ Haugen stated.  

After realizing she could no longer trust the company to protect the public, Haugen secretly copied tens of thousands of pages of internal Facebook research, which she claims is evidence that ‘the company is lying to the public about making significant progress against hate, violence and misinformation.’

‘We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world,’ the complaint reads.

Haugen claimed that Facebook’s ‘evidence of harm’ extended to its Instagram app, citing a study in which teenage girls said the platform worsened thoughts of suicide and eating disorders.

‘What’s super tragic is Facebook’s own research says, as these young women begin to consume this — this eating disorder content, they get more and more depressed. And it actually makes them use the app more,’ Haugen explained.

‘And so, they end up in this feedback cycle where they hate their bodies more and more. Facebook’s own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers, it’s that it is distinctly worse than other forms of social media.’

Haugen also alleged that the way Facebook has written its algorithm is changing the way countries are led.

‘You are forcing us to take positions that we don’t like, that we know are bad for society. We know if we don’t take those positions, we won’t win in the marketplace of social media,’ she said. 

Haugen’s lawyers filed at least eight complaints with the Securities and Exchange Commission outlining her findings and comparing them with the company’s public statements.

The SEC did not confirm to 60 Minutes whether it plans to take action against Facebook. DailyMail.com has also reached out to the agency for comment.

Facebook, however, did release a statement in response to the allegations: ‘Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place.

‘We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.’

Facebook head of global affairs Nick Clegg, appearing on CNN Sunday morning, also called the allegations that the social media giant is responsible for the Capitol riot ‘ludicrous.’ 

‘The responsibility for the violence on January the 6th and the insurrection on that day lies squarely with the people who inflicted the violence and those who encouraged them, including then-President Trump and candidly many other people in the media who were encouraging the assertion that the election was stolen,’ he said. 

Meanwhile, a congressional panel will hear Haugen’s testimony on Tuesday.  

Sen. Richard Blumenthal (D-Conn.), a member of the panel, told the Washington Post that the SEC should take Haugen’s allegations that Facebook may have misled investors ‘very seriously.’

‘Facebook certainly misled and deceived the public, and so their investors may well have been deceived as well,’ Blumenthal said. 

Lawmakers will also investigate whether Facebook’s products are harmful to children and whether the social media company undermined its safety efforts by disbanding its civic integrity team, as Haugen has alleged.

The social media giant confirmed that Antigone Davis, its global head of safety, would also testify before the Senate Commerce Committee’s consumer protection panel.

Haugen’s allegations have caused a headache for Facebook in recent weeks.

Some of the secrets contained in the trove of tens of thousands of pages of internal company documents she copied were previously leaked to the Wall Street Journal for a series of reports dubbed the ‘Facebook Files’, including damning revelations that the company knew its Instagram platform was toxic to young girls’ body image.

With more damaging allegations headed for the company Sunday, Clegg warned employees: ‘We will continue to face scrutiny.’ 

The email, from the company’s Vice President of Policy and Global Affairs Nick Clegg, attempted to prepare staff for the allegations and launched into a lengthy defense of the company.

According to Clegg’s email, the whistleblower will accuse her former employer of relaxing its emergency ‘break glass’ measures put in place in the lead-up to the election ‘too soon.’

Haugen claimed this played a role in enabling rioters in their quest to storm the Capitol on January 6 in a riot that left five dead.

She claims the relaxation of safeguards, including limits on live video, allowed prospective rioters to gather on the platform and use it to plot the insurrection.

Clegg pushed back at this suggestion, insisting that the so-called ‘break glass’ safeguards were only rolled back when data showed it was safe to do so.

Some such measures were kept in place until February, he wrote, and some are now permanent features.  

‘We only rolled back these emergency measures – based on careful data-driven analysis – when we saw a return to more normal conditions,’ Clegg wrote. 

‘We left some of them on for a longer period of time through February this year and others, like not recommending civic, political or new Groups, we have decided to retain permanently.’

Clegg listed several safeguards which have been put in place in recent years and reeled off a list of success stories of handling misinformation around the election and shutting down groups focused on overturning the results. 

‘In 2020 alone, we removed more than 5 billion fake accounts — identifying almost all of them before anyone flagged them to us,’ he wrote.

‘And, from March to Election Day, we removed more than 265,000 pieces of Facebook and Instagram content in the US for violating our voter interference policies.’

Clegg admitted such policies were not ideal, and that many people and posts were impacted by the heavy-handed approach.

But, he said, an ‘extreme step’ was necessary because ‘these weren’t normal circumstances.’ 

‘It’s like shutting down an entire town’s roads and highways in response to a temporary threat that may be lurking somewhere in a particular neighborhood,’ he said.

‘We wouldn’t take this kind of crude, catch-all measure in normal circumstances, but these weren’t normal circumstances.’

He wrote that the company had removed millions of pages and groups belonging to hate groups and dangerous organizations such as the Proud Boys and QAnon conspiracy theorists, as well as content pushing #StopTheSteal election fraud claims.

The email also pushed back at an accusation that Facebook benefits from the divisiveness created on its platform.  

‘We do not profit from polarization, in fact, just the opposite,’ he wrote.

‘We do not allow dangerous organizations, including militarized social movements or violence-inducing conspiracy networks, to organize on our platforms.’

The VP called any suggestion the blame for the Capitol riot lies with Big Tech ‘so misleading’ and said the blame should be on the rioters themselves and the people who incited them. 

‘The suggestion that is sometimes made that the violent insurrection on January 6 would not have occurred if it was not for social media is so misleading,’ he wrote. 

‘To be clear, the responsibility for those events rests squarely with the perpetrators of the violence, and those in politics and elsewhere who actively encouraged them.’

The lengthy email to staff ended by urging the workforce to ‘hold our heads up high’ and ‘be proud’ of their work.  

FACEBOOK’S EMAIL TO STAFF IN FULL:

OUR POSITION ON POLARIZATION AND ELECTIONS

You will have seen the series of articles about us published in the Wall Street Journal in recent days, and the public interest it has provoked. This Sunday night, the ex-employee who leaked internal company material to the Journal will appear in a segment on 60 Minutes on CBS. We understand the piece is likely to assert that we contribute to polarization in the United States, and suggest that the extraordinary steps we took for the 2020 elections were relaxed too soon and contributed to the horrific events of January 6th in the Capitol.

I know some of you – especially those of you in the US – are going to get questions from friends and family about these things so I wanted to take a moment as we head into the weekend to provide what I hope is some useful context on our work in these crucial areas.

Facebook and Polarization

People are understandably anxious about the divisions in society and looking for answers and ways to fix the problems. Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out. So it’s natural for people to ask whether it is part of the problem. But the idea that Facebook is the chief cause of polarization isn’t supported by the facts – as Chris and Pratiti set out in their note on the issue earlier this year.

The rise of polarization has been the subject of swathes of serious academic research in recent years. In truth, there isn’t a great deal of consensus. But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization.

The increase in political polarization in the US pre-dates social media by several decades. If it were true that Facebook is the chief cause of polarization, we would expect to see it going up wherever Facebook is popular. It isn’t. In fact, polarization has gone down in a number of countries with high social media use at the same time that it has risen in the US.

Specifically, we expect the reporting to suggest that a change to Facebook’s News Feed ranking algorithm was responsible for elevating polarizing content on the platform. In January 2018, we made ranking changes to promote Meaningful Social Interactions (MSI) – so that you would see more content from friends, family and groups you are part of in your News Feed. This change was heavily driven by internal and external research that showed that meaningful engagement with friends and family on our platform was better for people’s wellbeing, and we further refined and improved it over time as we do with all ranking metrics. Of course, everyone has a rogue uncle or an old school classmate who holds strong or extreme views we disagree with – that’s life – and the change meant you are more likely to come across their posts too. Even so, we’ve developed industry-leading tools to remove hateful content and reduce the distribution of problematic content. As a result, the prevalence of hate speech on our platform is now down to about 0.05%.

But the simple fact remains that changes to algorithmic ranking systems on one social media platform cannot explain wider societal polarization. Indeed, polarizing content and misinformation are also present on platforms that have no algorithmic ranking whatsoever, including private messaging apps like iMessage and WhatsApp.

Elections and Democracy

There’s perhaps no other topic that we’ve been more vocal about as a company than on our work to dramatically change the way we approach elections. Starting in 2017, we began building new defenses, bringing in new expertise, and strengthening our policies to prevent interference. Today, we have more than 40,000 people across the company working on safety and security.

Since 2017, we have disrupted and removed more than 150 covert influence operations, including ahead of major democratic elections. In 2020 alone, we removed more than 5 billion fake accounts — identifying almost all of them before anyone flagged them to us. And, from March to Election Day, we removed more than 265,000 pieces of Facebook and Instagram content in the US for violating our voter interference policies.

Given the extraordinary circumstances of holding a contentious election in a pandemic, we implemented so called “break glass” measures – and spoke publicly about them – before and after Election Day to respond to specific and unusual signals we were seeing on our platform and to keep potentially violating content from spreading before our content reviewers could assess it against our policies.

These measures were not without trade-offs – they’re blunt instruments designed to deal with specific crisis scenarios. It’s like shutting down an entire town’s roads and highways in response to a temporary threat that may be lurking somewhere in a particular neighborhood. In implementing them, we know we impacted significant amounts of content that did not violate our rules to prioritize people’s safety during a period of extreme uncertainty. For example, we limited the distribution of live videos that our systems predicted may relate to the election. That was an extreme step that helped prevent potentially violating content from going viral, but it also impacted a lot of entirely normal and reasonable content, including some that had nothing to do with the election. We wouldn’t take this kind of crude, catch-all measure in normal circumstances, but these weren’t normal circumstances.

We only rolled back these emergency measures – based on careful data-driven analysis – when we saw a return to more normal conditions. We left some of them on for a longer period of time through February this year and others, like not recommending civic, political or new Groups, we have decided to retain permanently.

Fighting Hate Groups and other Dangerous Organizations

I want to be absolutely clear: we work to limit, not expand hate speech, and we have clear policies prohibiting content that incites violence. We do not profit from polarization, in fact, just the opposite. We do not allow dangerous organizations, including militarized social movements or violence-inducing conspiracy networks, to organize on our platforms. And we remove content that praises or supports hate groups, terrorist organizations and criminal groups.

We’ve been more aggressive than any other internet company in combating harmful content, including content that sought to delegitimize the election. But our work to crack down on these hate groups was years in the making. We took down tens of thousands of QAnon pages, groups and accounts from our apps, removed the original #StopTheSteal Group, and removed references to Stop the Steal in the run up to the inauguration. In 2020 alone, we removed more than 30 million pieces of content violating our policies regarding terrorism and more than 19 million pieces of content violating our policies around organized hate in 2020. We designated the Proud Boys as a hate organization in 2018 and we continue to remove praise, support, and representation of them. Between August last year and January 12 this year, we identified nearly 900 militia organizations under our Dangerous Organizations and Individuals policy and removed thousands of Pages, groups, events, Facebook profiles and Instagram accounts associated with these groups.

This work will never be complete. There will always be new threats and new problems to address, in the US and around the world. That’s why we remain vigilant and alert – and will always have to.

That is also why the suggestion that is sometimes made that the violent insurrection on January 6 would not have occurred if it was not for social media is so misleading. To be clear, the responsibility for those events rests squarely with the perpetrators of the violence, and those in politics and elsewhere who actively encouraged them. Mature democracies in which social media use is widespread hold elections all the time – for instance Germany’s election last week – without the disfiguring presence of violence. We actively share with Law Enforcement material that we can find on our services related to these traumatic events. But reducing the complex reasons for polarization in America – or the insurrection specifically – to a technological explanation is woefully simplistic.

We will continue to face scrutiny – some of it fair and some of it unfair. We’ll continue to be asked difficult questions. And many people will continue to be skeptical of our motives. That’s what comes with being part of a company that has a significant impact in the world. We need to be humble enough to accept criticism when it is fair, and to make changes where they are justified. We aren’t perfect and we don’t have all the answers. That’s why we do the sort of research that has been the subject of these stories in the first place. And we’ll keep looking for ways to respond to the feedback we hear from our users, including testing ways to make sure political content doesn’t take over their News Feeds.

But we should also continue to hold our heads up high. You and your teams do incredible work. Our tools and products have a hugely positive impact on the world and in people’s lives. And you have every reason to be proud of that work.
