Frances Haugen grew up attending the Iowa caucuses with her academic parents, an experience that instilled in her a strong sense of pride in democracy and civic responsibility.
Haugen holds a degree in Electrical and Computer Engineering from Olin College and an MBA from Harvard. She is a specialist in algorithmic product management, having worked on ranking algorithms at Google, Pinterest, Yelp, and Facebook. She was recruited to Facebook to be the lead Product Manager on the Civic Misinformation team, which dealt with issues related to democracy and misinformation, and later worked on counterespionage.
During her time at Facebook, Frances became increasingly alarmed by the choices the company was making, prioritizing profits over public safety and putting people’s lives at risk. As a last resort and at great personal risk, Frances made the courageous decision to blow the whistle on Facebook.
Haugen, in a recent New York Times Op-Ed, disclosed that Facebook’s products “were spurring hate and division, leading teenagers into rabbit holes of self-harm and anorexia, leaving millions of users without basic safety systems for hate speech or violence incitement and, at times, were even used to sell humans across the platform.”
Haugen, who joined the Shared Assessments 2022 Summit as a keynote speaker, fundamentally believes that the problems we are facing today with social media are solvable. We can have social media that brings out the best in humanity.
Most people are unaware of the sheer magnitude of the reality we are living in. Technology plays a strong part in our domestic and international affairs, in our children’s lives, and in the intimate details of our own. Technology, and social media specifically, introduces three kinds of risks.
In 2018, Facebook was facing a steadily decreasing volume of content produced on its platform. To combat this, the company began running production experiments to see what would get users to produce more content. In these experiments, the relationship between likes and comments was notable: the more likes you received, the more likely you were to produce more content. Facebook therefore began prioritizing, in users’ newsfeeds, content it predicted would receive engagement, basing that prediction on the performance of your prior posts. The platform shifted from being consumption focused toward becoming a platform of production.
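To make that mechanism concrete, here is a minimal sketch of engagement-based feed ranking. Everything in it is a hypothetical illustration under stated assumptions, not Facebook’s actual system: the names (Post, predicted_engagement, rank_feed) and the simple average-of-prior-engagement predictor are ours, standing in for what are in reality far more complex machine-learned models.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    author_id: str
    # Hypothetical signal: engagement counts (likes + comments)
    # on the author's recent prior posts.
    prior_engagement: list[int]

def predicted_engagement(post: Post) -> float:
    """Naive stand-in predictor: average engagement on the
    author's prior posts. Real systems use learned models."""
    if not post.prior_engagement:
        return 0.0
    return sum(post.prior_engagement) / len(post.prior_engagement)

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order a user's newsfeed by predicted engagement, highest first."""
    return sorted(candidates, key=predicted_engagement, reverse=True)

# Example: the author whose prior posts drew more engagement ranks first.
feed = rank_feed([
    Post("p1", "alice", prior_engagement=[3, 5, 4]),
    Post("p2", "bob", prior_engagement=[40, 55, 60]),
])
print([p.post_id for p in feed])  # ['p2', 'p1']
```

Even in this toy version, the incentive Haugen describes is visible: whatever already attracts engagement gets shown more, which in turn rewards producing more of it.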
Haugen points out that “Facebook’s poorly implemented content moderation strategies leave those most at risk of real world violence unprotected and consistently succeed at only one thing: angering everyone.”
Until now, there have been no laws holding Big Tech accountable. The EU has recently finalized its Digital Services Act (DSA) package. According to the European Commission, the DSA aims to create a safe digital environment that protects users’ fundamental rights and establishes a level playing field for businesses. These strides will make it easier for regulators to verify that businesses are complying with the law.
What does that mean for Facebook and other social media platforms? With the DSA in place, social media platforms are required to be transparent about the content trending in users’ newsfeeds. In addition, consumer protections must be applied to features that spy on users, foster addiction to the platform among kids, or weaken public safety.
“The new requirement for access to data will allow independent research into the impact of social media products on public health and welfare,” states Haugen in her New York Times piece.
Corporate pressure and divestment are real forces. To change the way your organization and your vendors view social media, change probably needs to come from the top. As a risk practitioner, understand how the various parties in your supply chain use social media, and emphasize the value of human judgment.