Zuckerberg shut down calls to improve teen safety, court documents claim

Meta CEO Mark Zuckerberg repeatedly shut down calls from executives to improve teen safety on Facebook and Instagram and to ban ‘beauty filters,’ court documents claim

  • At times, Zuckerberg allegedly directly ‘vetoed’ initiatives designed to improve well-being on the apps
  • For years the social media giant has been linked to young people’s poor mental health
  • A lawsuit filed last month by 33 states alleges that Meta ‘prioritized targeting’ young users’ vulnerabilities

Meta CEO Mark Zuckerberg personally shut down calls to improve teen wellbeing on Facebook and Instagram, bombshell court documents claim.

At times, Zuckerberg allegedly directly overruled some of his most senior lieutenants on initiatives designed to improve safety on the apps, the documents state.

The communications were unsealed as part of an ongoing lawsuit against Meta by Massachusetts Attorney General Andrea Joy Campbell. It is part of a recent wave of litigation against Meta in state and federal courts claiming the company contributed to mental health issues among young people.

The filings say Zuckerberg ignored or shut down suggestions by top executives, including the President of Global Affairs, Nick Clegg, to do more to protect Instagram’s 30 million US teen users.

For years, the social media giant has been linked to young people’s poor mental health – with charities pointing to negative impacts on wellbeing and lower life satisfaction.

Meta CEO Mark Zuckerberg allegedly blocked calls to improve teen well-being on the apps

The lawsuit filed last month in Massachusetts state court alleges that Meta – and Zuckerberg – have ‘relentlessly prioritized targeting these young users and has tailored its platforms’ features to manipulate and exploit their developing brains’.

They claim ‘Meta knows well that addictive overuse of its platforms by young people is dangerous and damaging to their mental and physical health.’ 

The latest unsealed documents allege that Zuckerberg personally ‘vetoed the proposal to formally ban plastic surgery simulation’ and ‘directed staff to “relax” or “lift” the temporary ban that had been in place’.

He allegedly said there was a ‘clear demand’ for the filters and ‘had seen no data suggesting the filters were harmful’. 

The lawsuit claims he made this decision despite knowing ‘the filters caused young users harm and negatively affected their well-being’.

Meta’s vice president of product design, Margaret Gould Stewart, then allegedly wrote to Zuckerberg saying: ‘I respect your call on this and I’ll support it, but want to just say for the record that I don’t think it’s the right call given the risks…

‘I just hope that years from now we will look back and feel good about the decision we made here.’ 

Filter factor: Madonna – aged 63 – in the snap (left) she posted in 2021. 

The case claims Meta is breaking consumer protection statutes and common law through ‘unfair’ and ‘deceptive’ practices. 

A year after the beauty filter decision, Clegg allegedly pressed Zuckerberg to make ‘additional investment to strengthen our position on wellbeing across the company.’

According to the complaint, Zuckerberg didn’t respond to his proposal at first and then appeared to respond through Meta’s chief financial officer, Susan Li, who ‘tersely respond[ed] that staffing was too “constrained” to meet the request’.

Massachusetts Attorney General Andrea Joy Campbell said: ‘We allege that Meta knowingly targeted and exploited young people just so the company could make a profit – and the public is now able to see exactly how they did it. 

‘Understanding Meta’s harmful conduct is critical for parents, children, and teenagers, and I’m glad this information is now publicly available to them. 

‘My colleagues and I will continue to protect our young people and push for meaningful changes to this platform.’ 

Massachusetts is one of dozens of states that are suing Meta Platforms and its Instagram unit, accusing them of intentionally jeopardizing the mental health of children and teenagers for profit. 

The lawsuit alleges that Mark Zuckerberg’s corporation has ‘profoundly altered the psychological and social realities of a generation of young Americans’ by ‘ensnaring’ them in addictive cycles through its targeted algorithms. 

Filed in a California federal court in October, the complaint says that Meta, which also operates Facebook, has repeatedly misled the public about the substantial dangers of its platforms and contributed to a youth mental health crisis. 

Influencer Faye Dickinson created the popular Instagram filter ‘Filter vs Reality’ which uses a split-screen effect to show the difference between edited and unedited photos

Influencer Faye Dickinson, 28, from east London, has called on social media apps to ban photo filters or restrict them to users over 18.

She told the Mail in 2021: ‘The problem with these filters is you see a side of yourself with dramatic filters that don’t exist, which corresponds to an unnatural and inhuman ideal of beauty that you can now achieve with filters.

‘It’s the unhealthy obsession we all have with that perfect look.’

A 2021 survey of 2,069 people by Uvence found a fifth said they now no longer post photos on social media without using editing tools that get rid of wrinkles, spots and stretch marks.

Meanwhile, 37 per cent said they prefer their filtered face to their real face. 

A Meta spokesperson said: ‘While filters exist across every major social platform and smartphone camera, Meta bans those that directly promote cosmetic surgery, changes in skin color or extreme weight loss. 

‘We clearly note when a filter is being used and we work to proactively review effects against these rules before they go live.

‘The complaint is filled with selective quotes from hand-picked documents that do not provide the full context of how the company operates or what decisions were made.

‘As a result of Meta’s ongoing investment in the well-being of the people that use our services, teens and their parents now have over 30 tools and resources, and we have protections to help keep teens safe and away from potentially harmful content or unwanted contact.’ 
