
Meta Accused Of Harming Young Women's Mental Health, Reports Say

According to an internal document leaked from Meta, those in charge of the company when it was still known as Facebook understood that Instagram was exposing young teenage girls to harmful and dangerous content but took no action to stop it.

Khushi Sabharwal

According to the document, an Instagram employee posed as a 13-year-old girl searching for diet advice while researching Instagram's algorithm and recommendations. Rather than surfacing content from medical and fitness experts that would have informed the user about healthy eating, the algorithm chose to present related posts from more popular, viral topics with higher engagement.

These supposedly "similar" viral topics turned out to be posts about anorexia. The user was directed to graphic material and recommended accounts such as "skinny binge" and "apple core anorexic". Read on to learn how the leaked documents reveal that Meta knew Instagram was pushing young women towards harmful content that damaged their mental health.

Meta Hampered Girls' Mental Health

The documents show the company was aware that, because of the app's suggested material and the algorithm Instagram uses to construct a user's feed, over 33% of its teenage users felt worse about their bodies. Instagram also understood that teen users of the app experienced higher levels of anxiety and sadness.

This is not the first time mental health professionals and advocates have raised concerns about Instagram's algorithms and the content they push on users. Earlier this year, in the case of Molly Russell, a 14-year-old girl who took her own life in 2017, a coroner in the UK publicly found that online content, including material viewed on Instagram, contributed to her death.

Cases like this have sparked debate over social media platforms' content moderation standards and how they are applied in practice. Attorney Matt Bergman founded the Social Media Victims Law Center after reading the Facebook Papers, which were made public by whistleblower Frances Haugen last year. More than 1,200 families are currently working with him to file lawsuits against social media firms.


