On the same day whistleblower Frances Haugen testified before Congress about the harms of Facebook and Instagram to children in the fall of 2021, Arturo Bejar, then a contractor at the social media giant, sent an alarming email to Meta CEO Mark Zuckerberg about the same topic.
In the note, as first reported by The Wall Street Journal, Bejar, who worked as an engineering director at Facebook from 2009 to 2015, outlined a "critical gap" between how the company approached harm and how the people who use its products, most notably young people, experience it.
"Two weeks ago my daughter, 16, and an experimenting creator on Instagram, made a post about cars, and someone commented 'Get back to the kitchen.' It was deeply upsetting to her," he wrote. "At the same time the comment is far from being policy violating, and our tools of blocking or deleting mean that this person will go to other profiles and continue to spread misogyny. I don't think policy/reporting or having more content review are the solutions."
Bejar believes that Meta needs to change how it polices its platforms, with a focus on addressing harassment, unwanted sexual advances and other bad experiences even if these problems don't clearly violate existing policies. For instance, sending vulgar sexual messages to children doesn't necessarily break Instagram's rules, but Bejar said teens should have a way to tell the platform they don't want to receive these types of messages.
Two years later, Bejar is testifying before Congress on Tuesday about social media and the teen mental health crisis, hoping to shed light on how Meta executives, including Zuckerberg, knew about the harms Instagram was causing but chose not to make meaningful changes to address them.
"I can safely say that Meta's executives knew the harm that teenagers were experiencing, that there were things that they could do that are very doable and that they chose not to do them," Bejar told The Associated Press. This, he said, makes it clear that "we can't trust them with our children."
Bejar points to user perception surveys that show, for instance, that 13% of Instagram users ages 13-15 reported having received unwanted sexual advances on the platform within the previous seven days.
In his prepared remarks, Bejar is expected to say he doesn't believe the reforms he's suggesting would significantly affect revenue or profits for Meta and its peers. They are not intended to punish the companies, he said, but to help teenagers.
"You heard the company talk about it: 'oh this is really complicated,'" Bejar told the AP. "No, it isn't. Just give the teen a chance to say 'this content is not for me' and then use that information to train all of the other systems and get feedback that makes it better."
The testimony comes amid a bipartisan push in Congress to adopt regulations aimed at protecting children online.
Meta, in a statement, said: "Every day countless people inside and outside of Meta are working on how to help keep young people safe online. The issues raised here regarding user perception surveys highlight one part of this effort, and surveys like these have led us to create new safety features. Working with parents and experts, we have also introduced a range of tools to support teens and their families in having safe, positive experiences online. All of this work continues."
Regarding unwanted material users see that does not violate Instagram's rules, Meta points to its 2021 content distribution guidelines, which say "problematic or low quality" content automatically receives reduced distribution on users' feeds. This includes clickbait, misinformation that's been fact-checked and "borderline" posts, such as a "photo of a person posing in a sexually suggestive manner, speech that includes profanity, borderline hate speech, or gory images."
In 2022, Meta also introduced "kindness reminders" that tell users to be respectful in their direct messages, but they apply only to users who are sending message requests to a creator, not to a regular user.
Bejar's testimony comes just two weeks after dozens of U.S. states sued Meta for harming young people and contributing to the youth mental health crisis. The lawsuits, filed in state and federal courts, claim that Meta knowingly and deliberately designs features on Instagram and Facebook that addict children to its platforms.
Bejar said it is "absolutely essential" that Congress passes bipartisan legislation "to help ensure that there is transparency about these harms and that teens can get help" with the support of the right experts.
"The most effective way to regulate social media companies is to require them to develop metrics that will allow both the company and outsiders to evaluate and track instances of harm, as experienced by users. This plays to the strengths of what these companies can do, because data for them is everything," he wrote in his prepared remarks.
Barbara Ortutay, The Associated Press