
Creeps Will Have a Harder Time Sliding Into Instagram and Facebook DMs



Photo: Michaela Handrek-Rehle/Bloomberg (Getty Images)

Meta no longer allows teenagers to receive direct messages from anyone they’re not following on Instagram, or friends with on Facebook, according to a company blog post on Thursday. The policy update aims to create a safer environment for minors on its platforms, and comes one week after Meta’s internal documents revealed that roughly 100,000 children are sexually harassed on Facebook and Instagram every single day.

New Mexico’s attorney general sued Meta over its child safety policies back in December, alleging the platform enables human trafficking. The complaint labeled Mark Zuckerberg’s social media platforms as “breeding grounds” for predators targeting children for grooming and solicitation.

The Pew Research Center estimates that 62% of American teens use Instagram, while 32% use Facebook. Meta had already banned adults from messaging teenagers who don’t follow them on Instagram, but Thursday’s policy update goes a step further: previously, teenagers could still message other teenagers who didn’t follow them, and now they can’t.

A Meta spokesperson told Gizmodo these policy updates were unrelated to New Mexico’s lawsuit.

These latest measures are intended to create an environment where teenagers only interact with people they know on Instagram and Facebook. Minors can now only exchange direct messages with people they follow on Instagram or are friends with on Facebook. That blunts a common tactic among predators, who create accounts impersonating a minor to try to lure in teenagers.

Meta has millions of children on its social media platforms, and it’s doing everything it can to keep profiting off of them. Meta sued the Federal Trade Commission (FTC) in November to ensure the company can keep monetizing children’s data. That came three years after the FTC sued Meta for $5 billion, seeking to stop the company from cashing in on children’s data.

Meta is facing yet another lawsuit claiming it doesn’t do enough to protect children. Attorneys general from 33 states collectively sued the company for knowingly addicting a generation of children to its platforms, and claimed it violated the Children’s Online Privacy Protection Act of 1998 (COPPA) by letting users younger than 13 onto those platforms.

Meta is doing all it can to convince America that children are safe on its platforms and that it should keep being able to profit off their data. The legal cases stacking up against the company, however, aim to prove the opposite.


