Meta Introduces New Restrictions on Instagram to Safeguard Teens


Meta, the conglomerate facing accusations in the United States and Europe of harming teenagers' mental health, unveiled new measures on Thursday to protect young users on its popular social media platform, Instagram.



In a statement from its California headquarters, Meta said that teenagers will now need parental approval, granted through Instagram's parental supervision tools, to change certain app settings.


Users under the legal age will need explicit parental consent to switch their accounts from private to public, to access "more sensitive" content, or to receive messages from people who do not already follow them on the platform.


The Parent Company's Statement

Meta, which owns Facebook, Instagram, and WhatsApp, among other services, emphasized its commitment to strengthening "teenagers' protection from unwanted communications" and "empowering parents to have a greater influence on their children's online experience."


By default, Instagram will prevent any user not already connected to a teenager from messaging them.


"Psychological and Physical Risks"

In late October, 41 U.S. states filed a civil lawsuit against Meta, accusing Facebook and Instagram of causing "psychological and physical harm to youth" and citing risks of addiction, cyberbullying, and eating disorders.


In the complaint, filed in a California court, the attorneys general asserted that "Meta leveraged powerful and unprecedented techniques to attract youth and teenagers (...) ultimately ensnaring them for profit."


Both Democratic and Republican attorneys general accused the conglomerate of "concealing how these platforms exploit the most vulnerable users and manipulate them" and of "neglecting the significant harm" they cause to the "mental and physical health of our country's youth."

