Instagram interface, which encourages users to consult sources and professionals specialized in certain topics – META
MADRID, Jan. 9 (Portaltic/EP) –
Meta has announced new protection measures for Instagram and Facebook that seek to offer adolescents experiences more appropriate to their age. The company will make it harder for them to access content related to mental health issues such as self-harm or suicide, redirecting them instead to official sources where they can obtain information.
The company noted that for more than a decade it has been developing tools and resources to help minors and their parents make the best use of social networks and of the platforms it develops.
One of these measures is the 'video selfie' – a tool that TikTok also uses – which is employed when creating an Instagram account and serves to verify that users meet the minimum age required to use its services.
Once a user's age is identified and they are determined to be a minor, limits are placed on the actions they can carry out on the platform, and the type of content that appears in their 'feed' is controlled so that it does not show anything considered dangerous or harmful (harassment, abuse, sexualization, etc.).
With the aim of ensuring that adolescents continue to have "safe and appropriate experiences" on its platforms, Meta has in recent times consulted experts in psychology and mental health to understand which content is least harmful for underage users.
For this reason, it has announced that it will apply the most restrictive content control settings on both Facebook and Instagram, meaning that all underage users, including those who already had an account before the video selfie was introduced, will find it harder to access potentially sensitive content or accounts in sections such as Search or Explore.
Recommendations and discovery features will also limit access to potentially harmful mental health content, such as posts related to thoughts of self-harm, suicide or eating disorders.
Although the company considers that sharing this type of content can help destigmatize these issues, it also acknowledges that "it is complex and is not necessarily suitable for all young people."
For this reason, from now on, when users – this applies to everyone, not just minors – search for terms related to these topics, Facebook and Instagram will hide the related results and direct them to specialized sources of help, a measure that will be extended to new potentially harmful search terms.
Likewise, Meta will implement a notification system to encourage the youngest users to review the Settings section of their accounts and enable any available privacy-related options.
In this regard, the company said that if teenagers activate the recommended settings, it will automatically adjust their accounts to restrict who can reshare their content, tag them or mention them, among other actions.
Finally, the company noted that it will also ensure that only these teenagers' followers can send them messages, and that it will hide comments that may be offensive.
Meta has confirmed that it has begun rolling out these changes to the profiles of users under 18, and expects them to be "fully implemented" on both platforms in the coming months.