Instagram has consistently worked on improving the in-app
experience for its teen users, ensuring a safe social space for them to engage
with others. Recently, Instagram launched new content recommendation
controls that reduce teen users’ exposure to sensitive content, such as
posts involving self-harm.
Instagram has taken into consideration the potential
negative impact of a complex topic like self-harm on teens’ mental health. “We
will start to remove this type of content from teens’ experiences on Instagram
and Facebook, as well as other types of age-inappropriate content,” says the
social media company. Recommendations of self-harm-related content are
already limited within Reels and Explore. Moving forward, these
restrictions will also apply to Feed and Stories, including content from
accounts that teen users follow.
Linked with this change is another important safety measure: Instagram will detect when a teen user searches for terms related to suicide, self-harm, or eating disorders. This allows the platform to redirect the user to official help services when such terms are searched. Additionally, search results flagged as potentially triggering will be hidden from teen users entirely.
Another way Meta is enhancing safety for young
users is by making the most restrictive content settings the default for them. This
means these settings will be applied automatically when a teen signs up
for any of Meta’s platforms. Lastly, Instagram is also sending out prompts
that encourage teen users to update their settings for a more private
experience.