Some time ago, Facebook announced its plan to launch a separate version of Instagram for children under the age of 13. The news drew heavy criticism from people concerned about the company's decision, but it appears that Facebook is sticking to its plan.
Meanwhile, Facebook is working to make the original Instagram app safer, especially for young users, by taking three important steps. The first is a default private account setting, which ensures that a newly created account is private, protecting the user from potentially harmful content from other users.
Secondly, the company is making it harder for suspicious accounts on Instagram to find accounts owned by young users, in order to keep them safe from potential harassment.
The third step is limiting advertisers' reach to younger audiences, so that they are not exposed to unwanted ads containing content that may not be age-appropriate.
Moreover, the company is working on improving its AI technology to help locate accounts of underage users who manage to create accounts and stay hidden by falsely reporting their ages. Once identified, these accounts would be taken down by Instagram.