Instagram is expanding its age verification program to several more regions, as it looks to improve how it confirms user ages and limits young users' exposure to inappropriate content in the app.
As per Meta’s Andy Stone:
“Starting today, we’re beginning to expand our Instagram age verification test to Mexico, Canada, South Korea, Australia, Japan, and more countries in Europe. This builds on an expansion to India and Brazil we announced in October, with more countries coming in the next few months.”
Initially launched in the US last June, Instagram's age verification process requires users to verify their stated age via one of three options:
- Upload their government ID
- Record a video selfie
- Ask mutual friends to verify their age
Instagram's video selfie option uses facial age estimation technology from Yoti to assess the age of the person in the clip.
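To make the flow concrete, here's a minimal sketch of the gating logic in Python. Every name in it (`VerificationMethod`, `requires_verification`, and so on) is a hypothetical illustration, not Meta's or Yoti's actual API; the only behavior taken from the article is that a birthday edit moving a stated age from under 18 to 18 or over must be backed by one of the three verification options.

```python
from datetime import date
from enum import Enum, auto

ADULT_AGE = 18  # threshold described in the article

class VerificationMethod(Enum):
    GOVERNMENT_ID = auto()    # upload a government ID
    VIDEO_SELFIE = auto()     # age estimated from the clip (Yoti-style)
    SOCIAL_VOUCHING = auto()  # mutual friends confirm the age

def age_on(birthday: date, today: date) -> int:
    """Whole years of age as of `today`."""
    years = today.year - birthday.year
    if (today.month, today.day) < (birthday.month, birthday.day):
        years -= 1
    return years

def requires_verification(current_dob: date, requested_dob: date,
                          today: date | None = None) -> bool:
    """Flag only edits that move a stated age from under 18 to 18+."""
    today = today or date.today()
    return (age_on(current_dob, today) < ADULT_AGE
            and age_on(requested_dob, today) >= ADULT_AGE)

def apply_dob_edit(current_dob: date, requested_dob: date,
                   proof: VerificationMethod | None = None) -> date:
    """Accept the edit unless it crosses the threshold without
    the user having completed one of the verification options."""
    if requires_verification(current_dob, requested_dob) and proof is None:
        return current_dob  # edit blocked pending verification
    return requested_dob
```

The detail worth noting in this sketch is that the check keys on the under-18-to-18+ transition rather than on every birthday edit, which is the same population Meta's statistic below refers to.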
Meta says the process has proven highly effective, stopping 96% of teens who attempted to edit their birthday from under 18 to 18 or over. That enhanced level of verification could be a big step in ensuring that young users aren't accessing potentially harmful elements of the app, and aren't being targeted by advertisers with inappropriate promotions.
This is an important focus because, as various investigations have found, social media platforms, including Instagram, can harm young users in a range of ways, and underage usage can also expose kids to predators and inappropriate content.
Such issues have been exacerbated over the last two years, as pandemic lockdowns forced more kids online for entertainment and social connection. And with parents also working from home, it's almost impossible for them to monitor what their children are up to all of the time.
Measures like this are an important step. They won't stop every youngster from gaming the system, but the combined effort will limit kids' capacity to cheat their way into Meta's apps.
The expansion, then, is another notable development, one that could go a long way towards improving safety on the platform.
It's not a complete solution, but it's another step, and it could be a significant one at that.