Messaging platform Discord announced Monday it will implement enhanced safety features for teenage users globally, including facial recognition, joining a wave of social media companies rolling out age verification systems.
The rollout, beginning in early March, will make teen-appropriate settings the default for all users; adults will need to verify their age to relax protections such as content filters and restrictions on direct messaging, the company said.
The San Francisco-based platform, popular among gamers, will use facial age estimation technology and identity verification through vendor partners to determine users' ages.
Software running in the background will also help estimate users' ages, so direct verification will not always be required.
"Nowhere is our safety work more important than when it comes to teen users," said Savannah Badalich, Discord's head of product policy.
Discord insisted the measures came with privacy protections, saying video selfies for age estimation never leave users' devices and that submitted identity documents are deleted quickly.
The platform said it successfully tested the measures in Britain and Australia last year before expanding worldwide.
The move follows similar actions by rivals facing intense scrutiny over child safety, and comes after an Australian ban on under-16s using social media that other countries are moving to replicate.
The turn to facial recognition and other technologies reflects the reality that self-reported ages have proven unreliable, with minors routinely lying about their birthdates to circumvent platform safety measures.
Gaming platform Roblox in January began requiring facial age verification globally for all users to access chat features, after facing multiple lawsuits alleging the platform enabled predatory behavior and child exploitation.
Meta, which owns Instagram and Facebook, has deployed AI-powered methods to determine age and introduced "Teen Accounts" with automatic restrictions for users under 18.
Mark Zuckerberg's company removed over 550,000 underage accounts in Australia alone in December ahead of that country's under-16 social media ban.
TikTok has implemented 60-minute daily screen time limits for users under 18 and notification cutoffs based on age groups.
The industry-wide shift comes as half of US states have enacted or introduced legislation involving age-related social media regulation, though courts have blocked many of the restrictions on free speech grounds.
The changes come the same day a trial over children's social media addiction opens in Los Angeles, with plaintiffs alleging that Meta's and YouTube's platforms were designed to be addictive to minors.
