Discord is set to launch a global age verification tool next month, aiming to better safeguard the millions of teenagers using its platform.
Users will be required to verify their age via a facial scan or by uploading an official ID document. Once verified, they will gain access to age-restricted channels, servers, and sensitive content.
Having tested a limited age verification system in Australia and the UK, Discord plans to expand it as part of a global rollout. The company stated in its announcement that it will not store verification videos or ID photos on its own servers or those of its verification partners. Instead, it will rely on AI to estimate a user's age from facial features.
With over 200 million monthly active users, Discord is primarily used for instant messaging and voice chat. While the platform originated in the gaming community, its appeal has broadened in recent years. Its servers now encompass a wide range of interests and online communities, some of which host explicit content. Discord does not proactively monitor individual servers, instead relying on a content moderation approach similar to Reddit's.
"Our safety efforts are especially crucial when it comes to our teenage users, which is why we are announcing these updates ahead of Safer Internet Day," said Savannah Badalich, Discord's Head of Product Policy. "Rolling out teen safety defaults globally builds upon Discord's existing safety architecture, providing strong protections for teens while allowing flexibility for verified adults."
A Move Ahead of Potential Social Media Bans
This initiative marks the first large-scale deployment of a global age verification system by a major social platform, likely aiming to preempt stricter regulations regarding access for users under 18.
Australia recently became the first country to restrict social media use for those under 16, a decision that has sparked legal challenges and criticism from several tech companies, including Reddit. Discord was exempted from Australia's ban, possibly because it is viewed more as a communication tool than a traditional social network, though this status may not hold as regulations evolve in other regions.
At least seven European countries are reportedly considering similar measures, with age thresholds ranging from 15 to 16. The European Union is also contemplating stricter bloc-wide rules, following a majority vote in the European Parliament supporting enhanced protections for minors. The UK, New Zealand, and several provinces in Canada have also expressed interest in implementing similar restrictions.
Responses from Major Players
The long-standing model allowing social media platforms to register users from age 13 may be nearing its end. Beyond regulatory pressure, the industry faces a growing number of lawsuits alleging that major platforms contribute to mental health issues and intentionally design addictive features for teenagers.
Some companies, including Discord, are proactively strengthening verification systems and limiting access for minors. TikTok has stated it is updating its tools to prevent users under 13 from accessing the app, while Meta has announced it is developing similar systems for Facebook and Instagram.