Discord Age Verification Delayed: Why the Platform Is Stalling on Child Safety — and What Parents Need to Know
As lawsuits pile up and an IPO looms, the platform's safety promises ring hollow for millions of families

Discord's promise to tighten age checks has hit another delay, and the timing raises questions about whether child safety or corporate priorities are driving the decision.
The chat platform announced on 24 February that it is pushing back its global rollout of age verification until the second half of 2026. The original plan included facial scans and ID checks for users worldwide by March. That's now off the table after users pushed back hard.
Discord insists it needs more time to 'get it right.' Yet the delay comes just weeks after the company filed confidential IPO paperwork with the US Securities and Exchange Commission in January 2026. According to reports, Discord is chasing a valuation somewhere between $15 billion (£11.1 billion) and $25 billion (£18.5 billion).
With over 200 million monthly active users, many of them teenagers, any friction that shrinks user numbers is unwelcome news before a public listing. Robust age checks could do exactly that.
The Financial Stakes
Discord's business runs on growth. In 2021, the company raised $500 million (£370 million) at a $15 billion (£11.1 billion) valuation, according to financial tracking site Forge Global. Secondary market transactions now peg the company closer to $6.6 billion (£4.9 billion) to $8 billion (£5.9 billion), a steep drop that makes a strong IPO debut even more important.
If strict verification leads to teenagers abandoning the platform or refusing to sign up, analysts will not look kindly on the numbers.
'Let me be upfront: we knew this rollout was going to be controversial,' Discord CTO Stanislav Vishnevskiy wrote in a blog post. 'Any time you introduce something that touches identity and verification, people are going to have strong feelings. Rightfully so.'
A System That Doesn't Work
Right now, Discord's age verification is limited to self-declaration. Users simply type in any birthdate, meaning a 12-year-old can claim to be 18 with a few keystrokes.
The UK's Online Safety Act, which took full effect in July 2025, requires platforms to use technically robust age checks, not self-declaration. UK users now face facial age estimation or government ID verification to access age-restricted content.
Globally, however, little has changed. Children can still stumble into unmoderated servers filled with adult content, scams, and other risks. Discord claims its internal systems can determine age for 90% of users by looking at signals like payment methods and account history. Yet lawsuits suggest these safeguards are insufficient.
The Lawsuits Keep Coming
On 17 April 2025, New Jersey Attorney General Matthew Platkin sued Discord under the state's Consumer Fraud Act. The complaint alleged that Discord's default settings and safety features left children exposed to predators, while the platform marketed itself as safe for teens.
The pattern in these cases is disturbing. Predators meet children on gaming platforms like Roblox, then move conversations to Discord's private servers and direct messages — away from parents and moderation.
By December 2025, 80 lawsuits involving Roblox and Discord had been consolidated in California federal court. The National Center on Sexual Exploitation has placed Discord on its 'Dirty Dozen' list for four straight years. Discord's own transparency report showed over 530,000 accounts disabled for child safety violations in just one quarter of 2022.
What Parents Should Know
In the UK, Discord already requires age verification for certain content and settings. But outside the UK, children still rely on a system that asks users to self-certify their age.
Discord says it will add more verification options before the global rollout, including credit card checks. The company also promises to be more transparent about its verification vendors. In September 2025, hackers breached a third-party vendor Discord used for age-related appeals, exposing government IDs and personal data from around 70,000 users. Discord says it no longer works with that vendor.
The platform insists it takes child safety seriously. Yet lawsuits, regulatory pressure, and repeated inclusion on the 'Dirty Dozen' list suggest otherwise.
Discord has asked for more time. Parents should be asking: more time to protect children, or to protect an IPO?
© Copyright IBTimes 2025. All rights reserved.