Two giant tech platforms that are currently at the height of popularity and used by millions of kids are facing mounting public pressure, both in the courtroom and in the court of public opinion.
In early February, more than 800 parents signed two open letters, one addressed to the board of Roblox and the other to the board of Discord, accusing the companies of trying to quietly push the hundreds of child sexual abuse lawsuits against them out of the public's view.
The open letters were originally written by four groups of parents from Washington, California, Florida, and Texas. The letters were written on behalf of their children and the hundreds, potentially thousands, of other families who are also concerned about child sexual abuse on Roblox and about how Discord can help predators hide.
These letters are tied to bombshell lawsuits filed against these mega-platforms for allegedly knowing about, and failing to stop, online grooming, sextortion, and real-world sexual assault and rape facilitated through their platforms.
Right now, the outcome of the lawsuits is unclear. But what is clear is the message from the families: stop trying to silence survivors.
How Are Roblox and Discord Trying to Hide These Lawsuits?
The number of these lawsuits, filed by law firms like Dolman Law Group, continues to grow nationwide. The story has become uncontainable as parents across the country discuss whether their children might be in danger on these platforms. And although the topic is hot now, news cycles run fast, especially in the current political climate, so these companies are hoping the story will die down if they can quietly settle these cases through forced arbitration.
Forced arbitration is a legal process (often buried in a platform’s terms of service) that requires victims to resolve any disputes in private, off-the-record proceedings instead of suing publicly in court. This keeps evidence, testimony, and outcomes from becoming public information and requires that any agreed-upon dollar amounts remain confidential.
Why These Letters Were Written
According to reporting from ABC News, more than 800 parents signed onto the letter directed at Roblox, while another group of families sent a similar letter to Discord.
Many of these parents already have active lawsuits. Others are preparing to file.
Their core allegation is twofold:
Predators used these platforms to groom and exploit their children.
The companies are now trying to force those cases into confidential arbitration instead of open court.
Arbitration, while legal in many consumer disputes, happens behind closed doors. Proceedings are private. Outcomes are sealed. Patterns of misconduct often never reach public scrutiny.
For families seeking accountability or systemic change, secrecy is a huge part of the problem.
As one line from the Roblox letter put it:
“Secret arbitration is not protection; it is concealment.”
The Broader Lawsuit Landscape
The letters did not emerge in a vacuum.
Roblox alone is facing more than 100 allegations that predators used the platform to target minors.
Attorneys representing families say they are investigating thousands of additional claims involving grooming, sexual exploitation, and sextortion that allegedly began through in-game interactions.
In many cases, plaintiffs claim predators first contacted children on Roblox, then moved conversations to private communication platforms like Discord, where monitoring and parental visibility dropped off significantly.
Letter to Roblox: Safety Promises vs. Legal Tactics
The Roblox letter directly challenges what parents describe as a contradiction between the company’s public messaging and its legal strategy.
Roblox has long marketed itself as a child-friendly platform. Executives have publicly emphasized that even a single safety incident is “one too many.”
Parents say that messaging rings hollow.
“No company that claims to put ‘community before company’ can in good conscience attempt to silence child victims to protect itself.”
The letter outlines multiple alleged grooming scenarios, including cases where children were:
Manipulated into sending explicit images
Extorted using in-game currency (Robux)
Lured onto third-party apps for further exploitation
One family wrote:
“Our son’s life will never be the same… Not a day goes by that we don’t wish that we had never let him use Roblox.”
Another parent described losing their child to suicide after online exploitation:
“By trying to force his case into arbitration… Roblox is choosing secrecy over accountability.”
The letter ultimately demands that Roblox publicly commit to ending forced arbitration in sexual abuse cases.
Letter to Discord: Grooming Pipelines and “Silencing” Claims
The Discord letter echoes many of the same themes, but focuses more heavily on how predators use the platform’s communication tools.
Parents say Discord servers, private messages, and chat features became grooming environments once initial contact was made elsewhere.
“Predators easily reached them through your servers, channels, and private messages.”
The letter accuses Discord of refusing to take responsibility once abuse occurred, and instead attempting to shield itself legally.
“Forced arbitration does not protect children. It protects predators and platforms.”
Families described devastating sexual exploitation that occurred through the platform, including sextortion, and in some cases the self-harm and suicide that followed.
One parent wrote:
“The pain and confusion… were more than he could navigate at his young age.”
Like the Roblox families, these families are demanding that Discord abandon arbitration efforts and allow survivors to pursue claims publicly.
The Legal Flashpoint: Forced Arbitration
At the center of both letters is the same legal fight.
The companies are invoking arbitration clauses in their terms of service, arguing that disputes must be resolved privately.
Families argue that those clauses should not apply to child sexual abuse cases.
They point to federal legislation like the Ending Forced Arbitration of Sexual Assault and Sexual Harassment Act, which limits the use of arbitration in sexual misconduct disputes.
The letters argue that attempting to apply arbitration to child exploitation claims is both legally questionable and morally indefensible.
Company Responses
Roblox has publicly maintained that child safety is a top priority.
In prior statements, the company said it is:
Investing in AI age verification
Using human moderators and automated filters
Restricting chat features for younger users
Working with law enforcement on abuse cases
A spokesperson previously stated the company is “deeply troubled by any allegations about harms to children online” and is committed to improving platform safety.
Discord has similarly said it is committed to safety and requires users to be at least 13 years old to use the platform.
Neither company publicly addressed the open letters in detail at the time of reporting.
Why This Moment Matters
These letters are not lawsuits themselves, but they are a form of strategic pressure.
They signal three things:
Litigation volume is growing
Families want public trials, not private settlements
Platform liability is becoming harder to contain quietly
For parents, the issue is transparency.
For tech companies, it’s risk exposure.
And for the legal system, it raises a larger question:
Should platforms be allowed to contract their way out of public accountability when children are harmed?