Highlights of the complaint’s allegations
Discord’s Platform is Structured to Encourage Unchecked and Unmoderated Engagement Among Its Users
Discord designed its app to appeal to children’s desire for personalization and play by offering custom emojis, stickers, and soundboard effects, all of which are intended to make chats more engaging and kid-friendly. And it has created or facilitated “student hubs” as well as communities focused on popular kids’ games, like Roblox.
Once engaged, Discord encourages and facilitates free interaction and engagement between its users. Specifically, Discord’s default settings allow users to receive friend requests from anyone on the app—and to receive private direct messages from friends and anyone using the same server or virtual “community”—enabling child users to connect easily and become “friends” with hundreds of other users. Then, because Discord’s default safety settings disable message scanning between “friends,” child users can be—and are—inundated with explicit content. This explicit content can include user-created child sexual abuse material, messages intended to sexually exploit or coerce a child to engage in self-harm, internet links to sexually explicit content, images, and videos depicting violence, and videos containing sexually explicit content. In short, the app’s design makes it easy for children to connect with other users, but also allows predators to lurk and target them, undeterred by the safety features Discord touts as reasons that parents and users should trust its app.
Discord Misled Users About its “Safe Direct Messaging” Feature
From March 28, 2017 until April 22, 2023, Discord included “Safe Direct Messaging” settings in the “Privacy & Safety” menu of Discord’s “User Settings.” The settings purported to control how direct messages from other users would be scanned and deleted before receipt by the intended user. The Safe Direct Messaging setting contained three options:
- Keep me safe. Scan direct messages from everyone.
- My friends are nice. Scan direct messages from everyone unless they are a friend.
- Do not scan. Direct messages will not be scanned for explicit content.
For most of the feature’s existence, Discord made the “My friends are nice” option the default setting for every new user on the app. This option only scanned incoming direct messages if the sender was not on the user’s friends list. For both the “Keep me safe” and “My friends are nice” settings, Discord represented that it would “[a]utomatically scan and delete direct messages you receive that contain explicit media content.” But this was not true. Despite its claims, Discord knew that not all explicit content was being detected or deleted.
Discord’s Design Decisions Exacerbated the Risk to Children on the App
Combined with Discord’s deception about its Safe Direct Messaging features, Discord’s other design choices worked together to virtually ensure that children were harmed or placed at risk of harm on its app. For example:
- By default, Discord allows users to exchange DMs if they belong to a common server. Therefore, a malicious user—adult or child—need only join a community server, which could contain over a million users, to exchange DMs with an unsuspecting child user.
- DMs among “friends” are even more dangerous. Discord’s default settings not only allow any user to send a friend request to a child, they also then permit those users, once “friends,” to exchange totally unscanned DMs through the default “My friends are nice” setting. Children can receive and accept friend requests from users whom they do not know and with whom they have no connection, and then engage privately on the platform without any oversight—all by design.
- Users may also create multiple accounts to hide their activities and circumvent being banned from servers, or from facing other repercussions. And even if users are banned from a server, or from Discord itself, Discord’s design allows them to simply re-engage using a brand new, easily created account.
Discord Misrepresented That Users Under the Age of 13 Are Not Permitted to Create Accounts and Are Banned from Discord Upon Discovery
At all relevant times, Discord’s Terms of Service have stated that users must be “at least 13 years old and meet the minimum age required by the laws in [the users’] country.” To this day, however, Discord only requires individuals to enter their date of birth to establish their age when creating an account—nothing more. Discord does not require users to verify their age or identity in any other way. Simple verification measures could have prevented predators from creating false accounts and kept children under 13 off the app more effectively.
Nevertheless, Discord actively chose not to bolster its age verification process for years and has allowed children under the age of 13 to operate freely on the app, despite their vulnerability to sexual predators.
Simply put, Discord has promised parents safety while simultaneously making deliberate choices about its app’s design and default settings, including Safe Direct Messaging and age verification systems, that broke those promises. As a result of Discord’s decisions, thousands of users were misled into signing up, believing they or their children would be safe, when they were really anything but.
Discord knew its safety features and policies could not and did not protect its youthful user base, but refused to do better, the complaint alleges. In particular, Discord misled parents and kids about its safety settings for direct messages (“DMs”).
They’re trying to force facial ID verification for the USA next
They probably saw the news stories about them doing this in other countries and it made them fucking wet thinking about this happening in the US
I heard you can bypass it, but won’t be allowed to see NSFW?
Everybody is so quick to blame the parents in these situations. Maybe there is some truth to that, but people also need to reckon with the fact that kids (and adults) are being constantly inundated by Skinner box apps, and “platforms” full of engagement bait designed to be addictive and attractive as possible. All run by corporations with functionally no regard for the safety of their users.
Yeah, sure, if you’re giving advice to an individual parent, they should probably be keeping a closer eye on what their kids are doing.
But there are systemic problems here that can’t be fixed with individual action. By laying the blame solely at the feet of the parents here, you are in effect putting parents up against dozens of huge corporations, each with armies of expert advertisers, designers, and psychologists working to build these products. It’s hardly a fair fight.
There is no policy which discord could enact that could protect children because they’re not in the room with the child. What they want discord and other websites to do is impossible and not their job
Their policy allows minors to use the app; maybe it shouldn’t?
No policy in the world will stop minors from using the app. I can’t tell you how many “I am 18+” dialogues I clicked on when I was younger. You want access restrictions.
Do you want to scan your id every time you open the discord app?
changing the policy would be a place to start.
Empty words should never be considered as a start.
Showing intent is always the start
Changing what policy, and to what?
no minors
They already have that policy, as the article notes. The problem is, how do you enforce it? As the comment you replied to notes, without requiring an ID verification, anyone can say they’re any age.
At what point does it become the parents’ responsibility to monitor what their kids are doing online?
That policy is not for minors, since it permits teenagers.
It’s 100% the parents’ responsibility to ensure their kids use the internet safely.
Again with the clueless parents. Stop letting your children use chat apps unsupervised.
https://play.google.com/store/apps/details?id=com.discord
The rating here is Teen, and it tells you all over the page that you will be interacting with other users.
To my fellow millennial parents: remember the rules of the internet from when you were young; every one of them still applies today. It didn’t change just because a whole load of tech illiterates decided putting all their info on Facebook was a good idea.
Discord’s job is to provide reliable services, your job is to protect your kids. You think my kids can even download discord? Fuck no. Enjoy tux paint, little one.
Honestly, any parent claiming their child was harmed by an online service should be brought up on charges of neglect
They want everyone to parent their kids except themselves.
It’s always “think of the children” to these miserable control freaks. They’d force kids into a dungeon if they had their say in it.
It’s because they can’t be bothered to actually rear children. An iPad is not a replacement for you singing your child to sleep, or teaching them their ABCs, or reading to them. A tablet is a short attention grabber while you are sitting in a doctor’s lobby or driving somewhere. Restrict that usage to 4 hours a day at absolute max, raise it as they age. But they also won’t become tech literate using one of these. It obfuscates too much. So it’s really just sad all around.
Not very long ago, just before your time
Right before the towers fell, circa '99
This was catalogs, travel blogs, a chatroom or two
We set our sights and spent our nights waiting for you
You, insatiable you
Mommy let you use her iPad, you were barely two
And it did all the things we designed it to do
Now, look at you, ha, look at you
You, you! Unstoppable, watchable
Your time is now, your inside’s out, honey, how you grew
And if we stick together, who knows what we’ll do?
It was always the plan to put the world in your hand
Welcome to the internet, Bo Burnham
Apathy’s a tragedy and boredom is a crime!
Hell yeah I used to love tux paint as a kid
Are they going to sue churches as well?
Don’t be silly, they just don’t want competition.
For fuck’s sake.
looks like Discord is just going to continue to get worse.