Discord’s Most Disgusting Problem is Getting Worse…
By No Text To Speech
Summary
## Key takeaways
- **Stage 0: Hidden Predator Servers**: CSAM servers existed but were hidden on dark-web forums, requiring predators to actively seek them out, using Discord like LinkedIn for trading Mega links to massive folders of content. [00:41], [01:44]
- **Stage 1: Listing Sites Expose Risks**: Sites like Disboard and Discadia list servers with CSAM, easily found via Google searches, putting normal users at risk of stumbling onto predator servers due to poor moderation. [02:02], [03:35]
- **Stage 2: Teens Fuel CSAM Spread**: Discord "comm" servers of 14-17-year-olds pressure youth to collect and distribute images for social status, with teens selling nudes in normal servers, turning Discord into Amazon for predators. [04:36], [08:49]
- **Stage 4: Predator Sellers Advertise**: Predators use Discord accounts to openly sell 500 GB of CSAM videos for $200 via Mega links, with bios like "trading no limit mega links" remaining unbanned despite obvious intent. [09:15], [10:08]
- **Stage 5: Bots Invade Normal Servers**: Predators buy thousands of cheap accounts turned into bots with usernames like "teen lesbian CP stuff" and CSAM profile pics, spamming even Roblox kid servers and gay-pride Discords. [10:52], [15:50]
- **Discord's Failures Exposed by Data**: DSA data shows 38% of users in illicit servers had prior violations, 86% child-safety-related, with banned accounts' CSAM messages and media persisting, contradicting zero-tolerance claims. [17:05], [21:02]
Topics Covered
- Predators Use Discord as Networking Hub
- Listing Sites Expose CSAM Servers
- Comms Fuel Teen CSAM Trading
- Teen Sellers Advertise Everywhere
- Bots Spam CSAM in All Servers
Full Transcript
Discord has a massive CP problem, and it's getting significantly worse. In fact, things have escalated so quickly and gotten so bad that just being on a completely normal Discord server puts you at risk of being exposed to CSAM content. But to explain how we got here, I want to walk you through the timeline of events that led to this, where each stage gets progressively worse and worse until we end up where we're at now, where you can just join a random Discord server and potentially be exposed to CSAM content. It's not even a matter of if you see it, it's when you see it. It is that [ __ ] bad. But first, let's start off with stage zero: how things were at the beginning.
There were Discord servers that would share CSAM content, but these were hidden in very dark-web forums. You would basically have to be a predator to find these servers, and they would look a little something like this. Now, this is not exactly the server; to find those, you genuinely have to be a predator. But this is sort of what it was like. This is a Discord server called Mega Trading. Basically, predators on Discord will have a collection of CSAM content, and they will host it on a website called Mega. This is like Google Drive or Dropbox, but Mega is encrypted, and since it's encrypted, predators take advantage of that. You can see from this screenshot that this person sent a whole bunch of links, and inside those links are folders, 69 GB folders, 14 GB folders, that just contain files and files of videos and pictures of CSAM content. What predators will do is find servers like this to trade their Mega links. They are trying to expand their collection. These people are genuinely using Discord like LinkedIn for predators. "Here's what I learned about B2B sales by engaging with predators." I'm going to get sued for this joke. Now, just keep in mind these servers always existed, but they were hard to find. So for you, as a normal internet user, there was no risk of being exposed to this stuff. And that's how things were. But let's get into the first stage of this horrible descent into Discord's ever-growing CSAM problem.
There are websites where you can list your Discord server and advertise it, and people can go on these websites and look for Discord servers to join. One that everyone knows about is called Disboard. Now, Disboard used to be horrible. In fact, I Am Lucid, a YouTuber, had a video called Discord Has a CP Problem, which featured Disboard and a whole bunch of horrible listings. In fact, Disboard was still a problem after that video; a year later, even I covered it in my own video. Now, fortunately, Disboard isn't as bad anymore. I did find this one Discord server, which was called Linkland, and considering the icon is the Mega icon, you can put the pieces together: this was a Discord server that was sharing CSAM content. But good news, it got removed off Disboard. But Disboard isn't the only website where you can search for Discord servers. In fact, there's a website called Discadia. If you search for something as simple as "13-17", talking about an age range, you can see this is the first result on my screen.
"Gay Boys Club Gateway." And I quote: "A gay hangout server for you and your friends to hang out in" (air quotes) "and have fun. No girls. 13 to 17." "Definitely" not even spelled correctly; that's how you can tell they're teenagers. "18 plus." This is already bad, but it gets worse. There are two problems with this sort of thing. First, if you are just a normal Discord user and you go on a Discord listing website like Discadia, there is statistically a chance that you might find a Discord server that has CSAM content. But the second problem is the fact that these websites are indexed by Google. Now, since we're talking about literally CSAM content, I am going to change the example a little bit so I don't go to jail, but I'm going to search for "e-girl Discord server." And you can see that the first search result is Disboard, and if you keep scrolling down, there's top.gg and Discadia. These websites are indexed by Google. So all that means is that if a predator hears the name of a Discord server, like "Mega Trading 9,000 Gazillion," they could just type that into Google with "Discord" at the end of it and find the server. So at this point in the timeline, it's easier for predators to trade their CSAM collections, because it's easier for them to just Google search for servers. But for you, as a normal Discord user, if you're on one of these websites like Discadia, there is a chance that you can just be exposed to CSAM because the website has [ __ ] moderation. The first [ __ ] result that I found was illegal.
Now, if you thought that was bad, fair warning: no, it gets significantly worse. In fact, these next two stages that I'm going to talk about on the timeline aren't due to predators. These problems are escalated because of teenagers themselves. Teenagers have made the problem worse. And that's where I get to stage two: comm servers. Now, just to make it clear, this is a top-level summary and is not exhaustive at all. But if you don't know what the Discord comm is, let me give you an Urban Dictionary definition of it, because they do a better job than I can explain: "Comm is short for community. Comm refers to communities that revolve around cybercrime and doxxing. Comms are typically comprised of teenagers from the age ranges of 14 to 17 who would rather scam, SIM swap, fraud, and engage in destructive behaviors instead of getting a real j-o-b" (sorry, I should probably be censoring that; that's a slur) "or skill. They are primarily made up of script kiddies and aggregate on games like Roblox, Minecraft, and VRChat because real-life socialization is scary." Basically, these guys are chronically online losers. And that's coming from me; I am already a loser.
So these guys are really bad. But as I was researching the CSAM problem, there was an article by Childlight that talked about how the CSAM economy is growing, and I want to point out something they said about teenagers specifically. They said: "Worryingly, this culture of collection" (referring to predators) "is also appearing in the behavior of children with one another. In a joint project conducted in the United Kingdom, female youth described feeling pressured to send images, while male youth felt pressure to collect and distribute those images to achieve higher social value." So when you have these comm kiddos, or Discord e-gangsters, they feel insecure about their social status, so they will distribute these images to achieve a higher social value. And that is deep-rooted in the Discord comm. Because if you go on a Discord comm server, 50% of it is people talking about their toxic relationships and leaking CSAM content, and the other 50% is these dudes talking about how they're so rich, "[ __ ] you, you're a poron," and they're going to go, and I quote, "band for band." They have all this fraud money, and they'll usually spend it on luxury watches and goods to, again, achieve higher social value. But with that fraud money, they will also pay e-girls for nude photos to, again, get higher social value. And since teenage youth feel pressure to collect and distribute the images, predators will pretend to be teenagers to collect those images, to again increase their collection. In fact, I even made a video about how one of these comm servers was owned by a predator. Like, this actually happens. I'm not talking out of my ass. But these comm servers have also resulted in another problem: sellers.
With those e-gangster servers, you had those e-guys giving e-money to their e-girls to be their e-girlfriend and send them e-nude photos. And what ended up happening is that teenage girls realized that this could be a job. In fact, here's a screenshot of someone's Discord profile. I have censored everything for obvious reasons, but you can see their pronouns are she/her, and in their about me, they say "Hi," "14," an invite link to a Discord server, and "Only DM if buyer or friend." What girls have realized is that instead of having their e-relationship with their comm boy, they can just sell photos to anyone that pays for it. And I want to point out that you might think the problem is just girls, but a viewer sent me this person here, and the name behind this was a dude's name. They said, "I'm 15 and mostly a bottom. I sell pictures. I'll basically do almost anything intercourse-DLC-wise, but I do have some exceptions." And I don't want to read this. And just when I thought I hated myself by reading this, he continues on by saying, "Please send me unsolicited pics and threats." Sometimes I genuinely hate myself and the internet and just what everything has become. You have people fighting to fix the problem, and then you have teenagers who will sell photos of themselves, making the problem worse, fueling the whole entire problem, because they want easy money. And I have had some people that have sold pictures in the past reach out to me and explain how they deeply regretted it. Now, the problem is that these girls or boys that are selling photos go everywhere on Discord. They are like a lethal combo of a car salesman being slimy and a Jehovah's Witness going to every doorstep on planet Earth. I have seen, on more than one occasion, these e-girl sellers putting an advertisement in a Discord channel. Like, straight up, it could be about Skibidi Toilet or something, and they'll drop, "I'm selling photos of myself. Check my bio." And that's how this problem is getting worse. Taking a step back: before, to see any sort of CSAM content, you had to look for it yourself. Now it's starting to be advertised to you. And for the Discord predators, this is like Amazon for them. They can just shop around and buy whatever the hell they want.
But the predators have taken things a step further. Just like how you have those seller Discord accounts, predators have done the exact same thing. You can see that this account here says, "Hit me up," and when you click on the Discord account, their bio says, "trading no limit mega links, 24/7 mega link trade, DMs always open." Please don't do this, but my viewer for some reason decided to DM them. "What exactly are you selling?" This person said that they're just trading Megas for people. "What's a Mega?" "It's just files with videos and pics you can share with others." Uh-huh. "And if, just for science, I'd want to buy one, how much would that cost?" "$200 for 500 gigabytes worth of videos." Now, you might be thinking, "No, No Text To Speech, no, no, no. Hear me out. This person is just selling OnlyFans leaks, because those are also on Mega." Wrong. Completely wrong, because my viewer asks, "Like, younger than 18?" "Yep. It's all a mix, to be honest." This is an account that is selling CSAM content, and guess what? Their account hasn't been banned. It still exists on Discord. It is actually [ __ ] disgusting.
So, taking a step back: if the last stage was like Amazon for Discord predators, where they can shop around and get what they want, this stage is like late-stage capitalism, because now you're having advertisements shoved down your throat by actual [ __ ] predators trying to promote and sell you CSAM content. So predators don't even need to look for this stuff anymore. It just shows up on their doorstep. But that also means that for you, as a normal human being, you are being exposed to this stuff just by being on Discord. Now, if you were hoping that was as bad as it gets: no, it gets so much worse. And this is genuinely what made me make this whole video in the first place, because predators are making and buying thousands of Discord accounts to spread and sell CSAM all across Discord.
across Discord. Chances are, if you've been on Discord long enough, you have seen an account like this. This
account's username is teen lesbian CP stuff. You would think, you know,
stuff. You would think, you know, Discord, aing ginormous tech platform, couldn't have an AI that would look at this username and be like, you know what, maybe this is illegal and we should ban this account. But no, it
doesn't stop there because these accounts also in their bio say DM me for all kinds. They talk about Zangi, which
all kinds. They talk about Zangi, which is a private chatting service. They have
a group channel, Teens Link, that word again, mega links, mom and son links, and they advertise their Zangi group chat or their Signal accounts. So these
are just bots that are advertising their Signal, their Telegram, their Zangy, all these private chatting platforms where they sell and trade Cam content. But
what Predators will do on these bots is they will go into those minor selling servers. And these bots will go in and
servers. And these bots will go in and advertise that they're also selling content because the bot fits right in.
And the problem is that if this account gets banned, these predators have thousands of accounts that will replace it. You can like buy a thousand Discord
it. You can like buy a thousand Discord accounts for like $3 or something. I
don't know. I have a screenshot of how much it costs that I just found. But
predators will buy these accounts and turn them into CSAM spam accounts. And
you'd think that'd be the end of everything. You know, video done.
everything. You know, video done.
Discord is letting predators make bot accounts that promote CSAM content. This is already [ __ ] up and disgusting, but no, it gets worse. Here's a Discord server which is about reselling: they're buying things on the internet for cheap and reselling them for a higher price. But since the server has the word "sell" in it, if you go down to their VIP-wins channel, you will see that there are bots called "mega Dropbox link," where you can literally see, on October 22nd, 2025 (this is happening right now), they're trying to ping everyone, advertising the stuff that they have in their Mega or Dropbox. And it's not just this account. If I go to the socials channel, you can see a deleted user started a thread, "mega Dropbox links." This server is infested with these bots. Now, their account is deleted. It is gone. They have been banned off Discord for selling CSAM content. Yet their message on Discord still [ __ ] exists. Discord: they are still advertising CSAM content. You banning them has done absolutely [ __ ] nothing. Why do I, an incompetent [ __ ] [ __ ] on the internet, need to point out the most obvious [ __ ] that anyone can [ __ ] see? Why do I have to explain to a massive company that this is not enough? And by the way, I'm not done with my [ __ ] rant, because guess what? It gets even worse.
These bots don't stop at invading any server that talks about selling. In fact, they have gone to NSFW Discord servers, and this is a giant problem on Discord. When you join one of these Discord servers, you have chats that look like this. You have an obvious scam account saying, "Any boy want to chat?" Then you have "CP [blank] teen link seller," and this account is just spamming "DM me for any kinds of hot stuff, channels and more," over and over again. And this is in an NSFW channel. Oh yeah, what does it say on my script? That's right: it gets worse again, because these bots don't just invade selling servers, and they don't just invade NSFW servers. No, they've gone even further. It has gotten worse. They will use their bot accounts to join just a gay pride Discord server. And you can see that, again, Discord is not doing their job here, because the username is "mega teen seller." This account is trying to sell CP. Oh, wait. Hold on. What's that? Oh, yeah. The script again says it gets worse, because not only do these accounts have very obvious names showing that they're selling CSAM content; no, you know what, why [ __ ] stop there? Because now what's happening is that these accounts, like this person, "rcpr mega link seller," have a profile picture of CSAM content. So what that means is that when they join that gay pride Discord server, if you click on their profile, you are now being exposed to CSAM content, because Discord can't [ __ ] moderate such an obvious name. You might be thinking, you know what, there's "rgay rcpr," you know, maybe we're talking about CPR. There's this account here that literally says "selling gay CP," and it has CSAM in the photo, and Discord's like, "boom, we're letting that on our platform to scar a whole bunch of people." Then it gets banned, like, maybe a day later. Oh yeah, what does it say on my script?
It gets even worse. What's happening now is that these predators are going on Discadia and joining any Discord server that's advertised on Discadia. And by the way, it happens on Disboard too: if you post your Discord server on one of those listing websites, there's a good chance that you will have one of these CSAM bots join it. You can see this "Mega Link" account, and looking at the account, it is the same thing. And by the way, this viewer didn't just have that happen once; they had it happen again. And just to make things even worse, this Discord server is a fan community server for the Roblox game Build a Boat for Treasure. This is a Discord server for a [ __ ] Roblox game, a game meant for kids and teenagers, and it's being joined by a Discord bot that is promoting CSAM content. And remember, some of these bots actually have [ __ ] CSAM as their profile picture. So these bots are joining Roblox game servers and exposing kids to CSAM content. In fact, when I was making this video, I had over 50 different viewers telling me about this problem, and I have cut it down to be as short as humanly possible to get my message across. This is everywhere. It's not a matter of if you'll find one of these accounts, it's when you'll find one of these accounts. Because here's the problem: they will spam your chats advertising this stuff. They will also friend-request people on your Discord server and spam their DMs advertising CSAM content. So before, on Discord, you would never be exposed to CSAM content unless you were a predator trying to find it. But as we have gone through the stages of this problem, if you join any Discord server, you can be exposed to CSAM content, because these bots are everywhere, infiltrating every corner of Discord. And again, a lot of them have CSAM as their profile picture. And that is how Discord's CSAM problem has gotten significantly worse.
But what is Discord doing about this? To explain how broken this system is, which I've already shown you plenty of, I want to take a look at this article called "Hiding in Plain Sight: Exposing Discord's Thriving Exploitation Networks." I'll have it linked in the description; it is well worth your read. The reason I want to point this out is that this article is based on numbers, actual [ __ ] data. It's not just me talking anecdotally about what my viewers and I have experienced. These are actual numbers from Discord's Digital Services Act disclosures, because Discord now needs to disclose why they banned people and which accounts they banned. And in this article, I want to point out that on January 31st, 2024, before the US Senate, then-CEO of Discord Jason Citron expressed his confidence in Discord's safety practices. In the very same appearance, he had given written testimony where, in his words, he stated: "No matter how Discord identifies violative content, users who upload child material, CSAM, to Discord are permanently banned from the service."
Except, when you look through the DSA data, the analysis found one Discord account that got punished for "protection of minors: illegal." Like, it straight up says [ __ ] "illegal." You can see this happened in July 2023, April 2023, April 2022. They have been punished multiple times. Actually, no, not multiple times. Sorry. They have been punished 248 times. 248 times they have been punished. And Discord is saying that they permanently ban people. In fact, in the analysis, through a sample of 63 illicit servers conducted in August of 2025 (so pretty [ __ ] recent), they found that out of the 15,000 unique users identified across the 63 servers, 5,695, approximately 38%, had at least one prior violation recorded in the DSA logs. Now, the problem is that these violations could be things like cyberbullying rather than child safety, except they actually went through the data: out of these 5,695 repeat offenders, it was found that they were responsible for a total of 22,000 documented violations. And out of those 22,000 violations, 19,339, or approximately 86%, were explicitly categorized by Discord's own system under the statement category "protection of minors." And to really drive the point home, the most prolific offenders were accounts dedicated to distributing links to external CSAM platforms like Telegram, what I've shown you, and a significant number of these accounts had accumulated over 100 separate child safety violations. Yet they remained long enough to be scraped in this 2025 data set. How can Discord say "we have a zero-tolerance policy" when their own data, which they are legally forced to give to the EU, does not even prove that point?
And to make things even better, by the way (this is [ __ ] crazy), during this investigation, Discord's DSA reports provided in-depth, granular data where you could look at people's user IDs and figure all this stuff out. But after August 2025, they completely anonymized everything, so that people cannot go through this data and realize how [ __ ] up Discord is. Now, further in the article, they talk about systemic flaws in content removal and persistence. Translation: when Discord bans and removes stuff, it sucks ass. In fact, I've shown you in this video that deleted user that still had their message up promoting their Zangi and their Telegram, which promoted CSAM content. And in this article's analysis of 100 illicit Discord servers, they found 462,000 messages. Out of those messages, just under 20,000, approximately 4.3%, were identified as orphaned, meaning the user account that sent the message has been deleted, either by the user or by Discord for being banned. And out of those 20,000 messages, they found that 507 contained media attachments: images, videos, other files. And when they did a manual review of those 507 media messages, they found over 300 messages depicting either CSAM or acts of zoophilia. So, like I showed you with that one message: even when the Discord account gets banned, the message remains. And in this analysis, it wasn't just messages that were being left up. It was actual photos and videos of CSAM or zoophilic content. Teenagers can still be exposed to this stuff. How is this okay?
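The percentages quoted from the article can be sanity-checked with a few lines of Python. This is a minimal sketch using the figures exactly as cited above; note that the "22,000" violation total and the "just under 20,000" orphan count are rounded in the transcript, so those ratios only approximate the article's quoted percentages:

```python
def pct(part: int, whole: int) -> float:
    """Return part as a percentage of whole."""
    return 100.0 * part / whole

# 5,695 of 15,000 unique users had a prior DSA violation
print(f"{pct(5_695, 15_000):.1f}%")    # ~38%, matching the quoted figure

# 19,339 of ~22,000 violations were "protection of minors".
# Against the rounded total this comes out near 88%, so the
# article's 86% implies an unrounded total closer to 22,500.
print(f"{pct(19_339, 22_000):.1f}%")

# ~20,000 of 462,000 messages were orphaned
print(f"{pct(20_000, 462_000):.1f}%")  # ~4.3%, matching the quoted figure
```

In other words, the 38% and 4.3% figures reproduce exactly from the cited counts, and the 86% figure is consistent with them once the rounding of the violation total is taken into account.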
And before I go on, like, a 30-minute rant and make this video even longer than it is: the biggest takeaway that everyone needs to understand is that you, as a normal Discord user, can be exposed to these CSAM bots that have profile pictures of actual CSAM. And the thing is, when these Discord accounts get banned, all the predators have to do is make another Discord account. And for the people running these accounts, when they get banned, their messages might stay, which to them is a win: they're already advertising, and it encourages them to make another Discord account and continue making bots to spam everywhere across Discord. And as I've shown you, as time continues, this problem gets worse and worse. But for now, I'm going to go eat ice cream for therapy. Bye-bye. Love you. Wait.