Edit - This is a post to the meta group of Blåhaj Lemmy. It is not intended for the entire lemmyverse. If you are not on Blåhaj Lemmy and plan on dropping in to offer your opinion on how we are doing things in a way you don’t agree with, your post will be removed.
==
A user on our instance reported a post on lemmynsfw as CSAM. Upon seeing the post, I looked at the community it was part of, and immediately purged all traces of that community from our instance.
I approached the admins of lemmynsfw and they assured me that the models with content in the community were all verified as being over 18. The fact that the community is explicitly focused on making the models appear as if they’re not 18 was fine with them. The fact that both myself and a member of this instance assumed it was CSAM was fine with them. I was in fact told that I was body shaming.
I’m sorry for the lack of warning, but a community skirting the line trying to look like CSAM isn’t a line I’m willing to walk. I have defederated lemmynsfw and won’t be reinstating it whilst that community is active.
paris ( @paris@lemmy.blahaj.zone ) English114•2 years agoFor anyone wondering, this is lemmynsfw’s take on the situation.
On a personal level, the vibes are off. Their defense seems really defensive and immediately moves to reframe the situation as body shaming. There’s a difference between an adult who looks underage posting porn of themselves and a community dedicated to porn of adults who look underage. Reducing the latter down to body shaming seems like unfair framing to me.
Thorny_Thicket ( @Thorny_Thicket@sopuli.xyz ) English55•2 years agoDid you check the community in question? I’m quite surprised to hear one could think that’s CSAM. To me it looks just like your typical low-effort OnlyFans content. None of the models even looked “barely legal” but more like well over 20 in most cases.
noisehound ( @noisehound@lemmy.blahaj.zone ) English38•2 years agoThe community in question listed “child-like” in their sidebar until after this defederation. Gross.
Gormadt ( @Gormadt@lemmy.blahaj.zone ) English4•2 years agoWhen I checked their communities most were basically empty?
And I didn’t see a community that fits that description.
Edit: I did try to enable nsfw content and tried from other accounts I have on other instances.
Thorny_Thicket ( @Thorny_Thicket@sopuli.xyz ) English16•2 years agoYour instance just defederated from lemmynsfw. You can’t see any new content there anymore with that account.
Gormadt ( @Gormadt@lemmy.blahaj.zone ) English8•2 years agoI tried in private browser mode and from accounts I have on lemmy.ml and Beehaw
I still didn’t see anything?
IDK what’s up
magnetosphere ( @magnetosphere@kbin.social ) 26•2 years agoYeah. I don’t think they’re sincerely trying to “be inclusive”. I think they’re just trying to misuse progressive concepts to their own advantage.
They know full well what they’re doing. The fact that it isn’t legally CP is just a technicality.
ocasta ( @ocasta@lemmy.blahaj.zone ) 77•2 years agoI think it’s really strange to call that a technicality. Adults with babyfaces and braces doing porn (which appears to be what this was about, as far as I can tell) is worlds apart from children being abused. Calling that a “technicality” is like saying the difference between a slasher movie and a snuff film is a “technicality.” People who watch slasher movies aren’t actually wanting to see snuff films deep down inside. And people who find adults with babyfaces attractive aren’t actually lusting after kids deep down inside.
LegendofDragoon ( @LegendofDragoon@kbin.social ) 23•2 years agoThey literally said in the post no one looks too young to be lusted after. Major red flag right there.
NuMetalAlchemist ( @NuMetalAlchemist@lemmy.blahaj.zone ) 4•2 years agoHEY ADA, THIS GUY ISNT FROM BLAHAJ, WHY ISNT HE CENSORED? I THOUGHT THIS WAS FOR BLAHAJ INPUT ONLY? WHY THE DOUBLE STANDARD, ADA?WHY THE HYPOCRISY? I MEAN, WE ALREADY KNOW, BUT I WANNA SEE YOU SAY IT!
yessikg ( @yessikg@lemmy.blahaj.zone ) 4•2 years agoOh please just shut up
NuMetalAlchemist ( @NuMetalAlchemist@lemmy.blahaj.zone ) 3•2 years agoNo. 🤠
copygirl ( @copygirl@lemmy.blahaj.zone ) English90•2 years agoI think both instance admins have a valid stance on the matter. lemmynsfw appears to take reports very seriously and if necessary does age verification of questionable posts, something that likely takes a lot of time and effort. Blahaj Lemmy doesn’t like the idea of a community that’s dedicated to “adults that look or dress child-like”. While I understand the immediate (and perhaps somewhat reactionary) concern that might raise, is this concern based in fact, or in emotion?
Personally I’m in the camp of “let consenting adults do adult things”, whether that involves fetishes that are typically thought of as gross, dressing up in clothes or doing activities typically associated with younger ages, or simply having a body that appears underage to the average viewer. As the lemmynsfw admin mentioned, such persons have the right to lust and be lusted after, too. That’s why, as a society, we decided to draw the line at 18 years old, right?
I believe the concern is not that such content is not supposed to exist or be shared, but rather that it’s collected within a community. And I think the assumption here is that it makes it easy for “certain people” to find this content. But if it is in fact legal, and well moderated, then is there a problem? I don’t believe there is evidence that seeing such content could change your sexual preferences. On the other hand, saying such communities should not exist could send the wrong message, along the lines of “this is weird and should not exist”, which might be what was meant by “body shaming”.
I’m trying to make sense of the situation here and possibly try to deescalate things, as I do believe lemmynsfw’s approach to moderation otherwise appears to be very much compatible with Blahaj Lemmy’s. Is there a potential future where this decision is reconsidered? Is there some sort of middle ground where admins from both instances could meet and come to an understanding?
‘Leigh 🏳️⚧️ ( @leigh@lemmy.blahaj.zone ) English30•2 years agois this concern based in fact, or emotion?
Ada was clear in another comment thread that yes, emotion was absolutely involved in her decision. That isn’t a bad thing. Why is there a social attitude that decision-making is only valid if it’s cold and unfeeling?
Personally I’m in the camp of “let consenting adults do adult things”
Me too. I don’t think anyone is arguing against that. Anyone can still access LemmyNSFW’s content elsewhere, Blahaj Zone simply isn’t going to relay it anymore because some of it is incompatible with Ada’s goals in nurturing this community.
But if it is in fact legal, and well moderated, then is there a problem?
Yes. Legality has nothing to do with acceptability. This instance already bans lots of content that doesn’t actually violate any laws. It’s a judgment call.
copygirl ( @copygirl@lemmy.blahaj.zone ) English16•2 years agoThe reason I brought up emotion in my reply was because I felt that the lemmynsfw admins were able to explain their decision quite reasonably and seemed to be open to conversation, whereas Ada was set on one goal and, upon finding disagreement, wasn’t in the right mindset to continue a constructive conversation. Which, to be fair, due to the nature of the content, is understandable.
If the content that the Blahaj Lemmy admins are concerned about is limited to certain communities, and part of the issue is the concentration of content in said communities in the first place (at least, as I speculated in my original reply), then I don’t quite understand why blocking these communities only isn’t something that was considered, rather than defederating the entire instance. I do respect Blahaj Lemmy’s decision not to want to host such content. Or is there some technical limitation that I’m not aware of?
I don’t quite understand why blocking these communities only isn’t something that was considered, rather than defederating the entire instance
Because I am not ok federating with a space that is ok with content that looks like CSAM. “It’s technically legal” isn’t sufficient in this case.
copygirl ( @copygirl@lemmy.blahaj.zone ) English19•2 years agoBut whether it’s technically legal is exactly what does or doesn’t make it CSAM. “Looking like” is going to be highly subjective, and I don’t understand how the admins of the other instance are supposed to handle reports, other than to verify whether or not it actually is the case.
Are petite looking people not supposed to make explicit content while dressing up cute? Should a trans man not share explicit pictures of himself, because he might look like an underage boy? Do we stop at porn that gives the appearance of someone being young? What about incest or ageplay? Like, what if you or someone else was made sufficiently uncomfortable by some other kind of porn? How do you decide what is and isn’t okay? How do you avoid bias? What would you be telling a model when they ask why you removed their content?
Apologies for going on with this when I’m sure you’re already sick of dealing with this. I had just felt like some of the points I brought up (like in my original reply) were entirely overlooked. Putting effort into an (attempted) thought-out reply doesn’t mean I get to receive a response I was hoping for, but I was at least hoping for something you hadn’t already said elsewhere.
but I was at least hoping for something you hadn’t already said elsewhere.
There is no more to this. I don’t have a list of endless reasons.
The reason is that it looks like CSAM and appeals to folk looking for CSAM. I’m a CSA survivor myself. A space that appeals to folk looking for CSAM isn’t a community that I’m willing to share space with.
copygirl ( @copygirl@lemmy.blahaj.zone ) English3•2 years agoI guess the core of the disagreement is that one side values safety more highly while the other values expression? It could be argued that moderation can take care of anyone stepping over the line. People can be unwelcome creeps regardless of what they’re into, and would be attracted to other dedicated communities anyway. I imagine someone could have the same concerns you do for similar reasons when it comes to consensual non-consent roleplay. Interestingly enough, this actually is temporarily restricted on lemmynsfw, which could be because an appropriate moderation policy has not yet been agreed upon.
chumbalumber ( @chumbalumber@lemmy.blahaj.zone ) English14•2 years agoReminds me of a lot of the debates around kink at pride/ddlg kink stuff. The latter is really not my thing and makes me uncomfortable, but I recognise that that’s a personal thing between me and my partners that I can’t, and shouldn’t, police among others.
There are also ethical debates to be had on porn in places like Lemmy/pornhub/etc. – we can’t know that the person has consented to being posted, or that they have recourse to get it taken down and stop it being spread if they do not.
Then there’s the realpolitik of, regardless of ethics, whether it’s better to have porn of this type in visible, well moderated communities, or whether it’s better to try to close off ethically dubious posting.
It’s one I don’t really have squared off in my head quite yet. Similarly with kink at pride; I’ve read about the historic importance of kinksters and recognise that, but at the same time I want there to be a space where queer kids can be involved with pride without being exposed to kink. Is that just prudish social norms talking? Idk; I’m still working it through.
Mewtwo ( @Mewtwo@lemmy.blahaj.zone ) English81•2 years agoFor the people like me that don’t know the term: CSAM is Child Sexual Abuse Materials. It’s the term used instead of CP because “pornography” implies consent and is more commonly associated with material made for pleasure.
As for the porn that uses people that look under age, it’s no different than the anime children that are thousands of years old. It doesn’t matter how old they are, they look like children and it’s gross.
lucja808 ( @lucja808@lemmy.blahaj.zone ) English19•2 years agoThe world is messed up. I feel like advertising any adult material as “barely legal” should be banned too. It skirts the boundary too close. Not as close as the aforementioned thousand year old child body but it feels almost as bad imo.
emidio ( @emidio@lemmy.blahaj.zone ) English17•2 years agoI agree with you but not on the last point. There is a difference, since they are real people, adults, who consent to being sexually attractive and arousing. I am not attracted to young looking bodies, but that’s a notable difference to me. I also don’t know how I feel about a community (in a broader sense than a Lemmy comm) focusing on and fetishizing young looking adults (I do know that it disturbs me, but I want to talk about it society-wise), but I understand that some people are attracted to young looking and/or juvenile bodies, and I feel like adults consenting to answer those desires is better than CSAM
somedaysoon ( @somedaysoon@midwest.social ) English9•2 years agoAnd that’s where the body shaming comes in, you’re literally telling this 20-something that their body is gross and no one should find them attractive. How would it make you feel if someone said that about you?
‘Leigh 🏳️⚧️ ( @leigh@lemmy.blahaj.zone ) English70•2 years agoI enjoy NSFW content, but I certainly don’t want to stumble into “how close to CSAM can we get while staying technically legal?” content. And the bullshit lie about this being “body shaming” pisses me off.
This admin decision obviously isn’t up for a vote, but it’s just so obviously the right call. Thank you Ada for handling this, and I’m sorry (in the Canadian way, not the guilty way 😉🇨🇦) you had to see any of that.
LegendofDragoon ( @LegendofDragoon@kbin.social ) 66•2 years agoI’m not on this instance, but thank you for being so swift and resolute in your actions. Happy to see all due caution is being taken. Not so happy that such a community made its way here to the fediverse. Hopefully I won’t see any of it while doomscrolling.
kardum ( @kardum@lemmy.blahaj.zone ) English55•2 years agothe same community (adorableporn) is also on reddit btw with 2.2m subscribers.
i have no grand moral opinion on this type of content. for me it is the same as femboy content for example, where people also push for a youthful, girly aesthetic.
as long as the content is made by consenting verified adults, i don’t care.
it’s like adults cosplaying with japanese school uniforms or calling your partner “mommy” or “daddy”.
probably not the best move in terms of sexual morals for sure, in the grand scheme of things tho this is just how people express their sexuality i guess.
it’s like adults cosplaying with japanese school uniforms or calling your partner “mommy” or “daddy”.
No, it’s not, because no one mistakes those things for actual underage children
Jerkface (any/all) ( @jerkface@lemmy.ca ) English30•2 years agoNo, it’s not, because no one mistakes those things for actual underage children
That’s not what happened here. No one would mistake the image in question. You say there were other images; the admins there give a story that contradicts yours. They say there were no such images. Didn’t see those images removed in the modlog, either.
Could it possibly be that someone has blown things out of proportion and got emotional?
NuMetalAlchemist ( @NuMetalAlchemist@lemmy.blahaj.zone ) English18•2 years agoThat and the lack of humility afterwards make for a poor leader.
kardum ( @kardum@lemmy.blahaj.zone ) English30•2 years agoi had no problem distinguishing the models on the community from children.
maybe it’s more difficult in some cases without looking for the onlyfans link or sth similar of the model somewhere in the post, but that’s just human anatomy.
that’s why the guy at the gas station asks for my ID card, because it is not always super clear. but apparently clear enough for reddit admins and PR people from ad companies.
i agree playing into the innocent baby aspect is probably not great for sexual morals and i wouldn’t recommend this comm to a local priest or a nun, but this type of content thrives on pretty much every mainstream platform in some shape or form.
i get it, if this instance wants to be sexually pure and removed from evil carnal desires tho. that’s kind of cool too for sure.
i had no problem distinguishing the models on the community from children.
You didn’t see the content I saw. Content that was reported as CSAM by someone on this instance, who also thought it was CSAM.
maybe it’s more difficult in some cases without looking for the onlyfans link or sth similar of the model somewhere in the post, but that’s just human anatomy.
Again, a group that is focused on models where that is the only way you can tell that they’re not underage is a group that is focused on appealing to people who want underage models. That is a hard no.
Spin it how you like, but I am not going to be allowing material that is easily mistaken for CSAM.
kardum ( @kardum@lemmy.blahaj.zone ) English15•2 years agoI thought about this some more and I can feel a lot more sympathy for your decision now.
It must be horrible to get a user report about CSAM and then see a picture, which could be really CSAM on first glance.
Even if every user report was wrong from now until infinity, that initial CSAM suspicion, because of the false user report, probably makes moderating just a soul-crushing activity.
It is great if admins from other instances are willing to handle with these horror reports, just to give their users a bigger platform, but this service is not something that can be taken for granted.
I’m sorry for coming across as ignorant, I just did not consider your perspective that much really.
NuMetalAlchemist ( @NuMetalAlchemist@lemmy.blahaj.zone ) English18•2 years ago“Even if every user report was wrong from now until infinity, that initial CSAM suspicion, because of the false user report, probably makes moderating just a soul-crushing activity.”
Then they shouldn’t be doing it. If seeing something that looks even slightly off-putting causes this level of over-reaction, Ada doesn’t need to be moderating a community for marginalized/at-risk people. I myself am a CSA survivor, and seeing my trauma being equated to some legal adults playing pretend is fuckin’ bullshit. Seeing my trauma being equated to drawn pictures is fuckin’ bullshit. My trauma being equated to AI generated shit is fuckin’ bullshit. I’ll tell you one thing, as a CSA kid, one thing I cannot stand is someone making decisions on my behalf. To protect me. Fuck you, I’ll fuckin bite anyone that tries to take away my free agency again.
I myself am a CSA survivor
FYI, so am I
NuMetalAlchemist ( @NuMetalAlchemist@lemmy.blahaj.zone ) English13•2 years agoCool, welcome to the real world where one size does not fit all. We handle our trauma differently. But I don’t subject others to my hangups. I don’t use it as a cudgel to squash dissent. Your trauma is not your fault, but it is your responsibility, not ours, to deal with.
gh0stcassette ( @gh0stcassette@lemmy.blahaj.zone ) English4•2 years agoI totally get that and definitely don’t blame Ada for defederating (although I don’t think it’s likely it was actually CSAM, nor that the community it was on is Inherently Problematic, as long as everyone in the posts is 18+, people’s kinks are none of my business).
The thing I don’t get is why reports made by blahaj users on lemmynsfw communities would go to the blahaj moderators at all. That seems like a design flaw in Lemmy, instance mods have no power to moderate content on off-instance communities, so why would they be notified of reports? That seems like it would clutter mod-logs for no reason and cause unnecessary drama (as happened here). Like if every subreddit post report immediately went to the Site Admins, that would be Terrible.
Though if Lemmy really is built like this for whatever reason, I would probably have done the same thing. I wouldn’t want to have to be Subjected to everything that could be reported on an NSFW instance, there’s probably some Heinous Shit that gets posted at least Occasionally, and I wouldn’t want to see all of it either. I just think it’s Really Stupid that lemmy is built this way, we need better moderation tools
The thing I don’t get is why reports made by blahaj users on lemmynsfw communities would go to the blahaj moderators at all.
Reports go to the admins on the instance the reporter is from, to the admins on the instance the reported account is from and to the admins of the instance the community the post was made to is from. The report also goes to the moderators of the community that the content was posted to.
Each instance only gets a single report, however many of those boxes it ticks, and that report can be dealt with by admins or moderators.
However, the results federate differently based on who does the action. So for example, me deleting content from a lemmynsfw community doesn’t federate. It just removes it from my instance. However, a moderator or an admin from lemmynsfw removing lemmynsfw content will federate out.
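For anyone trying to follow the fan-out Ada describes, it amounts to a set union over the three roles a report can touch, with deduplication so each instance is only notified once. A minimal sketch in Python — the function name and instance-name arguments are illustrative, not Lemmy’s actual API:

```python
def report_recipients(reporter_instance, reported_account_instance, community_instance):
    """Return the set of instances that receive a copy of a report.

    Per the explanation above: the reporter's home instance, the reported
    account's instance, and the community's host instance each get the
    report, but an instance that fills several of those roles is only
    notified once. Names are hypothetical examples, not real API values.
    """
    return {reporter_instance, reported_account_instance, community_instance}


# A report filed from blahaj about a post by a lemmynsfw account in a
# lemmynsfw community: two distinct instances are notified, not three.
recipients = report_recipients(
    "lemmy.blahaj.zone",  # reporter's instance
    "lemmynsfw.com",      # reported account's instance
    "lemmynsfw.com",      # community's instance
)
# → {"lemmy.blahaj.zone", "lemmynsfw.com"}
```

This also illustrates why the blahaj admins saw the report at all: the reporter’s home instance is always one of the recipients, even when it hosts neither the account nor the community.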
NuMetalAlchemist ( @NuMetalAlchemist@lemmy.blahaj.zone ) English10•2 years ago“You didn’t see the content I saw.”
Probably because it was removed for being against the rules?
kardum ( @kardum@lemmy.blahaj.zone ) English9•2 years agoContext always matters. I always check if adult material has actually been made by consenting adults. I would feel sick, if not enough information had been provided for that, but I at least have never encountered CSAM fortunately.
I had classmates in high school with balding or even graying hair and full beards. Some adults older than me look younger than my nephews. Revenge porn and creepshots are common (or at least were; I’m not on platforms where these are popular).
Without context, porn will always be a morally grey area. Even commercialized hyper-capitalist porn is still an intimate affair.
That’s why I didn’t use pornhub for example, before every user had to verify themselves before posting. Before that I only read erotica or looked at suggestive drawings.
I understand your perspective tho. You hardly get paid to keep this instance running, and looking at pictures that without context could be CSAM could make this volunteer work very mentally taxing. This is how NSFW works tho.
Without context, any pornographic material featuring real humans could in truth be some piece of evidence for a horrible crime.
Context always matters. I always check if adult material has actually been made by consenting adults. I would feel sick, if not enough information had been provided for that, but I at least have never encountered CSAM fortunately.
If I can’t tell, if I have to look something up because the people I’m looking at look like they’re underage, then it doesn’t matter what the answer is, because the issue is that it looks like CSAM even if it’s not. And a community designed in a way that attracts people looking for underage content is not a space I’m willing to federate with.
NuMetalAlchemist ( @NuMetalAlchemist@lemmy.blahaj.zone ) English6•2 years agoIsn’t it kind of shitty to tell an adult woman she can never be attractive or sexy because she looks too young? Do you truly believe that said person should never be allowed to find love, because it’s creepy? Is she supposed to just give up because you think her body is icky?
I’ve covered this many times already.
The issue isn’t individuals that happen to look younger than they are. The issue is with a community gathering sexual content of people that appear to be children.
The community that initiated this isn’t even the worst offender on lemmynsfw. There is at least one other that is explicitly focused on this.
Urist ( @urist@lemmy.blahaj.zone ) English49•2 years agoI get the feeling there’s going to be a lot of comments here from people who disagree.
This is not your instance. This is not even my instance; I am just signed up here (and thank you Ada, I like it here and I approve of this decision. CSAM-like porn is icky). There is no need to focus on the morality of sharing porn that ends up being viewed as CSAM. Hosting porn involves legal risk, and federating with an instance that has porn on it means that eventually you will host porn images. If you have your account here and you don’t like this choice, consider moving instances or hosting your own.
Not only that, does anyone remember /r/jailbait on reddit? They did not do anything about that subreddit because the images were “legal”, but the userbase they attracted began sharing real CSAM in the DMs. To be clear: I don’t know what community we’re talking about (lemmynsfw does not appear to have a jailbait community, I did not look hard) but you do not want the sort of people around that this attracts.
edit: remove unintentional link
katy ✨ ( @cupcakezealot@lemmy.blahaj.zone ) English41•2 years agoLet’s be honest; the only reason Reddit ever did anything with that subreddit is because CNN brought bad PR to them.
Urist ( @urist@lemmy.blahaj.zone ) English9•2 years agoI totally forgot about this. It’s so sad that you’re right. Sharing stuff in DMs was probably just the justification they needed to ban them without conflict (and oh my god, there was still so much drama.)
Melmi ( @melmi@lemmy.blahaj.zone ) English45•2 years agoIt’s ironic this went down over adorableporn and not fauxbait
novettam ( @novettam@lemmy.blahaj.zone ) English7•2 years agoIf this really is about that post sharing an image of /u/Im_Cherry_Blossom I’m a bit on the fence about this, but I leave it to Ada’s discretion.
I acted on the report I saw. By the sounds of it, I’d have acted exactly the same way if the report was for the other community
Strawberry ( @Strawberry@lemmy.blahaj.zone ) English4•2 years agoor the other one about posting pictures of women you know in real life without their knowledge
Lemmynade ( @Lemmynade@lemmy.ml ) English4•2 years agoI’m surprised as well, until I read your comment, I thought this was about fauxbait. That community is a giant red flag IMO.
Norah (pup/it/she) ( @princessnorah@lemmy.blahaj.zone ) English44•2 years agoBloody hell, this thread is a mess of people from other instances complaining. I wish Lemmy would add the ability to set a community as private to its instance, or only commentable by instance members. If you’re not from this instance, this defederation doesn’t affect you and you should step off. The admins’ job here is to protect us, the users on this instance. Not appease you.
moonsnotreal ( @moonsnotreal@lemmy.blahaj.zone ) English42•2 years agoThank you. Just the spam in new was bad enough, but CSAM? Holy crap.
To be clear, it is not CSAM. It is legal porn deliberately designed to look like CSAM
moonsnotreal ( @moonsnotreal@lemmy.blahaj.zone ) English12•2 years agoIt still feels in the grey area just like some anime
magnetosphere ( @magnetosphere@kbin.social ) 11•2 years agoMy “favorite” was the vampire who has the body of a little girl, but the argument was “in the story, she’s actually hundreds of years old, so she’s not a minor!” 🙄
Mr_Buscemi ( @Mr_Buscemi@lemmy.blahaj.zone ) English40•2 years agoKinda glad that server is blocked. I’ve had to block over 100 subs from them over the last few weeks.
The amount of porn that was coming in on my feed was crazy.
Evelyn ( @StarLuigi@lemmy.blahaj.zone ) English16•2 years agoFr, I love browsing all but there is a LOT of porn there.
Vlhacs ( @Vlhacs@reddthat.com ) English27•2 years agoTo be fair, it IS a porn instance. I think I would rather blame the sorting algorithms of lemmy that allows popular communities, including memes, to spam the All page due to the sheer number of posts and hides smaller ones.
Mr_Buscemi ( @Mr_Buscemi@lemmy.blahaj.zone ) English10•2 years agoI feel like I’ve blocked like 5+ different yiff subs that were almost all the exact same name lol
fadingembers ( @fadingembers@lemmy.blahaj.zone ) English9•2 years agoRight!? I was desperately wishing for a way to block the whole instance so this is great news for me
argv_minus_one ( @argv_minus_one@beehaw.org ) English6•2 years ago
It would be nice if we could block entire instances…
Mr_Buscemi ( @Mr_Buscemi@lemmy.blahaj.zone ) English5•2 years agoI think some apps let you block them from showing up. Connect I think?
I use Voyager(WefWef) and it doesn’t have that yet.
athlon ( @athlon@lemm.ee ) English2•2 years agoThanks for the feature idea! I’ll add “Block Instance” function to my app.
RocksForBrains ( @RocksForBrains@lemm.ee ) English32•2 years agoTbh shoutout to Connect, lets me block whatever. My block list is primarily the hundreds of gross, weird porn communities that have popped up on this site.
some_guy ( @some_guy@kbin.social ) 28•2 years agoYou don’t appreciate 46 different sub-genres of furry porn, each with a separate community, filling up your feed?
Catoblepas ( @Catoblepas@lemmy.blahaj.zone ) 23•2 years agoHonestly almost all the porn I see on here is straight, I’ve only seen furry a handful of times with casual scrolling.
Tahssi ( @Tahssi@yiffit.net ) 8•2 years agoI’ve seen the joke a few times now, I just think it’s become the thing to joke about. I’m on a yiff instance and have 3 yiff communities of my own here and my feed is still 90% human porn.
sam ( @buffalobuffalo@lemmy.blahaj.zone ) English32•2 years agoIf the mod of the community in question is telling the truth, it seems like the incident was just a misunderstanding. The community name is
spoiler
adorableporn
I will refer to this as “the first community” in the following text.
The mod of the community copy/pasted the dictionary definition from vocabulary.com, which contains the word “childlike”.
IMO, the community in question is not trying to skirt the line of Child Sexual Abuse Material (CSAM). In fact, there is a subreddit of the same name which has absolutely nothing to do with people that appear underage.
That said, the same mod also moderates, and posts to a different community with a concerning name. The spoiler below shows the name and the first three paragraphs of the sidebar as they appear:
spoiler
Community is now open to posting. Posts not having verification info will be removed.
FauxBait is a place for sharing images and videos of the youngest-looking, legal-aged (18+) girls. If you like fresh, young starlets, this is the place for you!
Just to be clear: We only feature legal, consenting adults in accordance with U.S. Laws. All models featured were at least 18 years old at the time of filming.
Also, I’m not sure if the timestamps can be trusted, but said mod was instated as the only active mod of the first community at the same time that Ada made this post, which would mean that the mod account could not have been the one that wrote the original sidebar of the first community. Not sure what to make of that. For the sake of balance though, said mod does seem to be doing verifications of the age requirements. Also, the modlog for the first community shows two admin removals from at least 10 days before this debacle, both of which err on the side of caution, so at least the admins do seem to care about enforcing their rules.
The situation seems very muddy, but I personally don’t think the original incident was that big of a deal (assuming the mod is telling the truth). However, I certainly don’t blame the blahaj admins for defederating, as it’s certainly the safest option. Wouldn’t want blahaj lemmy to get taken down :| Also happy to see less porn in my feed; I’m too lazy to block the individual /c/. Personal instance-level blocking can’t come soon enough.
but I personally don’t think the original incident was that big of a deal
The post I saw looked like an underage teenage girl. It was reported as child porn and looked like it to me before I even looked at the community.
Then when I looked at the community, I discovered it wasn’t accidental. The whole point of the community is to appeal to folks looking for people that look like underage teenagers.
That’s a pretty big deal.
stebo ( @stebo02@lemmy.dbzer0.com ) English26•2 years agoThe whole point of the community is to appeal to folks looking for people that look like underage teenagers.
It’s not though? Only the other community is like that. Still, defederating is probably the best choice indeed.
Wirlocke ( @Wirlocke@lemmy.blahaj.zone ) English30•2 years agoI feel like the people getting upset over this are leaning on hypotheticals: that young looking adults just want to be able to make porn like anyone else, and that technically the community did nothing wrong.
The problem is that this ignores the fact that pedophiles would definitely use communities like that as a “foot in the door”, since such a community would naturally have a lot of closeted pedophiles. The issue isn’t young looking adults making porn; the issue is that a community based around the youngest possible looking adults is naturally going to attract and encourage pedophiles.
It’s like they say, “all it takes is allowing one nazi in your bar for it to rapidly turn into a nazi bar”.
gh0stcassette ( @gh0stcassette@lemmy.blahaj.zone ) English14•2 years agoI mean yeah, but I think the solution here is just age verification. If you’re posting nsfw OC, you should have to verify age with mods, and if you’re posting nsfw from online, you should be able to prove they’re of age if prompted (like, if it’s a famous pornstar, they should be verified on pornhub or onlyfans or something so it’s easy to check whether they’re of age).
Like, I have small tits, I’d like to be able to post nsfw without people insinuating I’m pedo-baiting or that people attracted to me are intrinsically pedophilic. Just have strictly enforced age-gates and ban anyone being creepy
Wirlocke ( @Wirlocke@lemmy.blahaj.zone ) English3•2 years agoThe issue isn’t the people posting; the issue is that a community meant to gather the youngest possible looking adults, even if they’re adults, is going to attract closeted pedophiles and allow them to network.
If a young looking but verified adult posted to the numerous other nsfw communities there’d be no issue. But gathering people on the criteria of “young/child looking” is going to give a comfortable space for pedophiles to communicate with each other and spread the actual illegal stuff in private.
It’s not the content or the posters; it’s the community that’s being fostered that’s the problem.
spaduf ( @spaduf@lemmy.blahaj.zone ) English12•2 years agoNot to mention there are potentially serious legal consequences for even hosting such content accidentally.
audiomodder ( @audiomodder@lemmy.blahaj.zone ) English29•2 years agoThank you thank you thank you thank you. Especially with the type of folks that choose blahaj for their lemmy server, I think it’s completely appropriate to err on the side of caution to avoid MAYBE being associated with that stuff. It affirms that when I chose an instance that I’ve made the right call.