Edit - This is a post to the meta group of Blåhaj Lemmy. It is not intended for the entire lemmyverse. If you are not on Blåhaj Lemmy and plan on dropping in to complain about how we are doing things, your post will be removed.

==

A user on our instance reported a post on lemmynsfw as CSAM. Upon seeing the post, I looked at the community it was part of, and immediately purged all traces of that community from our instance.

I approached the admins of lemmynsfw, and they assured me that the models with content in the community were all verified as being over 18. The fact that the community is explicitly focused on making the models appear as if they’re not 18 was fine with them. The fact that both myself and a member of this instance assumed it was CSAM was also fine with them. I was, in fact, told that I was body shaming.

I’m sorry for the lack of warning, but a community that skirts the line by trying to look like CSAM isn’t a line I’m willing to walk. I have defederated lemmynsfw and won’t be reinstating it whilst that community is active.

  •  paris ( @paris@lemmy.blahaj.zone ) · 112 points · 1 year ago

    For anyone wondering, this is lemmynsfw’s take on the situation.

    On a personal level, the vibes are off. Their response is really defensive and immediately moves to reframe the situation as body shaming. There’s a difference between an adult who looks underage posting porn of themselves and a community dedicated to porn of adults who look underage. Reducing the latter to body shaming seems like unfair framing to me.

  •  copygirl ( @copygirl@lemmy.blahaj.zone ) · 89 points · 1 year ago

    I think both instance admins have a valid stance on the matter. lemmynsfw appears to take reports very seriously and if necessary does age verification of questionable posts, something that likely takes a lot of time and effort. Blahaj Lemmy doesn’t like the idea of a community that’s dedicated to “adults that look or dress child-like”. While I understand the immediate (and perhaps somewhat reactionary) concern that might raise, is this concern based in fact, or in emotion?

    Personally I’m in the camp of “let consenting adults do adult things”, whether that involves fetishes that are typically thought of as gross, dressing up in clothes or doing activities typically associated with younger ages, or simply having a body that appears underage to the average viewer. As the lemmynsfw admin mentioned, such persons have the right to lust and be lusted after, too. That’s why, as a society, we decided to draw the line at 18 years old, right?

    I believe the concern is not that such content is not supposed to exist or be shared, but rather that it’s collected within a community. And I think the assumption here is that it makes it easy for “certain people” to find this content. But if it is in fact legal, and well moderated, then is there a problem? I don’t believe there is evidence that seeing such content could change your sexual preferences. On the other hand, saying such communities should not exist could send the wrong message, along the lines of “this is weird and should not exist”, which might be what was meant by “body shaming”.

    I’m trying to make sense of the situation here and possibly deescalate things, as I do believe lemmynsfw’s approach to moderation otherwise appears to be very much compatible with Blahaj Lemmy’s. Is there a potential future where this decision is reconsidered? Is there some middle ground where admins from both instances could meet and come to an understanding?

    • is this concern based in fact, or emotion?

      Ada was clear in another comment thread that yes, emotion was absolutely involved in her decision. That isn’t a bad thing. Why is there a social attitude that decision-making is only valid if it’s cold and unfeeling?

      Personally I’m in the camp of “let consenting adults do adult things”

      Me too. I don’t think anyone is arguing against that. Anyone can still access LemmyNSFW’s content elsewhere, Blahaj Zone simply isn’t going to relay it anymore because some of it is incompatible with Ada’s goals in nurturing this community.

      But if it is in fact legal, and well moderated, then is there a problem?

      Yes. Legality has nothing to do with acceptability. This instance already bans lots of content that doesn’t actually violate any laws. It’s a judgment call.

      • The reason I brought up emotion in my reply was that I felt the lemmynsfw admins were able to explain their decision quite reasonably and seemed open to conversation, whereas Ada was set on one goal and, upon finding disagreement, wasn’t in the right mindset to continue a constructive conversation. Which, to be fair, given the nature of the content, is understandable.

        If the content that the Blahaj Lemmy admins are concerned about is limited to certain communities, and part of the issue is the concentration of content in those communities in the first place (at least, as I speculated in my original reply), then I don’t quite understand why blocking only those communities wasn’t considered, rather than defederating the entire instance. I do respect Blahaj Lemmy’s decision not to want to host such content. Or is there some technical limitation that I’m not aware of?

        •  Ada ( @ada@lemmy.blahaj.zone ) OP · 8 points · 1 year ago

          I don’t quite understand why blocking only those communities wasn’t considered, rather than defederating the entire instance

          Because I am not ok federating with a space that is ok with content that looks like CSAM. “It’s technically legal” isn’t sufficient in this case.

          • But whether it’s technically legal is exactly what does or doesn’t make it CSAM. “Looking like” is going to be highly subjective, and I don’t understand how the admins of the other instance are supposed to handle reports, other than by verifying whether or not it actually is CSAM.

            Are petite-looking people not supposed to make explicit content while dressing up cute? Should a trans man not share explicit pictures of himself, because he might look like an underage boy? Do we stop at porn that makes someone appear young? What about incest or ageplay? Like, what if you or someone else were made sufficiently uncomfortable by some other kind of porn? How do you decide what is and isn’t okay? How do you avoid bias? What would you tell a model who asks why you removed their content?

            Apologies for going on about this when I’m sure you’re already sick of dealing with it. I just felt that some of the points I brought up (like in my original reply) were entirely overlooked. Putting effort into an (attempted) thought-out reply doesn’t mean I’m entitled to the response I was hoping for, but I was at least hoping for something you hadn’t already said elsewhere.

            •  Ada ( @ada@lemmy.blahaj.zone ) OP · 10 points · 1 year ago

              but I was at least hoping for something you hadn’t already said elsewhere.

              There is no more to this. I don’t have a list of endless reasons.

              The reason is that it looks like CSAM and appeals to folk looking for CSAM. I’m a CSA survivor myself. A space that appeals to folk looking for CSAM isn’t a community that I’m willing to share space with.

              • I guess the core of the disagreement is that one side values safety more highly, while the other values expression? It could be argued that moderation can take care of anyone stepping over the line. People can be unwelcome creeps regardless of what they’re into, and they would be drawn to other dedicated communities as well. I imagine someone could have the same concerns you do, for similar reasons, about consensual non-consent roleplay. Interestingly enough, that actually is temporarily restricted on lemmynsfw, which could be because an appropriate moderation policy has not yet been agreed upon.

    • Reminds me of a lot of the debates around kink at pride/ddlg kink stuff. The latter is really not my thing and makes me uncomfortable, but I recognise that that’s a personal thing between me and my partners that I can’t, and shouldn’t, police among others.

      There are also ethical debates to be had on porn in places like Lemmy/pornhub/etc. – we can’t know that the person consented to being posted, or that they have recourse to get it taken down and stop it being spread if they did not.

      Then there’s the realpolitik of, regardless of ethics, whether it’s better to have porn of this type in visible, well moderated communities, or whether it’s better to try to close off ethically dubious posting.

      It’s one I don’t really have squared off in my head quite yet. Similarly with kink at pride; I’ve read about the historic importance of kinksters and recognise that, but at the same time I want there to be a space where queer kids can be involved with pride without being exposed to kink. Is that just prudish social norms talking? Idk; I’m still working it through.

  • For the people like me that don’t know the term: CSAM is Child Sexual Abuse Material. It’s the term used instead of CP, as “pornography” suggests material made for pleasure and conveys the idea of consent.

    As for the porn that uses people that look under age, it’s no different than the anime children that are thousands of years old. It doesn’t matter how old they are, they look like children and it’s gross.

    • The world is messed up. I feel like advertising any adult material as “barely legal” should be banned too. It skirts the boundary too closely. Not as close as the aforementioned thousand-year-old child body, but it feels almost as bad imo.

    • I agree with you, but not on the last point. There is a difference, since these are real people, adults, who consent to being found sexually attractive and arousing. I am not attracted to young-looking bodies, but that’s a notable difference to me. I also don’t know how I feel about a community (in a broader sense than a lemmy comm) focusing on and fetishizing young-looking adults (I do know that it disturbs me, but I want to talk about it society-wise), but I understand that some people are attracted to young-looking and/or juvenile bodies, and I feel like adults consenting to answer those desires is better than CSAM.

    • And that’s where the body shaming comes in, you’re literally telling this 20-something that their body is gross and no one should find them attractive. How would it make you feel if someone said that about you?

  • I enjoy NSFW content, but I certainly don’t want to stumble into “how close to CSAM can we get while staying technically legal?” content. And the bullshit lie about this being “body shaming” pisses me off.

    This admin decision obviously isn’t up for a vote, but it’s just so obviously the right call. Thank you Ada for handling this, and I’m sorry (in the Canadian way, not the guilty way 😉🇨🇦) you had to see any of that.

  • I’m not on this instance, but thank you for being so swift and resolute in your actions. Happy to see all due caution being taken. Not so happy that such a community made its way to the fediverse. Hopefully I won’t see any of it while doomscrolling.

  •  kardum ( @kardum@lemmy.blahaj.zone ) · 54 points · 1 year ago

    the same community (adorableporn) is also on reddit btw with 2.2m subscribers.

    i have no grand moral opinion on this type of content. for me it is the same as femboy content for example, where people also push for a youthful, girly aesthetic.

    as long as the content is made by consenting verified adults, i don’t care.

    it’s like adults cosplaying with japanese school uniforms or calling your partner “mommy” or “daddy”.

    probably not the best move in terms of sexual morals for sure, in the grand scheme of things tho this is just how people express their sexuality i guess.

    •  Ada ( @ada@lemmy.blahaj.zone ) OP · 16 points · 1 year ago

      it’s like adults cosplaying with japanese school uniforms or calling your partner “mommy” or “daddy”.

      No, it’s not, because no one mistakes those things for actual underage children

      •  jerkface ( @jerkface@lemmy.ca ) · 30 points · 1 year ago

        No, it’s not, because no one mistakes those things for actual underage children

        That’s not what happened here. No one would mistake the image in question for a child. You say there were other images; the admins there tell a story that contradicts yours. They say there were no such images. I didn’t see those images removed in the modlog, either.

        Could it possibly be that someone has blown things out of proportion and got emotional?

      •  kardum ( @kardum@lemmy.blahaj.zone ) · 30 points · 1 year ago

        i had no problem distinguishing the models on the community from children.

        maybe it’s more difficult in some cases without looking for the onlyfans link or sth similar of the model somewhere in the post, but that’s just human anatomy.

        that’s why the guy at the gas station asks for my ID card, because it is not always super clear. but apparently clear enough for reddit admins and PR people from ad companies.

        i agree playing into the innocent baby aspect is probably not great for sexual morals and i wouldn’t recommend this comm to a local priest or a nun, but this type of content thrives on pretty much every mainstream platform in some shape or form.

        i get it, if this instance wants to be sexually pure and removed from evil carnal desires tho. that’s kind of cool too for sure.

        •  Ada ( @ada@lemmy.blahaj.zone ) OP · 13 points · 1 year ago

          i had no problem distinguishing the models on the community from children.

          You didn’t see the content I saw. Content that was reported as CSAM by someone on this instance, who also thought it was CSAM.

          maybe it’s more difficult in some cases without looking for the onlyfans link or sth similar of the model somewhere in the post, but that’s just human anatomy.

          Again, a group focused on models for whom that is the only way you can tell they’re not underage is a group focused on appealing to people who want underage models. That is a hard no.

          Spin it how you like, but I am not going to allow material that is easily mistaken for CSAM.

          • I thought about this some more and I can feel a lot more sympathy for your decision now.

            It must be horrible to get a user report about CSAM and then see a picture that could really be CSAM at first glance.

            Even if every user report was wrong from now until infinity, that initial CSAM suspicion, because of the false user report, probably makes moderating just a soul-crushing activity.

            It is great if admins from other instances are willing to deal with these horror reports, just to give their users a bigger platform, but this service is not something that can be taken for granted.

            I’m sorry for coming across as ignorant, I just did not consider your perspective that much really.

            • Even if every user report was wrong from now until infinity, that initial CSAM suspicion, because of the false user report, probably makes moderating just a soul-crushing activity.

              Then they shouldn’t be doing it. If seeing something that looks even slightly off-putting causes this level of over-reaction, Ada doesn’t need to be moderating a community for marginalized/at-risk people. I myself am a CSA survivor, and seeing my trauma being equated to some legal adults playing pretend is fuckin’ bullshit. Seeing my trauma being equated to drawn pictures is fuckin’ bullshit. My trauma being equated to AI generated shit is fuckin’ bullshit. I’ll tell you one thing, as a CSA kid, one thing I cannot stand is someone making decisions on my behalf. To protect me. Fuck you, I’ll fuckin bite anyone that tries to take away my free agency again.

            • I totally get that and definitely don’t blame Ada for defederating (although I don’t think it’s likely it was actually CSAM, nor that the community it was on is Inherently Problematic; as long as everyone in the posts is 18+, people’s kinks are none of my business).

              The thing I don’t get is why reports made by blahaj users on lemmynsfw communities would go to the blahaj moderators at all. That seems like a design flaw in Lemmy: instance mods have no power to moderate content in off-instance communities, so why would they be notified of reports? That seems like it would clutter modlogs for no reason and cause unnecessary drama (as happened here). Like if every subreddit post report immediately went to the Site Admins, that would be Terrible.

              Though if Lemmy really is built like this for whatever reason, I would probably have done the same thing. I wouldn’t want to be Subjected to everything that could be reported on an NSFW instance; there’s probably some Heinous Shit that gets posted at least Occasionally, and I wouldn’t want to see all of it either. I just think it’s Really Stupid that lemmy is built this way; we need better moderation tools.

              •  Ada ( @ada@lemmy.blahaj.zone ) OP · 7 points · 1 year ago

                The thing I don’t get is why reports made by blahaj users on lemmynsfw communities would go to the blahaj moderators at all.

                Reports go to the admins of the instance the reporter is from, to the admins of the instance the reported account is from, and to the admins of the instance hosting the community the post was made in. The report also goes to the moderators of the community that the content was posted to.

                Each instance only gets a single report, however many of those boxes it ticks, and that report can be dealt with by admins or moderators.

                However, the results federate differently based on who takes the action. For example, me removing content from a lemmynsfw community doesn’t federate; it just removes it from my instance. A moderator or an admin from lemmynsfw removing lemmynsfw content, however, will federate out.
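
                To make the routing concrete, here’s a rough sketch of those rules (illustrative only, not Lemmy’s actual implementation; the types, function names, and example instances below are all made up):

                ```rust
                use std::collections::HashSet;

                // Hypothetical model of the report routing described above.
                // This is NOT Lemmy's real code; all names are invented.
                struct Report {
                    reporter_instance: String,   // instance the reporting user is on
                    reported_instance: String,   // instance the reported account is on
                    community_instance: String,  // instance hosting the community posted to
                }

                // Each instance receives at most one copy of the report,
                // however many of the three roles it happens to fill.
                fn report_recipients(r: &Report) -> HashSet<&str> {
                    let mut recipients = HashSet::new();
                    recipients.insert(r.reporter_instance.as_str());
                    recipients.insert(r.reported_instance.as_str());
                    recipients.insert(r.community_instance.as_str());
                    recipients
                }

                // A removal only federates out when performed by a mod or admin
                // on the community's home instance; a remote admin's removal
                // stays local to their own instance.
                fn removal_federates(acting_instance: &str, community_instance: &str) -> bool {
                    acting_instance == community_instance
                }

                fn main() {
                    let report = Report {
                        reporter_instance: "lemmy.blahaj.zone".to_string(),
                        reported_instance: "lemmynsfw.com".to_string(),
                        community_instance: "lemmynsfw.com".to_string(),
                    };
                    // lemmynsfw.com ticks two boxes but still gets a single report.
                    println!("{:?}", report_recipients(&report));

                    // A blahaj admin removing lemmynsfw content: stays local.
                    assert!(!removal_federates("lemmy.blahaj.zone", "lemmynsfw.com"));
                    // A lemmynsfw mod/admin removing lemmynsfw content: federates out.
                    assert!(removal_federates("lemmynsfw.com", "lemmynsfw.com"));
                }
                ```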

          • Context always matters. I always check if adult material has actually been made by consenting adults. I would feel sick if not enough information had been provided for that, but at least I have never encountered CSAM, fortunately.

            I had classmates in high school with balding or even graying hair and full beards. Some adults older than me look younger than my nephews. Revenge porn and creepshots are common (or at least were; I’m not on the platforms where these are popular).

            Without context, porn will always be a morally grey area. Even commercialized hyper-capitalist porn is still an intimate affair.

            That’s why I didn’t use pornhub, for example, before every user had to verify themselves in order to post. Before that, I only read erotica or looked at suggestive drawings.

            I understand your perspective tho. You hardly get paid to keep this instance running, and looking at pictures that without context could be CSAM could make this volunteer work very mentally taxing. This is how NSFW works tho.

            Without context, any pornographic material featuring real humans could in truth be some piece of evidence for a horrible crime.

            •  Ada ( @ada@lemmy.blahaj.zone ) OP · 4 points · 1 year ago

              Context always matters. I always check if adult material has actually been made by consenting adults. I would feel sick if not enough information had been provided for that, but at least I have never encountered CSAM, fortunately.

              If I can’t tell, if I have to look something up because the people I’m looking at look like they’re underage, then it doesn’t matter what the answer is, because the issue is that it looks like CSAM even if it’s not. And a community designed in a way that attracts people looking for underage content is not a space I’m willing to federate with.

              • Isn’t it kind of shitty to tell an adult woman she can never be attractive or sexy because she looks too young? Do you truly believe that said person should never be allowed to find love, because it’s creepy? Is she supposed to just give up because you think her body is icky?

                •  Ada ( @ada@lemmy.blahaj.zone ) OP · 4 points · 1 year ago

                  I’ve covered this many times already.

                  The issue isn’t individuals that happen to look younger than they are. The issue is with a community gathering sexual content of people that appear to be children.

                  The community that initiated this isn’t even the worst offender on lemmynsfw. There is at least one other that is explicitly focused on this.

  •  Urist ( @urist@lemmy.blahaj.zone ) · 49 points · 1 year ago

    I get the feeling there’s going to be a lot of comments here from people who disagree.

    This is not your instance. This is not even my instance; I am just signed up here (and thank you, Ada, I like it here and I approve of this decision. CSAM-like porn is icky). There is no need to focus on the morality of sharing porn that ends up being viewed as CSAM. Hosting porn involves legal risk, and federating with an instance that has porn on it means that eventually you will host porn images. If you have your account here and you don’t like this choice, consider moving instances or hosting your own.

    Not only that, does anyone remember /r/jailbait on reddit? Reddit did not do anything about that subreddit because the images were “legal”, but the userbase it attracted began sharing real CSAM in the DMs. To be clear: I don’t know what community we’re talking about (lemmynsfw does not appear to have a jailbait community, though I did not look hard), but you do not want the sort of people around that this attracts.

    edit: remove unintentional link

  • Bloody hell, this thread is a mess of people from other instances complaining. I wish Lemmy would add the ability to set a community as private to its instance, or commentable only by instance members. If you’re not from this instance, this defederation doesn’t affect you and you should step off. The admins’ job here is to protect us, the users on this instance. Not to appease you.

  • If I believe the mod of the community in question is telling the truth, it seems like the incident was just a misunderstanding. The community name is

    spoiler

    adorableporn

    I will refer to this as “the first community” in the following text.

    The mod of the community copy/pasted the dictionary definition from vocabulary.com, which contains the word “childlike”.

    IMO, the community in question is not trying to skirt the line of Child Sexual Abuse Material (CSAM). In fact, there is a subreddit of the same name which has absolutely nothing to do with people that appear underage.

    That said, the same mod also moderates and posts to a different community with a concerning name. The spoiler below shows the name and the first three paragraphs of the sidebar as they appear:

    spoiler

    Community is now open to posting. Posts not having verification info will be removed.

    FauxBait is a place for sharing images and videos of the youngest-looking, legal-aged (18+) girls. If you like fresh, young starlets, this is the place for you!

    Just to be clear: We only feature legal, consenting adults in accordance with U.S. Laws. All models featured were at least 18 years old at the time of filming.


    Also, I’m not sure if the timestamps can be trusted, but said mod was instated as the only active mod of the first community at the same time that Ada made this post, which would mean that the mod account could not have been the one that wrote the original sidebar of the first community. Not sure what to make of that. For the sake of balance, though, said mod does seem to be doing verification of the age requirements. Also, the modlog for the first community shows two admin removals from at least 10 days before this debacle, both of which err on the side of caution, so the admins do seem to care about enforcing their rules.


    The situation seems very muddy, but I personally don’t think the original incident was that big of a deal (assuming the mod is telling the truth). However, I certainly don’t blame the blahaj admins for defederating, as it’s the safest option. Wouldn’t want blahaj lemmy to get taken down :| Also happy to see less pron in my feed; I’m too lazy to block the individual /c/. Personal instance-level blocking can’t come soon enough.

    •  Ada ( @ada@lemmy.blahaj.zone ) OP · 25 points · 1 year ago

      but I personally don’t think the original incident was that big of a deal

      The post I saw looked like an underage teenage girl. It was reported as child porn and looked like it to me before I even looked at the community.

      Then when I looked at the community, I discovered it wasn’t accidental. The whole point of the community is to appeal to folks looking for people that look like underage teenagers.

      That’s a pretty big deal.

      •  stebo ( @stebo02@lemmy.dbzer0.com ) · 26 points · 1 year ago

        The whole point of the community is to appeal to folks looking for people that look like underage teenagers.

        It’s not though? Only the other community is like that. Still, defederating is probably the best choice indeed.

  • I feel like the people getting upset over this are taking these hypotheticals at face value: that young-looking adults just want to be able to make porn like anyone else, and that technically the community did nothing wrong.

    The problem is that this ignores the fact that pedophiles would definitely use communities like that as a “foot in the door” to a community that would naturally have a lot of closeted pedophiles. The issue isn’t young-looking adults making porn; the issue is that a community based around the youngest-possible-looking adults is naturally going to attract and encourage pedophiles.

    It’s like they say, “all it takes is allowing one nazi in your bar for it to rapidly turn into a nazi bar”.

    • I mean yeah, but I think the solution here is just age verification. If you’re posting nsfw OC, you should have to verify age with mods, and if you’re posting nsfw from online, you should be able to prove they’re of age if prompted (like, if it’s a famous pornstar, they should be verified on pornhub or onlyfans or something so it’s easy to check whether they’re of age).

      Like, I have small tits; I’d like to be able to post nsfw without people insinuating I’m pedo-baiting or that people attracted to me are intrinsically pedophilic. Just have strictly enforced age-gates and ban anyone being creepy.

      • The issue isn’t the people posting; the issue is that a community meant to gather the youngest-possible-looking adults, even if they are adults, is going to attract closeted pedophiles and allow them to network.

        If a young-looking but verified adult posted to the numerous other nsfw communities, there’d be no issue. But gathering people on the criterion of “young/child-looking” is going to give pedophiles a comfortable space to communicate with each other and spread the actual illegal stuff in private.

        It’s not the content or the posters; it’s the community being fostered that’s the problem.

  • Thank you thank you thank you thank you. Especially with the type of folks that choose blahaj for their lemmy server, I think it’s completely appropriate to err on the side of caution to avoid even MAYBE being associated with that stuff. It affirms that I made the right call when I chose this instance.