•  Possibly linux   ( @possiblylinux127@lemmy.zip ) OP
    4 months ago

    The content is irrelevant. One country should not be able to censor the entire web, no matter how terrible the content is. It is easy to say a stabbing is bad, but what about criticism of a leader, or other hard discussions?

    I don’t live in Australia, yet they were trying to enforce their legislation on me. And Australia is far from the only country guilty of this. This is one win in the bigger picture.

    If an international platform wants to host something questionable, it should have the right to. If that content violates local law, the platform can simply remove it in the specific country.

    • I agree, but I think it is more complex than that. There are already limits to free speech. I agree that no one country should be able to censor others, but what about content that is illegally produced in that country?

      So if terrorist training videos were made in Australia, could banning them from distribution mean Australia could prosecute Twitter for distributing them? How about if China prosecutes over info about Tiananmen? What about CSAM?

      So objectively there are things some countries would want banned but not all, and some things that all might agree to ban. Classifying content might help, but might that be more of an invasion of privacy? The web is built on lots of open protocols that assume good actors and no malicious intent, and we are now adding protocols that increase privacy and security on top. Even the fediverse is a good example of the trade-off between being public, being anonymous, and being private. You can’t have it all.

        • Holding social media companies responsible for the content they host is a better solution in my view. We hold newspapers responsible; why not social media? Yes, moderation is expensive, but these companies are wildly profitable, Musk aside.

          They don’t need to moderate everything, since the content volume is high, but they certainly could manually moderate all content that reaches a certain threshold. They choose not to, and hide behind the fact that it is their users doing the sharing.

          • That would be very bad for free speech. Companies would not take any chances and would remove any content that could remotely bring them trouble. I’m sure there would be lots of bad takedowns, and it would be abused just like the DMCA.

            • Depending on private companies for free speech is bad for free speech in and of itself. So either course has negatives, which means the course with fewer negative outcomes is best. If they over-moderate, they lose users; if they under-moderate, they face fines. I’m sure market forces will mean they do whatever is most profitable.