Hacker News

I think this is a very overblown concern that will not be a big issue in practice… at least not for a number of years.


Mastodon already has "censorship" options, and people still use it big time to distribute CP. https://www.secjuice.com/mastodon-child-porn-pedophiles/

If I were to run such a node in Germany and someone used it to distribute CP, I would face 1 to 10 years in prison.

Yeah, no thanks.


This article quietly equates "CSAM" with lolicon and shotacon content, terms colloquially used to refer to drawings and other creative expressions, which is very clearly what it's avoiding mentioning. Of course, most people aware of fediverse politics already know about Pawoo, and there is no good reason to speak about it in hushed tones other than cowardice. Also, using the term "CSAM" to describe lolicon and shotacon is just intentional factual inaccuracy: the point of the term CSAM is specifically to exclude content that doesn't directly involve the harm of a child; that's what "abuse material" is meant to refer to. That doesn't change the legal reality or moral ramifications of the content, but starting out with terminology this wrong taints the entire remainder of the discussion. Yes, it's complicated and uncomfortable, but let's not mince words.

When you run a Mastodon instance, though, you don't actually need to distribute this content at all. In fact, you would probably want to at least stop proxying media for communities like Pawoo, and honestly, probably for any NSFW-oriented instance, if you want to be safe.
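In practice this amounts to a per-domain "reject media" policy: federate the text, but refuse to fetch or cache remote attachments from listed domains. A minimal sketch of such a policy check (the domain list and function are hypothetical illustrations, not Mastodon's actual code):

```python
# Hypothetical moderation policy: decide whether to fetch/cache remote media.
# The blocked domains below are illustrative placeholders, not real instances.
REJECT_MEDIA_DOMAINS = {"nsfw-instance.example", "pawoo.example"}

def should_proxy_media(remote_domain: str) -> bool:
    """Return False for domains whose media this instance refuses to cache."""
    domain = remote_domain.lower().strip(".")
    # Match the listed domain itself and any subdomain of it.
    return not any(
        domain == blocked or domain.endswith("." + blocked)
        for blocked in REJECT_MEDIA_DOMAINS
    )

print(should_proxy_media("pawoo.example"))       # False: media rejected
print(should_proxy_media("media.pawoo.example")) # False: subdomain rejected
print(should_proxy_media("mastodon.example"))    # True: media allowed
```

Mastodon's own domain blocks expose a similar knob (rejecting media without fully suspending a domain), so the text still federates while nothing is stored locally.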

I hate to break anyone's innocence, but this stuff is literally everywhere on Twitter, and on a lot of the rest of the Internet, too. Is it legal? Depends on your jurisdiction. Is it moral? ¯\_(ツ)_/¯. Only one thing's for sure: it's on your Internet, along with plenty of other things you can also find on Mastodon somewhere. Such is the reality of federation and the scale of the Internet. The stuff that goes through email relays unencrypted is not so different, other than that it doesn't get broadcast, and that it's probably worse in many cases.

If some instance were broadcasting outright CSAM on Mastodon, it would no doubt quickly be blacklisted by basically everyone and then probably also be shut down off the clearnet.


Famous last words...

I wouldn't want to run the risk of being the first one it does happen to.

I think something like IPFS offers better resistance to such attacks for node owners. I wonder how the two compare, because I don't know either in detail.

But sharing unencrypted content from others could be an issue for many reasons. Copyright too.

Especially because this protocol is aimed at content blocked or censored from other platforms, as others have pointed out.

Sure, you can block others from using your relay, but then what's the point? Just host your stuff on your self-hosted blog.


Nostr isn't aimed at blocked or censored content.

Relays also don't host media; the protocol is meant for text-based comms.
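For context: a Nostr event (per NIP-01) is just signed JSON, and the event id is the SHA-256 of a canonical serialization. A sketch of building the id for a kind-1 text note, with a placeholder pubkey and no signature:

```python
import hashlib
import json
import time

def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    """NIP-01: id = sha256 of the JSON array [0, pubkey, created_at, kind, tags, content],
    serialized with no whitespace and UTF-8 encoding."""
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Placeholder pubkey (64 hex chars); a real event also carries a Schnorr signature.
pubkey = "00" * 32
event_id = nostr_event_id(pubkey, int(time.time()), 1, [], "hello nostr")
print(len(event_id))  # 64: a hex-encoded SHA-256 digest
```

Since the event is plain JSON text, nothing in the protocol itself prevents a client from stuffing base64-encoded binary into `content`, which is the point the reply below makes.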


Well, HN is text-based as well, yet...

iVBORw0KGgoAAAANSUhEUgAAAAEAAAABAQMAAAAl21bKAAAAA1BMVEWbueItP/xuAAAACklEQVQI12NgAAAAAgAB4iG8MwAAAABJRU5ErkJggg==


"That doesn't look like anything to me"



