Fri, Apr 14th 2023 10:47am - Mike Masnick

Every tech dude comes along and has this thought: "hey, we'll be the free speech social media site. We won't do any moderation beyond what's required." Even Twitter initially thought this. But they all learn otherwise. Some discover it faster than others, but everyone discovers it. First it's the spam. Or illegal content such as child sexual abuse material. And if that doesn't do it for you, the copyright police will. But, then you realize that beyond spam and content that breaks the rules, you end up with malicious users who cause trouble. And trouble drives away users, advertisers, or both. And if you don't deal with the malicious users, the malicious users define you. It's the "oh shit, this is a Nazi bar now" problem.

And, look, sure, in the US, you can run the Nazi bar, thanks to the 1st Amendment. But running a Nazi bar is not winning any free speech awards. It's building your own brand as the Nazi bar and abdicating your own free speech rights of association to kick Nazis out of your private property, and to craft a different kind of community. Let the Nazis build their own bar, or everyone will just assume you're a Nazi too.

It was understandable a decade ago, before the idea of "trust & safety" was a thing, that not everyone would understand all this. But it is unacceptable for the CEO of a social media site today to not realize this.

Substack has faced a few controversies regarding the content moderation (or lack thereof) for its main service, which allows writers to create blogs with subscription services built in. I had been a fan of the service since it launched (and had actually spoken with one of the founders pre-launch to discuss the company's plans, and even whether or not we could do something with them as Techdirt), as I think it's been incredibly powerful as a tool for independent media. But the exec team there often seems to have taken a "head in sand" approach to understanding any of this.

That became ridiculously clear on Thursday when Chris Best went on Nilay Patel's Decoder podcast at the Verge to talk about Substack's new Notes product, which everyone is (fairly or not) comparing to Twitter. Best had to know that content moderation questions were coming, but seemed not just unprepared for them, but completely out of his depth. Nilay asked Substack CEO Chris Best the tough questions about whether racist speech should be allowed in their new consumer product, Substack Notes.

The larger discussion is worth listening to, or reading below. As Nilay notes in his commentary on the transcript, he feels that there should be much less moderation the closer you get to being an infrastructure provider (this is something I not only agree with, but have spent a lot of time discussing).