Discussion about this post

WRT:

Telegram's lack of content moderation is not dissimilar to FB's failures regarding Myanmar dating back to 2012 - a lack of will driven in (large?) part by the profit motive. With only about 50 Telegram employees serving roughly three-quarters of a billion users, Telegram's negligence is arguably more egregious. At least FB attempted to address its critics, even if only for PR purposes.

https://about.fb.com/news/2021/02/an-update-on-myanmar/

https://www.amnesty.org/en/latest/news/2022/09/myanmar-facebooks-systems-promoted-violence-against-rohingya-meta-owes-reparations-new-report/

https://time.com/6217730/myanmar-meta-rohingya-facebook/

https://www.bbc.com/news/world-asia-59558090

Noah Pardo-Friedman:

I think an important and under-discussed aspect of the content moderation debate is algorithmic recommendations.

If a company truly is just a platform, not a publisher, in that it *only* provides people with a place to post/share content and doesn't promote anything in particular, then a completely hands-off, "we're not responsible for what our users post" policy makes some sense (though it's still not quite that simple, imho). But if a company is promoting content to generate more clicks (presumably for surveillance advertising, which is another important and under-discussed part of all this), it's acting more like a distributor (I'm paraphrasing Matt Stoller from the Substack BIG, which I heartily recommend), and it bears more responsibility for content moderation, imho.

Since Telegram, as I understand it, does promote content through what seems at a glance to me to be more or less the conventional surveillance advertising business model, to me that means it's basically full of shit in its one-dimensionally libertarian attitude toward content moderation.

SCOTUS's recent ruling on Section 230 should have an impact on all of this going forward, one would think.
