5 Comments

Telegram's lack of content moderation is not dissimilar to FB's failures regarding Myanmar dating back to 2012 - a lack of will driven in (large?) part by profit motive. With only 50 Telegram employees and about 3/4 billion users, Telegram is arguably more egregious. At least FB attempted to address its critics, even if for PR purposes.

https://about.fb.com/news/2021/02/an-update-on-myanmar/

https://www.amnesty.org/en/latest/news/2022/09/myanmar-facebooks-systems-promoted-violence-against-rohingya-meta-owes-reparations-new-report/

https://time.com/6217730/myanmar-meta-rohingya-facebook/

https://www.bbc.com/news/world-asia-59558090


I think an important and under-discussed aspect of the content moderation debate is algorithmic recommendations.

If a company truly is just a platform - a neutral host, in that it *only* provides people with a place to post/share content and doesn't promote anything in particular - then a completely hands-off, "we're not responsible for what our users post" policy makes some sense (though it's still not quite that simple, imho). If a company is promoting content to generate more clicks (presumably for surveillance advertising, which is another important and under-discussed part of all this), it's now more of a distributor (I'm paraphrasing Matt Stoller from the Substack BIG, which I heartily recommend), and it bears more responsibility for content moderation, imho.
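To make that distinction concrete, here's a minimal sketch of the difference between merely hosting content and actively ranking it for engagement. The `Post` fields and the scoring weights are invented for illustration; they aren't any real platform's algorithm.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    timestamp: float  # seconds since epoch
    clicks: int
    shares: int

def host_feed(posts: list[Post]) -> list[Post]:
    """A pure 'platform': show everything, newest first, no editorial judgment."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def distributor_feed(posts: list[Post]) -> list[Post]:
    """A 'distributor': actively surface whatever drives clicks and shares.
    The weights are arbitrary placeholders for illustration."""
    def engagement(p: Post) -> float:
        # Weighting shares above clicks, since shares spread content further.
        return 1.0 * p.clicks + 3.0 * p.shares
    return sorted(posts, key=engagement, reverse=True)
```

The editorial judgment hides in that second function: choosing the weights is deciding what users see, which is exactly where the added responsibility comes from.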

Since Telegram, as I understand it, does promote content through what looks at a glance like the conventional surveillance-advertising business model, its one-dimensionally libertarian attitude toward content moderation strikes me as basically full of shit.

SCOTUS's recent ruling on Section 230 should have an impact on all this going forward, one would think.


Telegram's statement that it reviews absolutely zero takedown requests is also just not true. Another thing people use Telegram for is piracy. For a research piece I wrote, I looked into some book-piracy Telegram groups. They would get removed for copyright infringement semi-regularly, to the point that people moved to other platforms. I suspect that if Telegram is threatened with a lawsuit by a large enough company, it is willing to moderate. It just pretends to be a company that doesn't moderate at all, like every other site that makes that claim.


Great overview. I think it's also worth mentioning that while Signal and WhatsApp are E2E-encrypted and can't be moderated, they've still taken steps to limit forwarding so that rumors and misinformation spread more slowly through their apps. To me that seems like a good compromise: conversations stay private, and apps have to consider whether they're more of a messaging service or a social network that needs more moderation.
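To spell out how that works without breaking encryption: the limit can be enforced entirely client-side, on forwarding metadata, without the server ever reading message content. A minimal sketch, assuming a hypothetical `forward_hops` counter attached to each message (the real apps' mechanisms and thresholds differ and have changed over time):

```python
from dataclasses import dataclass

# Illustrative thresholds only; not any app's actual numbers.
MAX_FORWARD_TARGETS = 5     # chats one forward action may reach
FREQUENT_HOP_THRESHOLD = 5  # hops after which forwarding tightens further

@dataclass
class Message:
    ciphertext: bytes      # all the server ever sees
    forward_hops: int = 0  # metadata the client increments on each forward

def forward(msg: Message, target_chats: list[str]) -> list[tuple[str, Message]]:
    """Enforce forwarding limits purely on the client, using hop metadata.
    Content stays end-to-end encrypted throughout."""
    limit = 1 if msg.forward_hops >= FREQUENT_HOP_THRESHOLD else MAX_FORWARD_TARGETS
    if len(target_chats) > limit:
        raise ValueError(f"can only forward this message to {limit} chat(s)")
    forwarded = Message(msg.ciphertext, msg.forward_hops + 1)
    return [(chat, forwarded) for chat in target_chats]
```

The friction is deliberately weak - anyone can copy-paste around it - but it slows virality without giving anyone the ability to read conversations, which is the compromise in a nutshell.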

The other thing is that I just don't think Telegram has enough revenue potential to be able to invest in moderation. There was an article about Telegram in the FT in March, and it mentioned that their cost per user was very low but that they still didn't expect to turn profitable until next year. Adding a moderation team would move them further away from profitability.
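To put rough numbers on that intuition, here's a back-of-envelope calculation. Every figure below is invented for illustration; none of them come from the FT article.

```python
# Back-of-envelope: what a moderation team does to per-user economics.
# All numbers are assumptions for illustration only.
users = 750_000_000          # "about 3/4 billion", per the first comment
revenue_per_user = 0.50      # assumed annual revenue per user, USD
moderators = 1_000           # assumed team size (FB-scale efforts used far more)
cost_per_moderator = 50_000  # assumed fully loaded annual cost, USD

revenue = users * revenue_per_user
moderation_cost = moderators * cost_per_moderator
print(f"annual revenue:   ${revenue:,.0f}")          # $375,000,000
print(f"moderation cost:  ${moderation_cost:,.0f}")  # $50,000,000
print(f"share of revenue: {moderation_cost / revenue:.1%}")  # 13.3%
```

Under these (made-up) assumptions, even a modest moderation team eats more than a tenth of revenue, which is a lot to ask of a company that isn't profitable yet.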

A principle like "we don't moderate" might play well with a conservative or free-speech-absolutist crowd, but it can just as well be cope because you simply can't afford it - especially if you also turn down venture capital because you can't get particularly good terms as the fifth social network and the investors will insist on moderation anyway. Twitter under Musk seems to be an example of this dynamic, as does the fate of Gab, Parler, Rumble and the other right-leaning social networks that fail to cross into the mainstream.


cool, thanks!
