Discussion about this post

jooky:

Another thought on Google: notice that a key weakness of their algorithm is that it must be giving extra weight to things that have sat on the internet, unchanged, for a long time. That's why 11-year-old reddit comments are bubbling to the top.

I bet something in there is treating phrases that have exhibited low churn over a long period as more truthful, the evidence being that they haven't been removed or revised in all that time.
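
To put my guess in concrete terms, a toy heuristic like the one below would behave exactly that way. (This is made up purely to illustrate the hypothesis; it is obviously not Google's actual ranking code, and the function name and weights are invented.)

```python
# Purely illustrative: a made-up scoring heuristic that rewards old,
# never-edited content, per the guess above. Not Google's real code.
from datetime import datetime, timezone


def stability_score(posted: datetime, last_edited: datetime,
                    now: datetime | None = None) -> float:
    """Score content higher the longer it has sat around unchanged."""
    now = now or datetime.now(timezone.utc)
    age_years = (now - posted).days / 365.0
    unchanged_years = (now - last_edited).days / 365.0
    # Reward raw age, reward stability even more, and penalise churn
    # (the gap between how old a thing is and how long it has been stable).
    return age_years + 2.0 * unchanged_years - (age_years - unchanged_years)


# Under this kind of weighting, an ancient, never-touched troll comment
# outranks a newer answer that was recently revised.
old_troll = stability_score(datetime(2013, 5, 1, tzinfo=timezone.utc),
                            datetime(2013, 5, 1, tzinfo=timezone.utc))
fresh_post = stability_score(datetime(2023, 5, 1, tzinfo=timezone.utc),
                             datetime(2024, 1, 1, tzinfo=timezone.utc))
assert old_troll > fresh_post
```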

So not only is their AI picking troll comments from reddit, it's weighting things so that the oldest troll comments are the most relevant. And as anyone who was on reddit 11 years ago can tell you, that site was a far viler hub of degeneracy then than it is now!

Basically, Google AI has been trained in r/jailbait, r/fatpeoplehate and those other cesspits.

Nice work, Google! This won't backfire on you at all!

Gio:

> There’s something deeply broken about our collective psyches that’s tied into how social media algorithmically promotes doom narratives. Some of these answers are just dead wrong factually [...] Some are absurd - who really thinks TV and cuisine is worse now than it was in the ‘30s? We are absolutely addicted to doom narratives and I have no idea how to shake it.

Doom and pessimism sell better than optimism. Doom is more enticing to listen to; it hits you emotionally. A vision of doom is more vivid than one that says “things will keep getting better, gradually.” It's also easier for merchants of doom to go around podcasts uttering prophecies than it is for someone to say “progress is possible, you can help, but you need to work hard for it.”

