Algorithms aren't just there to serve you ads for trinkets. According to this article, they can also be used to sell you an ideology.
For their part, platforms have made efforts to curb some extremist content and misinformation. But these only came after years of allowing it largely unchecked — and profiting from it — and with mixed results. These measures are also reactive and limited; they do nothing to stop or curb any developing conspiracy theories or misinformation campaigns. Algorithms apparently aren't as good at rooting out harmful content as they are at spreading it. (Facebook and YouTube did not respond to requests for comment.)

[…]

The conspiracies might be much easier to find (even when you weren't looking for them); you still choose whether or not to go down the path they show you. But that path isn't always obvious. You might think QAnon is stupid, but you will share #SaveTheChildren content. You might not believe in QAnon, but you'll vote for a Congress member who does. You might not fall down the rabbit hole, but your friends and family will.