About 20 years ago I watched a TV programme featuring the magician Paul Zenon. He invited the viewers to think of a card, and he said he could guess the card you were thinking of. He then drew a card.
“Were you thinking of this card?” After a pregnant pause he continued: “Probably not — but it was worth it for the 1 in 52 of you who were.”
This is how crude algorithms like Facebook’s work. They tot up simplistic metrics such as likes. But for the people who simply scroll past an item, are indifferent to it, or actively dislike it, Facebook records much less about what they think.
If 1 in 52 people like an item, Facebook knows that someone liked it. But it has little information about what the other 51 felt, if they felt anything.
This makes it difficult for Facebook to tell things that are boring apart from things that are abhorrent. In other words, Facebook struggles to distinguish what is valuable and worth showing to others from what is dangerous and should not be shown at all.
This is one of the reasons why extremist views flourish on platforms like Facebook. As long as you’re one of the 1 in 52 who react to something, Facebook interprets that as validation of the content, no matter what the other 51 people think.
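The asymmetry above can be sketched in a few lines of code. This is a toy illustration, not Facebook’s actual ranking system: it assumes a metric that counts only recorded positive reactions, so viewers who scroll past in boredom and viewers who are silently repelled leave exactly the same trace, namely none.

```python
def engagement_score(reactions):
    """A crude ranking metric: count only recorded positive reactions.
    Scroll-pasts and unexpressed dislike leave no trace at all."""
    return sum(1 for r in reactions if r == "like")

# Hypothetical audience of 52 viewers per item.
# None stands for a viewer who scrolled past without reacting.
boring_item = ["like"] + [None] * 51      # 51 people were mildly bored
abhorrent_item = ["like"] + [None] * 51   # 51 people were repelled, silently

# The metric cannot tell the two items apart: both score 1.
assert engagement_score(boring_item) == engagement_score(abhorrent_item) == 1
```

From the metric’s point of view the two items are identical, which is precisely why a system optimising it cannot separate the dull from the dangerous.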