Because these machine learning algorithms only surface what they learn will target the right videos to the right people. In this case, I think people were searching YouTube for these kinds of videos, so YouTube's algorithm suggested them.
Nah, they want you to be shocked and start watching. It's called engagement: it makes money by upsetting people's emotional state so they become nervous, unhappy, angry, etc.
That's not antithetical to what the other person said. It's both true that the algorithm serves what it thinks people like and that bad things get high engagement. The issue is that bad things aren't accounted for and filtered out appropriately.
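Roughly, the gap both comments point at is an objective that rewards engagement with no separate step screening out harmful content. Here's a minimal toy sketch of that idea, with made-up fields and weights; none of this reflects YouTube's actual system.

```python
# Toy sketch (not YouTube's real system): rank candidate videos by a
# predicted-engagement score, optionally filtering flagged content first.
# All field names and weights are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_time: float  # minutes the model expects a user to watch
    predicted_clicks: float      # expected click-through rate, 0..1
    flagged_harmful: bool        # hypothetical output of a separate classifier

def engagement_score(v: Video) -> float:
    # A purely engagement-driven objective: shocking content that keeps
    # people watching scores just as well as anything else.
    return 0.7 * v.predicted_watch_time + 0.3 * v.predicted_clicks * 10

def rank(candidates: list[Video], filter_harmful: bool) -> list[Video]:
    # The "filter" step is what the last comment says is missing or weak.
    pool = [v for v in candidates if not (filter_harmful and v.flagged_harmful)]
    return sorted(pool, key=engagement_score, reverse=True)

if __name__ == "__main__":
    videos = [
        Video("calm tutorial", predicted_watch_time=4.0, predicted_clicks=0.2, flagged_harmful=False),
        Video("outrage bait", predicted_watch_time=9.0, predicted_clicks=0.6, flagged_harmful=True),
    ]
    # Without the filter, the high-engagement "outrage bait" tops the ranking.
    print([v.title for v in rank(videos, filter_harmful=False)])
    # With the filter, flagged content is removed before scoring.
    print([v.title for v in rank(videos, filter_harmful=True)])
```

The point of the sketch is only that both claims coexist: the scoring function genuinely optimizes for what people watch, and without the filtering step the most upsetting content can still win the ranking.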