Elon Musk has until the end of Wednesday to respond to demands from Brussels to remove graphic images and disinformation linked to the violence in Israel from his social network X — or face the full force of Europe’s new social media rules.
Thierry Breton, the European Union commissioner who oversees the bloc’s Digital Services Act (DSA) rules, wrote to the owner of X, formerly Twitter, to warn Musk of his obligations under the bloc’s content rules.
If Musk fails to comply, the EU’s rules state X could face fines of up to 6 percent of its global annual revenue. Under the regulations, social media companies are obliged to remove all forms of hate speech, incitement to violence and other gruesome images or propaganda that promote terrorist organizations.
Since Hamas launched its violent attacks on Israel on October 7, X has been flooded with images, videos and hashtags depicting — in graphic detail — how hundreds of Israelis have been murdered or kidnapped. Under X’s own policies, such material should also be removed immediately.
You really don’t need to look further than the clinical data on PTSD. Sufficient exposure to any form of trauma can cause mental health issues, including but not limited to PTSD. Watching an execution video can readily trigger a severe trauma response, especially if the victims are people you know or love, or are members of your community.
There are plenty of real-world examples of content moderation teams at social media companies suffering from their exposure to extreme content.
Traumatizing people is one of the core goals of terrorism, because the trauma itself does damage.
Thanks for not linking to a single study like I asked.
Sorry, but I won’t take you seriously until you do. You mentioned ‘studies.’ Show them to us.
I’ve seen studies that disprove your studies.