Tag archives: hate speech

Exposed! How to combat the sharing of harmful content on social media

Online video celebrity Chrissy Chambers has recently settled a case against her ex-partner for damages suffered as a result of his posting sexually explicit video clips online. The terms of the settlement are confidential, but this and several other high-profile cases have generated considerable publicity around social media platforms’ responsibility to monitor and remove harmful or abusive content.

Harmful content takes a number of forms, such as cyberbullying, threats of violence, hate speech and even “revenge porn” (the sharing, usually on a public platform, of intimate photos or videos of a person without their consent).

Corporate reputations are … Continue Reading

German law on hate speech – complaint procedures

The German law on hate speech (the Network Enforcement Act, or Netzwerkdurchsetzungsgesetz), which came into effect on October 1, 2017, remains the subject of criticism. Its legal and political implications are widely discussed in the media (see, for example, our posts here and here), touching on the current global debate about how differing opinions are dealt with, the power and influence of social media over information and disinformation, and its place in the context of an increasingly fragmented internet.

Since January 1, 2018, social media providers have been obliged to maintain a procedure for handling complaints. This procedure … Continue Reading

New German Law on Hate Speech on Social Media

June 2017 ended with the German parliament approving a bill aimed at eliminating hate speech and fake news on social media platforms, including Facebook, YouTube, and Google. The law will take effect in October 2017 and could carry fines of up to EUR 50 million.

We previously discussed the bill in this blog post. Now that the bill has been passed into law, social media companies are required to remove illegal hate speech within 24 hours of receiving notification or a complaint, and to block other offensive content within seven days. The law also requires social media companies to … Continue Reading

WhatsApp group administrators may be responsible for members’ content

In India, an administrator of a WhatsApp group recently faced arrest following the sharing of an allegedly defamatory photoshopped image of Prime Minister Narendra Modi. South Africa has yet to test the liability of a group administrator for what is shared on their group. However, given the rise in online racism and hate speech, paired with the millions of people around the world who use WhatsApp, it may only be a matter of time before a case like the one in India comes before the South African courts.… Continue Reading

Germany considers €50 million fines for social media sites failing to remove hate speech

The German Justice Ministry has introduced a draft law that would impose fines of up to €50 million on social media companies that fail to remove hate speech and other illegal content from their platforms quickly.

The fines would be imposed whenever social media companies do not remove online threats, hate speech, or slanderous fake news. Any obviously illegal content would have to be deleted within 24 hours; reported material that is later determined to be illegal would have to be removed within seven days.… Continue Reading

Twitter and hate speech policy

This year seems to have started off in much the same way as 2016 ended. Celebrities, politicians, and everyday people have flocked to social media to provide their commentary on everything from global crises to envelope sagas.

Towards the end of 2016, Twitter announced that no person is above its policies, particularly in respect of hate speech, and threatened to remove Donald Trump’s verified account if he continued to violate them. But what exactly do the Twitter policies say?… Continue Reading
