In India, an administrator of a WhatsApp group was recently arrested after the sharing of an allegedly defamatory, photoshopped image of Prime Minister Narendra Modi. South Africa has yet to test the liability of a group admin for what is shared on their group. However, given the rise in online racism and hate speech, paired with the millions of people around the world who use the WhatsApp application, it may only be a matter of time before a case like the one in India comes before the South African courts.
The rise in the use of social media platforms to communicate has resulted in a surge of judgments around the world that place affirmative obligations on social media users to vet content associated with their name before it appears on their social media accounts.
We previously wrote about the South African High Court extending liability for defamation to people who have simply been tagged in posts on Facebook. It seems that administrators of social media groups (commonly referred to as 'admins') should brace themselves to shoulder similarly extended liability for what is posted on a group.
The person who creates a WhatsApp group automatically becomes its admin, and additional admins can be added, not necessarily with their agreement. Group admins have the ability to control who is invited to, or removed from, the group chat.
We have seen numerous parodies of famous global leaders openly shared with ease on the internet. There are arguably even more such images circulating on what people may perceive to be private WhatsApp chats. Whilst many of these images will fall safely within the realm of parody, there are probably also countless groups where material is shared that is offensive, amounts to hate speech, or constitutes criminal conduct, possibly in a country you may want to visit one day.
The recent arrest in India should remind WhatsApp group admins that what is posted on private groups can still have serious repercussions, including a finding of defamation or hate speech. A company that appoints a representative to create a WhatsApp group to communicate with a particular team, or with the organisation as a whole, should also keep the possibility of vicarious liability at the forefront of its concerns.
Until we have certainty on the approach that our courts will take on the liability of WhatsApp admins, some practical advice that companies and individuals can implement immediately includes:
- Ask yourself what the consequences of the content shared on your WhatsApp group (be it with family or friends, a home owners association, school committee, or a company) could be if this content was shared publicly. Remember, a claim for defamation only requires publication to one other person (including those in the group).
- Vet the participants of the group. Are all of these people personally known to you as the group admin? If a person has since left the organisation or circle to which the group pertains, have you removed them from the group?
- Educate the members of the group regularly about what content is permitted to be shared. For example, company groups should be used only to share information relevant to the group's purpose. No cat videos. And certainly no political commentary. A comprehensive corporate social media policy should address this.
- Object to content that may be regarded as defamatory or offensive. Follow this immediately by seeking an undertaking not to repeat the behaviour and, failing that, by removing the content and the person who shared it from the group. In addition, share a message with the group stating that such behaviour is not condoned and will not be tolerated from other members.
As at January 2017, about 1.2 billion people around the world used WhatsApp and its functionality to create groups. As much as we try to escape the constant barrage of notifications that a WhatsApp group inevitably brings with it, it seems that this way of communicating is here to stay. Whilst we cannot predict the content of the messages that we open in these groups, we can take steps to mitigate the risk we could face as a result of that content.