Charities Against Hate recommends stricter penalties for online abuse

17 Mar 2021 News

A coalition of charities has released 16 recommendations for social media companies on how to tackle online hate.

Charities Against Hate is a voluntary group including representatives of more than 40 leading charities, and wants platforms to provide better support to victims and introduce stricter penalties.

It says platforms are “miles behind other companies” in supporting people who have been harmed or put in danger through using their services.

Charities Against Hate, which was formed last summer in response to growing concerns about the spread of online hate, has also created a toolkit to enable people who have experienced hate online, and other campaign supporters, to write to their MP.

The group has previously found that the majority of charity staff and beneficiaries have witnessed online hate.

Alongside quicker access to support, the coalition suggests that stricter penalties and lifetime bans “should be considered at a far earlier stage than they currently are”, and that platforms stop recommending and amplifying groups and communities that spread hate, misinformation or violent conspiracies.

Aisling Green, digital marketing strategy manager at Parkinson’s UK, said: “Social media plays an increasingly important and often positive role in our lives. But platforms are miles behind other companies in supporting people who are harmed or put in danger in the course of using a company’s products - instead, victims find it hard to know how to report harassment, and are likely to be met with silence or a painfully slow response.

“We hope platforms commit to putting more of their considerable resources into tackling online hate, which can have a devastating effect on users - and to more collaboration between platforms. Making the internet safer is in everyone’s interests.”

Other recommendations 

Other recommendations include greater collaboration between social media platforms. The coalition also says platforms should recognise that tackling online abuse and hate is an ongoing, long-term process, as they cannot expect to “fix” online hate once and for all.

The recommendations suggest that social media platforms should investigate how their own staff are affected by moderating hateful and harmful content, and that charities could provide mental health support and resources for this group.

Lydia Morgan, participation manager at Young Women’s Trust, who leads the campaign’s lived experience group, said: “It has been awful to hear just how much people’s mental health, wellbeing, and confidence has been impacted by being subjected to hate online, and seeing hate being directed at others.

“We’ve also found that people often don’t report it, because they don’t have any confidence that they would be taken seriously or that they’d receive any support. If the sort of hate we see on social platforms were happening on the street, it would be unacceptable, so why do we let it happen through some of the world’s biggest companies?”

The recommendations come as the government plans to introduce an Online Safety Bill later this year. 

The bill will see more regulation for social media platforms, but the Charities Against Hate coalition has told MPs that the bill may not go far enough to protect the most vulnerable, and that more needs to be done.
