A charity has urged Twitter to do more to help disabled users report hate speech against them on the site, claiming it remained too difficult to report disability-related abuse.

Muscular Dystrophy UK said the lack of a clear option to label abusive tweets based on disability was preventing more reporting of such hate speech.

The charity said that while options to report abusive tweets based on race, religion, gender or orientation were clearly labelled on the site’s reporting tool, an option for disability hate speech was not available.

Twitter said it did have a clear policy on the issue, and that attacks on other people based on a disability were specifically prohibited under its rules on “hateful content”.

That phrase does appear as an option on Twitter’s reporting tool, but a direct reference to disability does not.

Lauren West, manager of Muscular Dystrophy UK’s Trailblazers network of young disabled people, warned that unless a clearer reference was introduced, disabled people would turn away from the site.


“Platforms like Twitter and Facebook should be a valuable tool for disabled people to take part in everyday conversations, but hate-filled language keeps many away,” she said.

“It has become so common that it barely raises an eyebrow and this situation has to change.

“Platforms like Twitter have to give us the tools we need to protect ourselves from hate speech, and adding disability to its Report Tweet function is an easy starting point.”

The charity said Twitter agreed to review the process during a public meeting last year, but claimed no visible change had been made since then.

It added that “hiding” the characteristic within the Twitter rules was greatly reducing the chance of disabled people taking action against hateful language when they encountered it.

Those rules – which can be viewed on the Twitter help centre website – say: “You may not promote violence against or directly attack or threaten other people on the basis of race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability, or disease.

“We also do not allow accounts whose primary purpose is inciting harm towards others on the basis of these categories.”

In response to the charity, the social media site said it now took action on ten times the number of abusive accounts as this time last year, and had made more than 30 individual changes to the platform, including on its policies, with the goal of improving safety for all its users.

Twitter added that the current tool to directly report hateful content, which includes references to disability, had been in place since November 2016.

The firm is among several internet giants, including Facebook and Google, that have pledged to improve their policing of abusive content on their platforms.

Twitter boss Jack Dorsey said last year that the firm would be more “aggressive” in enforcing its rules, while Facebook boss Mark Zuckerberg said his mission for 2018 was to “fix” Facebook and its issues around abuse.