Twitter has asked its members to help shape new rules banning "dehumanising speech", in which people are compared to animals or objects.

It said language that made people seem "less than human" had repercussions.

The social network already has a hateful-conduct policy but has been accused of bias for allowing some types of insulting language to remain online.

For example, countless tweets describing middle-aged white men as "gammon" can be found on the platform.

At present, some derogatory language is deemed a clear violation of Twitter's existing hateful-conduct policy.

It bans insults based on a person's:

race
ethnicity
nationality
sexual orientation
sex
gender
religious beliefs
age
disability
medical condition

But Twitter's critics have used the hashtag #verifiedhate to highlight examples of what they believe to be bias in what the platform judges to be unacceptable.

The "gammon" insult gained popularity after a collage of contributors to the BBC's Question Time programme - each middle-aged, white and male - was shared along with the phrase "Great Wall of Gammon" in 2017.

Twitter said it intended to prohibit dehumanising language towards people "in an identifiable group" because researchers had shown it could lead to real-world violence.

Asked whether calling men "gammon" would count as dehumanising speech, the company said it would first seek the views of its members.

The scope of "identifiable groups" covered by the new rules will be decided after a public consultation that will run until 9 October.

"This consultation is the time for people, experts and free expression groups to be involved in the development of a policy that will protect individuals and groups," said Nick Pickles, senior policy strategist for Twitter.

Mr Pickles said that all views would be welcome in the consultation and that the company would be transparent about the results.

He acknowledged that some people might tell Twitter it should not block "dehumanising speech" because doing so would stifle free expression.

"We think it's really important to have outside input, for the health of the platform. It's about being transparent about how we develop our processes," he told the BBC.

"It's a genuine commitment from the company to be more accountable to our users."