Harassment of Zelda Williams following father's suicide leads to change.

Twitter has pledged to update its policies regarding abuse and user safety, following a series of distressing incidents that caused Zelda Williams, daughter of the late comedian and actor Robin Williams, to leave the social network.

Twitter has removed two accounts after Zelda Williams received abusive messages and doctored images, prompting her to announce that she was leaving the social network. In a statement, Twitter's Vice President of Trust and Safety Del Harvey said the company would be addressing a variety of issues that the episode had raised and would update its policies accordingly.

"We will not tolerate abuse of this nature on Twitter. We have suspended a number of accounts related to this issue for violating our rules, and we are in the process of evaluating how we can further improve our policies to better handle tragic situations like this one. This includes expanding our policies regarding self-harm and private information, and improving support for family members of deceased users."

The social network has long struggled to balance its determination to promote free speech with the need to tackle abuse. The fallout from the terrible abuse suffered by campaigner Caroline Criado-Perez prompted Twitter to roll out a report-abuse button, making it easier for users to flag problems they encounter. Two of those who targeted Criado-Perez were jailed earlier this year for their part in the torrent of abuse she suffered.

It's disconcerting that it takes such high-profile attacks to prompt the social network to revise and refine its abuse policies, and the case of Zelda Williams also raises a number of other questions about how Twitter handles violent imagery and discussions of suicide and self-harm.

Following Robin Williams' suicide earlier this week, there has been an unprecedented amount of open conversation across social media about mental health, a topic that touches the lives of many and yet is rarely discussed. This has had many positive results: boundaries have been broken, there has been an outpouring of empathy toward those suffering from mental health issues, and media outlets that reported insensitively and irresponsibly on Williams' death have been named and justifiably held to account.

It is Twitter's positioning of itself as a forum for public free speech that has allowed these vastly productive conversations to take place. At the same time, the abuse directed at Zelda Williams was sadistic and inhumane to the core. Policing the network in a way that allows debate to flourish while outlawing the sharing of certain types of content is no easy feat.

According to Twitter's current policies, if it identifies users who have expressed thoughts of self-harm or suicide, it will reach out to them and provide a list of recognized support resources. This is entirely appropriate and responsible, but the company needs to look at mental health from other angles, too. There is no clause in Twitter's guidelines banning the sharing of self-harm imagery, for example. The only content explicitly banned in this way relates to pornography, which is far from the only kind of damaging or triggering imagery out there. Twitter is clearly off to a good start in being sensitive to the mental health of its users, but its policies may well need to be fleshed out to cover a wider range of circumstances.

The other policy Twitter has promised to update relates to how it supports family members of deceased users. Currently, if a user passes away, family members or friends can mail or fax a number of documents to Twitter, including a copy of a death certificate, to have the account removed.

It's not clear yet how Twitter will offer further support, but it could potentially provide more advice and resources to help people deal with grief or find a way to deliver sympathy messages or tributes (the heartfelt ones, at least) to a user's family.

We don't know yet when to expect the updates, but hopefully they will help to refine the platform as a space for conversation while keeping both high-profile and everyday users safe from abuse.