Instagram is admitting responsibility for the content it hosts. What does that mean for social media? | thearticle


For most of us, Instagram is the “happy” social network: a place full of pictures of elaborate brunch plates, contrived gym workouts and staged selfies. It seems to have less trolling and nastiness than Twitter, and fewer embarrassing random comments from the person you haven’t seen since you left school 15 years ago than Facebook. For others, though, the photo-sharing platform has a darker side.

One of those was 14-year-old Molly Russell. In 2017, she took her own life, having looked at self-harm and suicide-related material on Instagram. Such content is not hard to find, and it is almost impossible for parents to know if their teenager is looking at it. Scrolling quietly on their phone, the troubled teen could be doing anything. Most parents do not really understand Instagram anyway, with its range of public and private messages and its time-limited stories. Since Molly’s tragic death, her father, Ian Russell, has waged a campaign to get Instagram to change how it handles the kind of material she was looking at. He only became aware that his daughter had engaged with this kind of content after she took her own life.

On Sunday, Instagram’s boss, Adam Mosseri, announced in a blog post that “we will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery. We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods.” He went on to say that “accounts sharing this type of content will also not be recommended in search or in our discovery surfaces, like Explore. And we’ll send more people more resources with localised helplines like the Samaritans and PAPYRUS in the UK or the National Suicide Prevention Lifeline and The Trevor Project in the United States.”

These are important steps forward. For too long, social media firms have tried to absolve themselves of any responsibility for what happens on their networks. They provide the platform, they argued; what users choose to do with it is up to them. Those who built the likes of Instagram, Facebook and Twitter always thought that it was not for them to dictate what people could say in the spaces they created. It is all very idealistic, but it has become impractical as the services have grown. Ultimately, it has put vulnerable users in danger.

While the companies are slowly moving away from this position, the logic still has consequences. It has, for example, contributed to Facebook, which owns Instagram, being nervous about the way it fact-checks political adverts: it does not reject political adverts that contain falsehoods in the way it does other types of paid advertising. With the ability to communicate instantly under the cloak of anonymity, the nastiest parts of human nature can emerge. And, as Molly’s father pointed out to BBC News on his trip to California, during which Instagram made its announcement, “the big platforms really don’t seem to be doing much about it.”

I would strongly push back against censorship on social networks. However, for too long, those behind these networks have tried to shirk their responsibilities. They need to protect their millions of users, particularly those who are younger or potentially vulnerable to certain kinds of material. That may mean making such content harder to find, or banning it altogether (although we must always be aware of the risk of driving things into ever darker corners of the internet). Social networks also have to concede that they are publishers. Facebook, Twitter and Instagram might not employ “reporters” to write original stories (yet), but they are hosting content in the same way that newspapers, magazines and websites are. To younger users, the videos posted on these platforms, as well as on YouTube and others, are no different from linear television. Newspapers and television channels are very careful when discussing suicide, but on social media it is a free-for-all. This week, Instagram took a small but significant step towards bringing that free-for-all to an end.