'Instagram Helped Kill My Daughter': censorship tendencies in social media

By Binoy Kampmark - posted Thursday, 31 January 2019


It is all a rather sorry tale. Molly Russell, another teenager gorged on social media content, took her own life in 2017, supposedly after viewing what the BBC described as "disturbing content about suicide on social media." Causation is presumed, and the platform hosting the content is saddled with blame.

Molly's father was not so much seeking answers as attributing culpability. Instagram, claimed Ian Russell, "helped kill my daughter". He was also spoiling to challenge other platforms: "Pinterest has a huge amount to answer for." These platforms do, but not in quite the same way suggested by the aggrieved father.

The political classes were also quick to jump the gun. Here was a chance to score a few moral points as a distraction from the messiness of Brexit negotiations. UK Health Secretary Matt Hancock was in combative mood on the Andrew Marr show: "If we think they need to do things they are refusing to do, then we can and we must legislate." Material dealing with self-harm and suicide would have to be purged. As has become popular in this instance, the purging element would have to come from technology platforms themselves, helped along by the kindly legislators.


Any time the censor steps in as defender of morality and safety, citizens should be alarmed. Such attitudes are precisely the sort that empty libraries and lead to the burning of books, even if those libraries host the nasty and the unfortunate. Content deemed undesirable must be removed; offensive content must be expunged to make us safe. The alarming thing here is that compelling the tech behemoths to undertake such a task grants them even more power of social control than before. Don't they exert enough control as it is?

While social media giants can be accused, on a certain level, of faux humanitarianism and their own variant of sublimated sociopathic control (surveillance capitalism is alive and well), they are merely being hectored for the logical consequence of sharing information and content. This is set to become more concentrated, with Facebook, as Zak Doffman writes, planning to integrate Instagram and WhatsApp further to enable users "across all three platforms to share messages and information more easily". Given Facebook's insatiable quest for advertising revenue, Instagram is being positioned as the dominant force behind it.

The onus of production and exchange falls on the customers: the customers supply the material, and the spectacle. They are both the users and the exploited. This, in turn, enables the social media tech groups to monetise data, trading it, exploiting it and tanking privacy measures in the process. The social media junkie is a modern, unreflective drone.

An illusion of independent thinking is thereby created, in which debates can supposedly be had and ideas formed. The grand peripatetic walk can be pursued. Often, the opposite takes place: groups assemble along lines of similar thought; material of like vein is bounced around under the impression that it advances discussion, when it merely provides filling for a cork-lined chamber of near-identical thinking. All of this is assisted by the algorithmic functions performed by the social media entities, all in the name of making your "experience" a richer one. It is hardly in their interest to make sure you juggle two contradictory ideas at the same time.

Instagram's own "Community Guidelines" have the aim of fostering and protecting "this amazing community" of users. They suggest that photos and videos should only be shared by those with a right to do so, and that featured photos and videos should be directed towards "a diverse audience". A reminder that the tech giant is already keen on exercising a degree of control is evident in its restrictions on nudity – a point that landed the platform in some hot water last year: "This includes photos, videos, and some digitally-created content that show sexual intercourse, genitals, and close-ups of fully-nude buttocks." That's many an art period banished from viewing and discussion.

The suicide fraternity is evidently wide enough to garner interest, even if the cult of self-harm takes much ethical punishment from the safety lobby. Material is still shared. Self-harm advisories are distributed through the appropriate channels.


Instagram's response to this is to try to nudge such individuals towards content and groups that might just as equally sport reassuring materials to discourage suicide and self-harm. Facebook, through its recently appointed Vice-President of Global Affairs, Sir Nick Clegg, was even happy to point out that the company had prevented suicides: "Over the last year, 3,500 people who were displaying behaviour liable to lead to the taking of their own lives on Facebook were saved by early responders being pointed to those people and intervening at the right time."

This is all to the good, but such views fail to understand that social media is not used or engaged with to change ideas so much as to create communities that worship only a select few. The tyranny of the algorithm is a hard one to dislodge.

In engaging such content, we are dealing with narcotised dragoons of users, the unquestioning creating content for the unchallenged. That might prove to be the greatest social crime of all: the paradox of nipping curiosity rather than nurturing it. But instead of dealing with the complexities of information from this perspective, governments are going to make technology companies the chief censors. It might well be argued that enough of that is already taking place, this being the age of deplatforming. Whether it be a government or a social media giant, the shoddy principle is the same: others know better than you do, and you should be protected from yourself.


About the Author

Binoy Kampmark was a Commonwealth Scholar at Selwyn College, Cambridge. He currently lectures at RMIT University, Melbourne and blogs at Oz Moses.

Other articles by this Author

All articles by Binoy Kampmark

