Facebook, Twitter, and RIM mull UK PM Cameron’s riot block threat

Filed as Editorial on August 12, 2011 2:07 am

In the midst of the week’s widespread looting, British prime minister David Cameron suggested blocking individuals from accessing their social media accounts if they’re found to be openly plotting violent acts.

Twitter, Facebook, and BlackBerry Messenger have been singled out as playing a role in organizing riots in London, Birmingham, Manchester, and other cities in the UK.

Home Secretary Theresa May has summoned the three companies to discuss methods of preventing people from organizing violent mass actions.

The Home Secretary said that among the issues to be discussed is whether there is actually a way to stop people from using social networking sites to “plot violence, disorder, and criminality.”

All three companies have expressed willingness to cooperate, and this seems to be a reversal of policy, especially considering the role social media played in Egypt.  Is this a case where what was perfectly all right in Egypt won’t be perfectly all right in the UK?

Twitter co-founder Biz Stone had earlier blogged a defense of freedom of expression:

Our goal is to instantly connect people everywhere to what is most meaningful to them. For this to happen, freedom of expression is essential. Some Tweets may facilitate positive change in a repressed country, some make us laugh, some make us think, some downright anger a vast majority of users. We don’t always agree with the things people choose to tweet, but we keep the information flowing irrespective of any view we may have about the content.

Since yesterday, people on Facebook have been noticing buggy behavior: not being able to post content, posted content disappearing and then reappearing, and so on.  In at least one Facebook group that I am in, links to articles on freedom of speech versus religion failed several times to post, only for the link posts to suddenly appear several minutes later.

An article in FT Tech Hub probably explains this buggy behavior:

“Facebook has a very clear set of terms that we use for the service called our statement of rights and responsibilities,” Richard Allan, Facebook’s European head of policy, told the FT.

“That makes it very clear that people should not use the platform to make credible threats of violence or to promote illegal activity… Where we do get reports or are made aware of content that is in breach of the rules, the content comes down and depending on the seriousness of the offence the user may lose their account.”

Moreover, in the same article, Facebook said that it had assigned more people to remove posts that explicitly incite violence.

The thing is, I doubt Facebook’s employees can actually review every item being shared on the service and judge correctly whether a post seriously incites violence or is just a joke shared among friends.

Considered alongside other instances where Facebook use has been linked to violence and death, I think this will be a long-running discussion about censorship on the social web.

 
