Fancy a job at Facebook?
You might imagine it's all very exciting and glamorous to be able to boast that you work for the world's biggest social network, but the reality may be rather different.
Imagine, for instance, that you're one of Facebook's in-house moderators, whose job it is to review and remove inappropriate content from the site. Yup, you might think you've won the career lottery, but the truth is that you're spending all day deciding if what someone has posted constitutes hate speech, terrorist propaganda or disturbing sexual content.
As Olivia Solon at The Guardian has previously reported, you might be being paid about $15 per hour to decide whether images of child sexual abuse, and videos of beheadings and bestiality, should be removed from Facebook or not.
Questions have been asked before regarding just how much psychological support and training Facebook gives to such moderators, but now there is a further concern - the physical safety of both the worker and their loved ones.
As The Guardian reports today, Facebook has unwittingly exposed the identities of moderating staff to suspected terrorists on the network.
There appears to have been a colossal security lapse on the social network which affected more than 1,000 Facebook workers, resulting in their personal profiles popping up as notifications in the activity logs of Facebook groups as admins were removed.
In short, when a Facebook moderator removed an administrator for breaking the site's terms of service, the personal profiles of those moderators were then shared with the group's remaining admins.
The Guardian explains further:
"Of the 1,000 affected workers, around 40 worked in a counter-terrorism unit based at Facebook’s European headquarters in Dublin, Ireland. Six of those were assessed to be “high priority” victims of the mistake after Facebook concluded their personal profiles were likely viewed by potential terrorists."
"The Guardian spoke to one of the six, who did not wish to be named out of concern for his and his family’s safety. The Iraqi-born Irish citizen, who is in his early twenties, fled Ireland and went into hiding after discovering that seven individuals associated with a suspected terrorist group he banned from Facebook – an Egypt-based group that backed Hamas and, he said, had members who were Islamic State sympathizers – had viewed his personal profile."
Sheesh. What a fail by Facebook.
It must be ghastly enough to be in a team which has to view hate content on Facebook, without having the additional threat that your identity could be unmasked to suspected terrorists.
Even if the chances of moderators themselves being physically attacked are perceived to be low, there will be fears that their families (perhaps still living in the Middle East) could be put at risk because Facebook allowed personal profiles to be revealed in such a slipshod fashion. That's a risk I wouldn't be happy to take for a mere $15 per hour.
I think the problem here is that Facebook was never designed with security and privacy in mind.
The site's raison d'etre has never been about building a safe community for friends to share and chat with each other.
Time after time Facebook has put users at risk with privacy gaffes and corporate policies designed to boost advertising revenues over the welfare of its members. If security and privacy were truly part of Facebook's DNA it's hard to imagine how something this dreadful could ever have happened.
Not that I think Facebook did this intentionally. Of course it didn't - it was clearly a monumental cock-up on its part. But it was an accident with potentially serious consequences.
A company which had security and privacy at its heart would never have allowed a mistake like this to occur.
Ironically, news of this dangerous security breach has come to light just as Facebook announces details of the steps it is taking to remove terrorist-related content.