The social media giant has updated its privacy controls and a new Firefox extension stops Facebook from snooping on your browsing.
In the wake of the Cambridge Analytica scandal, Facebook has released a major privacy update, allowing users to more easily view and edit the information the social media giant holds about them. Mozilla, meanwhile, has released a Firefox browser extension that stops Facebook from snooping on your browsing.
Both releases come at a time when the social media giant is under scrutiny over Cambridge Analytica, the political strategy company alleged to have accessed the data of some 50 million Facebook users without their permission.
The update includes a privacy shortcuts menu so users can quickly change their privacy settings, including adding two-factor authentication, deleting posts they’ve shared, liked or commented on, managing the information used to show them ads, and changing what other people can see about them.
The new feature also allows you to delete things you’ve searched for (such as users, groups, Pages or just search terms). If you want, you’re now able to download all of the information Facebook holds about you too, including photos, posts, contacts and more. If you want to move it to another service, you can do so pretty easily.
Across all devices, the social network’s settings user interface has been updated to make it easier to find privacy settings, and some “outdated” settings have been removed so it’s clear what information can and can’t be shared with apps.
“In the coming weeks, we’ll be proposing updates to Facebook’s terms of service that include our commitments to people. We’ll also update our data policy to better spell out what data we collect and how we use it,” said Erin Egan, VP and chief privacy officer of policy and Ashlie Beringer, VP and deputy general counsel.
“These updates are about transparency – not about gaining new rights to collect, use, or share data. We’ve worked with regulators, legislators and privacy experts on these tools and updates.”
In a statement, Facebook said the privacy update was made in order to comply with the EU’s upcoming General Data Protection Regulation (GDPR), due to come into force in May. The company said the majority of the updates had been on the cards for a while, but it decided to roll them out now to make clear to users that their data won’t be misused.
Firefox extension stops Facebook from snooping on your browsing
Facebook’s privacy update doesn’t stop the social media giant from tracking you around the web, however. Facebook does this using third-party cookies: embedded Like buttons, share widgets and trackers on other sites let the company see what you click on and which sites you visit, even when you’re not on Facebook itself.
To counter this, Mozilla has released a clever Firefox extension called Facebook Container that puts a stop to Facebook’s off-platform data-mining efforts.
When you install the Facebook Container extension, it will open a blue-coloured browser tab. Everything you do on Facebook in this tab will be, as the name suggests, contained in the tab. If you click on a non-Facebook link, the link will open outside of the container, letting you browse without the social network looking over your shoulder.
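Under the hood, extensions like this need to decide which URLs belong to Facebook before routing them into the isolated container (whose cookies are kept separate from the default cookie store). The sketch below is illustrative only, not the actual Facebook Container source: the domain list and function name are assumptions, and the comments describe where Firefox’s container (contextual identities) API would come in.

```javascript
// Hypothetical sketch of the hostname check a container extension needs.
// Domains listed here are an illustrative subset, not the extension's real list.
const FACEBOOK_DOMAINS = ["facebook.com", "fb.com", "messenger.com"];

function belongsToFacebook(rawUrl) {
  // Parse the URL so we compare hostnames, not raw strings --
  // "https://notfacebook.com" must NOT match "facebook.com".
  const host = new URL(rawUrl).hostname;
  return FACEBOOK_DOMAINS.some(
    (domain) => host === domain || host.endsWith("." + domain)
  );
}

// In a real extension, a webRequest listener would call a check like this
// and reopen matching URLs in a tab whose cookieStoreId points at the
// dedicated Facebook container, so Facebook's cookies never mingle with
// the cookies the rest of your browsing uses.
console.log(belongsToFacebook("https://www.facebook.com/feed")); // true
console.log(belongsToFacebook("https://example.com/"));          // false
```

The hostname comparison is the important design choice: a naive substring match would leak non-Facebook sites that merely mention the name into the container.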
Facebook Container obviously isn’t something that could have prevented the Cambridge Analytica scandal – seeing as that used data found on the social network itself. However, being logged out of Facebook in other tabs will prevent the company from following you from site to site. What it does is essentially keep Facebook from snooping on all your internet habits and shopping decisions, so you can restrict those pesky targeted ads from appearing in your news feed.
“The pages you visit on the web can say a lot about you. They can infer where you live, the hobbies you have and your political persuasion,” Mozilla wrote in a blog post. “This add-on offers a solution that doesn’t tell users to simply stop using a service that they get value from. Instead, it gives users tools that help them protect themselves from the unexpected side effects of their usage.”
Last week, in a pretty clever PR move, Mozilla announced that it would no longer be advertising on Facebook, in a bid to encourage Mark Zuckerberg to improve the social network’s privacy settings. For those who don’t want to commit to deleting their Facebook profile, Facebook Container is a convenient compromise.
You can download Facebook Container here.
Inside the Cambridge Analytica scandal
So, what was the Cambridge Analytica scandal all about anyway?
Facebook recently banned two related accounts. The first was Cambridge Analytica, a data analytics firm that built profiles of Facebook users for targeted political advertising. The second was Christopher Wylie, the co-founder of the service.
The key difference? The latter was banned after talking to journalists about the former.
But what is Cambridge Analytica accused of doing, how did it allegedly do it, and what does it all have to do with Brexit and the ascendancy of Donald Trump? Here’s a quick explainer to fill in the gaps and keep you abreast of developments in a story that’s likely to run and run.
What is Cambridge Analytica?
Cambridge Analytica is a British data analytics firm which helps political campaigns target voters online. The company (and its parent company SCL) claims to have built “5,000 data points on over 230 million American voters” allowing political campaigns to target voters susceptible to certain messages with precise accuracy.
The question is how it acquired this data – and much of it is alleged to have come from Facebook, without users’ consent.
Facebook’s announcement that it was suspending Cambridge Analytica – which stops the latter from buying ads or accessing Facebook data – appeared to be a pre-emptive response to stories breaking around the world about how Cambridge Analytica acquired data on 50 million Facebook users without their consent. The stories stemmed from interviews with Christopher Wylie.
According to Facebook, this data was provided to Cambridge Analytica by a Cambridge University researcher called Aleksandr Kogan, who started a firm called Global Science Research (or GSR). GSR created a personality quiz on Facebook called “thisisyourdigitallife” which was labelled as a research experiment to be used by scientists to build psychological profiles.
Facebook’s developer policies allow data to be used in this way – what they don’t allow is for developers to use it for other purposes, which is what Kogan is accused of doing. The data reportedly ended up in Cambridge Analytica’s hands to be used as part of its voter modelling.
In other words, 270,000 people who took what was marked as a fun, throwaway quiz were actually providing detailed information to be used by political campaigns to try and manipulate their voting intentions in the future.
But how did 270,000 turn into 50 million?
This is down to Facebook. Under the platform’s old rules, granting an app access to your Facebook data meant granting it access to your friends’ data too, provided their privacy settings weren’t considerably more locked down than the average profile’s. In that way, 270,000 quiz-takers became 50 million profiles.
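The scale of that multiplier is easy to sanity-check with some back-of-the-envelope arithmetic (illustrative only – the real reach per user varied):

```javascript
// If 270,000 quiz-takers exposed ~50 million profiles via friend access,
// each install reached roughly 185 accounts on average -- in the same
// ballpark as a typical user's friend count at the time.
const quizTakers = 270_000;
const profilesExposed = 50_000_000;

const avgProfilesPerInstall = Math.round(profilesExposed / quizTakers);
console.log(avgProfilesPerInstall); // prints 185
```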
In 2015, Facebook changed which data was accessible, making friends’ information off-limits to third-party apps. But by then the profile information was already out there, so in this particular case the company was locking the door after the data horse had bolted.
How is Facebook data useful to political campaigns?
As well as general demographic data (location, age, gender and so on) which can be used to predict voting intention (for example, in the 2017 UK general election, you were more likely to have voted Labour if you were under the age of 40), there are other interesting, and often obscure, parallels. For example, as The Guardian explains, people who liked the page “I hate Israel” on Facebook were more likely to also digitally show their appreciation of Kit Kats and Nike shoes.
As Wylie explains in the same interview: “I began looking at consumer and demographic data to see what united Lib Dem voters, because apart from bits of Wales and the Shetlands it’s weird, disparate regions. And what I found is there were no strong correlations. There was no signal in the data.
“And then I came across a paper about how personality traits could be a precursor to political behaviour, and it suddenly made sense. Liberalism is correlated with high openness and low conscientiousness, and when you think of Lib Dems they’re absent-minded professors and hippies. They’re the early adopters… they’re highly open to new ideas. And it just clicked all of a sudden.”
If you know and can talk directly to voters who are more responsive to your message, and where they live, the theory goes, you can have a serious impact on the election: you can prompt likely supporters into voting, and try to depress the turnout amongst those less likely to vote for your candidate. This is hardly a new development, but it's the first time that the full scale of how it is already being used has been revealed.
So is this a data breach, or what?
That’s how some news outlets are framing it, but that’s far from the full story. A data breach suggests the information was hacked, leaked or stolen. What actually happened was that the data was taken in a manner that was entirely within the rules Facebook created. The data was accessible to researchers, and they took it on the understanding that it was only to be used for that express purpose. It was only after it was extracted that it was allegedly passed on to Cambridge Analytica.
As Wylie explains: “Facebook could see it was happening. Their security protocols were triggered because Kogan’s apps were pulling this enormous amount of data, but apparently, Kogan told them it was for academic use. So they were like, ‘Fine’.”
As Facebook itself puts it: “The claim that this is a data breach is completely false. Aleksandr Kogan requested and gained access to information from users who chose to sign up to his app, and everyone involved gave their consent. People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked.”
This may feel like semantics, but it’s important when looking at the companies’ responses to the scandal...
Is that all Cambridge Analytica is accused of doing?
While initially the story was one of data harvesting without consent, there have been rumblings of Cambridge Analytica bending the rules around elections in order to help its clients. “Rules don’t matter for them,” Wylie told the New York Times. “For them, this is a war, and it’s all fair.
“They want to fight a culture war in America. Cambridge Analytica was supposed to be the arsenal of weapons to fight that culture war.”
Hard evidence of this has so far been hard to come by, though when the UK’s Channel 4 sent a journalist undercover posing as a wealthy Sri Lankan hoping to buy the company's services, various figures at the company suggested many options beyond data analysis. Secretly filmed, senior figures at the company claimed it had a network of ex-spies, and could use bribes and sex workers to entrap politicians.
At the end of the film, Cambridge Analytica defends itself, suggesting that the dialogue was intended to detect wrongdoing in potential clients. “We routinely undertake conversations with prospective clients to try to tease out any unethical or illegal intentions… Cambridge Analytica does not use untrue material for any purpose.”
In the follow-up show based purely on Cambridge Analytica's work in America, the firm claimed that it fed negative campaign messages into the “bloodstream of the internet” to help the Trump campaign. “We just put information into the bloodstream to the internet and then watch it grow, give it a little push every now and again over time to watch it take shape. And so this stuff infiltrates the online community and expands but with no branding – so it’s unattributable, untrackable,” Mark Turnbull, managing director of Political Global is recorded as saying.
In the same documentary, CEO Alexander Nix was recorded boasting that the company uses an email system that automatically self-destructs to leave no trace. “So you send them and after they’ve been read, two hours later, they disappear,” he explains in the film. “There’s no evidence, there’s no paper trail, there’s nothing.”
Before the footage was aired, Cambridge Analytica announced it was suspending CEO Alexander Nix pending a full investigation.
What does Facebook say about this?
Firstly, Facebook claims that Cambridge Analytica “certified” three years ago, at Facebook’s request, that it had deleted the information. The New York Times reports that at least some of it remains, which is why the company has been banned from the service.
“We are moving aggressively to determine the accuracy of these claims,” the company wrote. “If true, this is another unacceptable violation of trust and the commitments they made. We are suspending SCL/Cambridge Analytica, Wylie and Kogan from Facebook, pending further information.
“We are committed to vigorously enforcing our policies to protect people’s information. We will take whatever steps are required to see that this happens. We will take legal action if necessary to hold them responsible and accountable for any unlawful behaviour.”
What about Cambridge Analytica?
Cambridge Analytica, for its part, denies any wrongdoing. Firstly, it points to GSR as the company that broke Facebook’s terms and conditions, and claims it deleted the data as soon as it learned it wasn’t allowed access. Secondly, it denies using Facebook data in the Trump election campaign. Thirdly, it’s quite insistent that the whistleblower Christopher Wylie was a contractor and not the founder of the business, as some reports suggested.
This Twitter thread from the company expands on this point:
Reality Check: Cambridge Analytica uses client and commercially and publicly available data; we don’t use or hold any Facebook data. 1/8 — Cambridge Analytica (@CamAnalytica) March 17, 2018
And Christopher Wylie himself?
Despite CEO Alexander Nix telling MPs last month that Cambridge Analytica had not paid Global Science Research for any data, Wylie claims he has a contract and receipts of around $1 million showing the opposite.
His decision to go public, according to a friend, comes down to wanting to undo the damage he believes his work has done. “He created it. It’s his data Frankenmonster. And now he’s trying to put it right,” the friend told The Guardian.
In the same article, Wylie explains why this is important to him: “I think it’s worse than bullying, because people don’t necessarily know it’s being done to them. At least bullying respects the agency of people because they know. So it’s worse, because if you do not respect the agency of people, anything that you’re doing after that point is not conducive to a democracy. And fundamentally, information warfare is not conducive to democracy.”
Did this help elect Trump and get the UK to vote for Brexit?
Cambridge Analytica denies it used Facebook data in the Trump presidential campaign, though it reportedly had some involvement. It’s also worth noting that the president’s former chief strategist and one-time campaign chief executive, Steve Bannon, was a stakeholder in the company, and previously a vice president on its board.
That denial may be semantic in nature. In a Channel 4 exposé, representatives from the company were recorded boasting that they were the ones who got Donald Trump elected 45th president of the United States. Executives were recorded saying they “ran all the digital campaign, the television campaign and our data informed all the strategy” for the Trump campaign, including informing the “Defeat Crooked Hillary” brand of attack ads.
Reports claimed that Cambridge Analytica was also used by the Leave campaign in the EU referendum, but testimony from Arron Banks claims the company only tendered a proposal and ultimately wasn’t hired. That account appears to be contradicted by since-deleted tweets suggesting the relationship went deeper.
Conflicting reports, then, but looking at the question more generally, can social profiling help sway elections? The disappointing answer to that question is twofold: First, it depends who you ask, and second, nobody really knows.
To the first point, the answer varies even within Facebook. The company, until recently, had a whole page devoted to how an ad campaign helped the SNP win big at the 2015 UK general election. As former Facebook advertising executive Antonio Garcia Martinez said in 2016, “It’s crazy that Zuckerberg says there’s no way Facebook can influence the election when there’s a whole sales force in Washington DC that does nothing but convince advertisers that they can.”
But then it’s in the interests of Facebook’s ad department to say that, isn’t it? Real-world evidence is pretty hard to come by. Yes, Facebook’s own peer-reviewed research has shown that a simple “I voted” badge can boost voter turnout by prompting friends to do the same. In theory, the company could use that to tactically boost turnout in some regions while suppressing it in others, but these options are (understandably) not open to advertisers. More importantly, you can’t run two identical elections with a third control election to test the theory.
Governments likely won’t like this. What’s going to happen to Facebook?
The Washington Post suggests that Facebook is likely to be investigated by the FTC over whether it adequately protected users’ data. The likely outcome of that is “massive fines.”
But more generally, this is likely a bit of a wakeup call to legislators about the power of internet giants and the importance of robust data protection laws – and it’s entirely possible that more regulation is on the way. Just today The Telegraph led with the news that digital minister Matt Hancock has declared that greater regulation of Facebook is required.
Downing Street has also gotten involved: “The allegations are clearly very concerning, it’s essential people can have confidence that their personal data can be protected and used in an appropriate way,” Theresa May’s spokesman said. “So it is absolutely right the information commissioner is investigating this matter and we expect Facebook, Cambridge Analytica and all the organisations involved to cooperate fully.”
Is that just hot air? Very possibly. Trying to regulate internet giants after years of letting them do their thing was never going to be easy.
But with the scandal likely to have irked multiple governments around the world, the opportunity for collaborative action makes a shift in the balance of power more likely than it’s been for years.