From The Psychometrics Centre at the University of Cambridge: Your favourite movies, music, books and more can reveal private attributes about you, such as your personality, intelligence or sexual orientation. This tool uses the Pages you like on Facebook to predict your psycho-demographic profile. It will explain which aspects of your digital footprint contribute to the way that others see you and is based on opt-in psychological ground truth and social media profiles from over 6 million volunteers.
From ProPublica: Facebook has a particularly comprehensive set of dossiers on its more than 2 billion members. Every time a Facebook member likes a post, tags a photo, updates their favorite movies in their profile, posts a comment about a politician, or changes their relationship status, Facebook logs it. Facebook also buys data about its users’ mortgages, car ownership and shopping habits from some of the biggest commercial data brokers. Facebook uses all this data to offer marketers a chance to target ads to increasingly specific groups of people. We built a tool that works with the Chrome Web browser that lets you see what Facebook says it knows about you.
Facebook provides a feature that enables users to download their Facebook data, including timeline posts and activity logs. You can also review your ad profile.
Facebook has recently launched About Facebook Adverts, which endeavours to explain the 98 personal data points it uses to target advertising.
Facebook is an enormously powerful corporation, harnessing both the self-disclosed and gleaned personal data of almost 2 billion people. Its user-base is larger than the population of any country. The company is all pervasive online, tracking and profiling users and non-users alike. Cracking the Code looks at the insides of this giant machine and how Facebook turns your thoughts and behaviours into profits—whether you like it or not. And it’s not just a one-way transaction either. Cracking the Code also explains how Facebook uses vast troves of web data to manipulate the way you think and feel, as well as act—all in the sole interests of Facebook, masquerading as “community.” What are the social implications of this—when one company basically controls the insights and experiences of the entire online world, with extremely personalised and targeted social and behavioural engineering on a scale never before seen?
A playlist of 26 videos from Privacy International. Starting with What Is Privacy? This video is a high level overview of why privacy is important. Privacy ensures that democracy is possible by allowing citizens to have safe spaces to discuss ideas, grow, and learn.
More information here
Privacy International investigates the secret world of government surveillance, exposes the companies enabling it, and litigates to ensure that surveillance is consistent with the rule of law. Check out their Privacy 101 Explainers series, which details how these important issues affect your life.
Does privacy matter? The line between public and private has blurred in the past decade, both online and in real life. Alessandro Acquisti explains what this means and why it matters. In this thought-provoking, slightly unnerving talk, he shares details of recent and ongoing research — including a project that shows how easy it is to match a photograph of a stranger with their sensitive personal information.
From Glenn Greenwald: It is true that as human beings, we’re social animals, which means we have a need for other people to know what we’re doing and saying and thinking, which is why we voluntarily publish information about ourselves online. But equally essential to what it means to be a free and fulfilled human being is to have a place that we can go and be free of the judgmental eyes of other people… Now, there’s a reason why privacy is so craved universally and instinctively. It isn’t just a reflexive movement like breathing air or drinking water. The reason is that when we’re in a state where we can be monitored, where we can be watched, our behavior changes dramatically. The range of behavioral options that we consider when we think we’re being watched severely reduce.
See also Tijmen Schep’s Social Cooling
From Tristan Harris: if Facebook had a choice between showing you the outrage feed and a calm newsfeed, they would want to show you the outrage feed, not because someone consciously chose that, but because that worked better at getting your attention… I don’t know a more urgent problem than this, because this problem is underneath all other problems. It’s not just taking away our agency to spend our attention and live the lives that we want, it’s changing the way that we have our conversations, it’s changing our democracy, and it’s changing our ability to have the conversations and relationships we want with each other.
Paul Lewis speaks to Justin Rosenstein (who helped to develop the ‘Like’ button) and Loren Brichter (who designed the pull-to-refresh feature) who, armed with hindsight, are critical of the tools and features they developed. Rosenstein notes that it is very common for humans to develop things with the best of intentions that have unintended, negative consequences. Brichter adds: I have two kids and I regret every minute that I’m not paying attention to them because my smartphone has sucked me in… Smartphones are useful tools, but they’re addictive … I regret the downsides.
Related: Leah Pearlman (who also worked on the development of the ‘Like’ button) reflects on a validation-powered web: the experience of external validation compared to true inner validation doesn’t even compare. Watch Leah’s TED talk on internal/external validation here.
From Matt Haig writing in The Guardian: We are traditionally far better at realising risks to physical health than to mental health, even when they are interrelated. If we can accept that our physical health can be shaped by society – by secondhand smoke or a bad diet – then we must accept that our mental health can be too.
From Franklin Foer writing in The Guardian: In reality, Facebook is a tangle of rules and procedures for sorting information, rules devised by the corporation for the ultimate benefit of the corporation. Facebook is always surveilling users, always auditing them, using them as lab rats in its behavioural experiments. While it creates the impression that it offers choice, in truth Facebook paternalistically nudges users in the direction it deems best for them, which also happens to be the direction that gets them thoroughly addicted. It’s a phoniness that is most obvious in the compressed, historic career of Facebook’s mastermind.
From NPR’s Hidden Brain: Millions of people around the world use social media every day to stay in touch with friends and family. But ironically, studies have shown that people who spend more time on these sites feel more socially isolated than those who don’t… we explore the psychological effects that social media has on us, and how FOMO — or, the fear of missing out — can lead to actually missing out.
From BBC’s Panorama: Facebook is thought to know more about us than any other business in history, but what does the social network that Mark Zuckerberg built do with all of our personal information? Reporter Darragh MacIntyre investigates how Facebook’s powerful algorithms allow advertisers and politicians to target us more directly than ever before, and he questions whether the company’s size and complexity now makes it impossible to regulate.
In this film, Aleks Krotoski travels the world to undergo challenges that explore our digital life in the 21st century. Watch her be stalked and hacked, fight to get leaked documents back, dive into open data and live in a futuristic home that monitors her every move.
From The Guardian, a detailed series that includes leaked guidelines for moderators – the people who monitor content on Facebook and decide what is appropriate, and what is not. The guidelines suggest that it is acceptable to post threats against women and children. Videos of self-harm are also acceptable as the site doesn’t want to censor or punish people in distress.
By Julia Angwin at ProPublica: A trove of internal documents sheds light on the algorithms that Facebook’s censors use to differentiate between hate speech and legitimate political expression.
From Vicki Boykis: Facebook collects data about you in hundreds of ways, across numerous channels. It’s very hard to opt out, but by reading about what they collect, you can understand the risks of the platform and choose to be more restrictive with your Facebook usage.
Mozilla hosted the following discussion with Tim Wu (who coined the term “net neutrality”): You don’t need cash to search Google or to use Facebook, but they’re not free. We pay for these services with our attention and with our data. While advertising-supported media was once confined to a small part of our lives like newspapers and radio, our work and lives are increasingly online and ads take the front row in our daily lives. This business model can have a democratizing effect: it makes products and information accessible to many more people, who might otherwise be priced out. But it also means that the main audience for these companies is not you – the person using their services – but rather, advertisers who keep the lights on.
History is punctuated by acts of refusal and outright revolt against this model, from the invention of the remote control, to the more recent rise of cord-cutting and ad-blocking software. Yet, whenever the attention merchants have seemed to lose their charm, they’ve always found a way to reinvent themselves and to recapture us. What does this mean for the future of the open Internet?
Talk begins at 5:20
From Julia Angwin: the algorithm is making the decision that the newspaper editors used to make about what to put on the front page, so that’s a fundamental difference in how you experience news; human decision-making versus automated decision-making, and they both have their benefits and downsides.
From Eli Pariser: And the thing is that the algorithms don’t yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they’re going to decide what we get to see and what we don’t get to see, then we need to make sure that they’re not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important.
From Alexis Madrigal: The informational underpinnings of democracy have eroded, and no one has explained precisely how… We’ve known since at least 2012 that Facebook was a powerful, non-neutral force in electoral politics… If every News Feed is different, how can anyone understand what other people are seeing and responding to?… The information systems that people use to process news have been rerouted through Facebook, and in the process, mostly broken and hidden from view. It wasn’t just liberal bias that kept the media from putting everything together. Much of the hundreds of millions of dollars that was spent during the election cycle came in the form of “dark ads.”
The truth is that while many reporters knew some things that were going on on Facebook, no one knew everything that was going on on Facebook, not even Facebook.
Report from the Children’s Commissioner for England: The supposedly ‘public space’ of the internet is almost entirely controlled by a series of global private companies with too little responsibility towards children, operating significantly beyond the reach of national laws.
John Naughton writing in The Guardian: Facebook’s proprietary algorithms choose which items to display in a process that is sometimes called “curation”. Nobody knows the criteria used by the algorithms – that’s as much of a trade secret as those used by Google’s page-ranking algorithm. All we know is that an algorithm decides what Facebook users see in their news feeds.
From ProPublica: Consumer data companies are scooping up huge amounts of consumer information about people around the world and selling it, providing marketers details about whether you’re pregnant or divorced or trying to lose weight, about how rich you are and what kinds of cars you drive. But many people still don’t know data brokers exist.
Additionally, Facebook gathers data on its users from external sources, which it describes vaguely as “a few different sources” but which in real terms means data brokers. This enables Facebook to develop detailed user models incorporating both their online and offline lives.
…on the basis of an average of 68 Facebook “likes” by a user, it was possible to predict their skin color (with 95 percent accuracy), their sexual orientation (88 percent accuracy), and their affiliation to the Democratic or Republican party (85 percent). But it didn’t stop there. Intelligence, religious affiliation, as well as alcohol, cigarette and drug use, could all be determined. From the data it was even possible to deduce whether someone’s parents were divorced.
Our smartphone…is a vast psychological questionnaire that we are constantly filling out, both consciously and unconsciously. Above all, however—and this is key—it also works in reverse: not only can psychological profiles be created from your data, but your data can also be used the other way round to search for specific profiles.
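The two directions described above – predicting a trait from likes, and searching stored profiles for people matching a trait – can be sketched with a toy logistic-regression scorer. Everything here is invented for illustration: the page names, the weights, and the trait itself are hypothetical, not taken from the cited research.

```python
import math

# Hypothetical like-based trait prediction: each page-like is a binary
# feature, and a pre-trained weight per page scores how strongly that
# like signals the trait. All weights below are made up for the sketch.
WEIGHTS = {
    "page/cosmetics-brand": -0.8,
    "page/heavy-metal-band": 0.6,
    "page/philosophy-quotes": 1.1,
    "page/monster-trucks": 0.4,
}
BIAS = -0.2

def predict(likes):
    """Return P(trait) for a set of liked pages (logistic regression)."""
    score = BIAS + sum(WEIGHTS.get(page, 0.0) for page in likes)
    return 1.0 / (1.0 + math.exp(-score))

def search_profiles(profiles, threshold=0.7):
    """The reverse direction: scan stored like-sets for users whose
    predicted probability of the trait exceeds a threshold."""
    return [uid for uid, likes in profiles.items()
            if predict(likes) >= threshold]

profiles = {
    "alice": {"page/philosophy-quotes", "page/heavy-metal-band"},
    "bob": {"page/cosmetics-brand"},
}
print(search_profiles(profiles))  # ['alice']
```

The point of the sketch is that the same fitted weights serve both uses: scoring one person, or sweeping a database for everyone who scores highly – which is exactly the “search for specific profiles” reversal the quote describes.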
Research article by Shoshana Zuboff: This article describes an emergent logic of accumulation in the networked sphere, ‘surveillance capitalism,’ and considers its implications for ‘information civilization.’… Facebook has consistently made inroads here too, as it conducts experiments in modifying users’ behavior with a view to eventually monetizing its knowledge, predictive capability, and control.
From Sir Tim Berners-Lee on the World Wide Web’s 28th birthday:
1) The current business model for many websites offers free content in exchange for personal data. Many of us agree to this – albeit often by accepting long and confusing terms and conditions documents – but fundamentally we do not mind some information being collected in exchange for free services. But, we’re missing a trick. As our data is then held in proprietary silos, out of sight to us, we lose out on the benefits we could realise if we had direct control over this data, and chose when and with whom to share it. What’s more, we often do not have any way of feeding back to companies what data we’d rather not share – especially with third parties – the T&Cs are all or nothing.
2) Today, most people find news and information on the web through just a handful of social media sites and search engines. These sites make more money when we click on the links they show us. And, they choose what to show us based on algorithms which learn from our personal data that they are constantly harvesting.
3) Political advertising online has rapidly become a sophisticated industry. The fact that most people get their information from just a few platforms and the increasing sophistication of algorithms drawing upon rich pools of personal data, means that political campaigns are now building individual adverts targeted directly at users. One source suggests that in the 2016 US election, as many as 50,000 variations of adverts were being served every single day on Facebook, a near-impossible situation to monitor.
Mozilla polled 30,000+ Internet users about privacy and security online. This is what they said:
- Over 90% of survey participants said they don’t know much about protecting themselves online.
- Only 1 in 10 survey participants feel like they have total control over their personal information online. Nearly a third of respondents feel like they have no control at all over their personal information online.
- Nearly 1 in 3 survey participants said they know either very little, or nothing at all, about encryption.
- 8 out of every 10 respondents fear being hacked by a stranger.
- 61% of respondents are concerned about being tracked by advertisers.
- 60% of respondents own more than four devices connected to the Internet.
- Only 1 in 3 survey participants are willing to attend training about secure tools.
Ross Anderson, Professor of Security Engineering at Cambridge University, provides a multi-decade view of the evolving interaction between security, economics, psychology, cybercrime, and propaganda.
[Video – starting at 5:19]
When you look at systems like Facebook, all the hints and nudges that the website gives you are towards sharing your data so it can be sold to the advertisers. They’re all towards making you feel that you’re in a much safer and warmer place than you actually are. Under those circumstances, it’s entirely understandable that people end up sharing information in ways that they later regret and which end up being exploited. People learn over time, and you end up with a tussle between Facebook and its users whereby Facebook changes the privacy settings every few years to opt everybody back into advertising, people protest, and they opt out again. This doesn’t seem to have any stable equilibrium.
From Anil Dash: The tech industry and its press have treated the rise of billion-scale social networks and ubiquitous smartphone apps as an unadulterated win for regular people, a triumph of usability and empowerment. They seldom talk about what we’ve lost along the way in this transition, and I find that younger folks may not even know how the web used to be.
From Aral Balkan: data is not a currency, data is not the new oil, data is people…if I have enough information about you I can simulate you and then the question is who owns that simulation of you? We must regulate corporations whose core business is the systemic violation of human rights.
Talk begins at 12:20
From Me and My Shadow: Debunking common arguments around data and privacy: “I’ve got nothing to hide”; “Who cares if people know I eat cornflakes for breakfast?”; “I’m just one in millions… how could anyone see me?” and others…
From Mozilla’s (makers of Firefox) Internet Health Report: Networks like Facebook, WhatsApp and WeChat serve important social functions that people value highly. But they are largely closed gardens, controlled by a handful of companies that have outsize influence over what people see and do online.
Use the link below to see what’s helping (and hurting) this valuable, global resource.
Michael Bazzell (a security consultant) demonstrated to Gizmodo how to check if you may be revealing important pieces of information to Facebook without knowing it. Visit this page and select the Facebook link on the left side of the page. Enter your Facebook username into the FB User Name field – this will return your profile number (a long series of digits). Then log into Facebook and try the URLs highlighted in the image to the right – inserting your own profile number for userID.
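The username-to-number step can be sketched in a few lines. Note the assumptions: older versions of Facebook’s public Graph API answered `https://graph.facebook.com/<username>` with a JSON object containing the numeric id, but current API versions require an access token, so treat the endpoint and the response shape here as assumptions for illustration, not a working recipe (no network call is made; the sample response is invented).

```python
import json
from urllib.parse import quote

def lookup_url(username):
    """Build the (historic) public Graph API URL for a username."""
    return "https://graph.facebook.com/" + quote(username)

def extract_profile_id(response_body):
    """Pull the numeric profile id out of a Graph API JSON response."""
    return json.loads(response_body)["id"]

# A response of the shape the old endpoint returned (sample data):
sample = '{"id": "4", "name": "Mark Zuckerberg"}'
print(lookup_url("zuck"))          # https://graph.facebook.com/zuck
print(extract_profile_id(sample))  # 4
```

Once you have the numeric id, it is the `userID` you substitute into the URLs Bazzell highlights.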
From Stratechery: the fact that communication is possible on other platforms is to ignore the fact that communication will always be easiest on Facebook, because they own the social graph. Combine that with the fact that controlling consumption is about controlling billions of individual consumers, all of whom will, all things being equal, choose the easy option, and you start to appreciate just how dominant Facebook is… Facebook is too dominant: its network effects are too strong, and its data on every user on the Internet too compelling to the advertisers other consumer-serving businesses need to be viable entities… Facebook’s great power does not entail great responsibility; said power ought to entail the refusal to apply it, no matter how altruistic the aims, and barring that, it is on the rest of us to act in opposition… serious attention should be given to Facebook’s data collection on individuals. As a rule I don’t have any problem with advertising, or even data collection, but Facebook is so pervasive that it is all but impossible for individuals to opt-out in any meaningful way, which further solidifies Facebook’s growing dominance of digital advertising.
The Data Selfie Chrome extension explores our relationship to the online data we leave behind as a result of media consumption and social networks. In the modern age, almost everyone has an online representation of oneself and we are constantly and actively sharing information publicly on various social media platforms. At the same time we are under constant surveillance by social media companies and “share” information unconsciously. How do our data profiles, the ones we actively create, compare to the profiles made by the machines at Facebook, Google and Co. – the profiles we never get to see, but unconsciously create?
Why does Facebook need your phone number or your phone contacts or the data from your WhatsApp account? Why does Facebook track the time you spend looking at the posts in your news feed? Is the sole purpose of this data gathering to serve us more relevant ads? Is there something else afoot?
Data Selfie is an application that aims to provide a personal perspective on data mining, predictive analytics and our online data identity – including inferred information from our consumption. In our data society algorithms and Big Data are increasingly defining our lives. Therefore, it is important – especially for those who are indifferent to this issue – to be aware of the power and influence your own data has on you.
From the EFF: Panopticlick will analyze how well your browser and add-ons protect you against online tracking techniques. We’ll also see if your system is uniquely configured—and thus identifiable—even if you are using privacy-protective software.
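The idea behind that uniqueness test can be sketched as follows. Each browser attribute value carries “surprisal” of -log2(p) bits, where p is the share of browsers reporting that value; the bits add up across attributes, and roughly 33 bits is enough to single out one browser among eight billion. The attribute names and population frequencies below are invented for the sketch, not EFF’s measured data.

```python
import hashlib
import math

def fingerprint(attributes):
    """Hash the concatenated attribute values into a stable identifier."""
    blob = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(blob.encode()).hexdigest()[:16]

def identifying_bits(value_frequencies):
    """Sum the surprisal -log2(p) of each observed attribute value."""
    return sum(-math.log2(p) for p in value_frequencies)

attrs = {
    "user_agent": "Mozilla/5.0 (X11; Linux) Firefox/57.0",
    "screen": "1920x1080x24",
    "timezone": "UTC+1",
    "fonts": "Arial,Helvetica,DejaVu Sans",
}
# Invented population frequencies for each of those values:
freqs = [0.05, 0.20, 0.15, 0.001]
print(fingerprint(attrs))                  # stable hash of the attributes
print(round(identifying_bits(freqs), 1))   # total identifying bits
```

This is why “privacy-protective software” alone may not help: an unusual configuration (the rare font list above contributes almost 10 bits on its own) can make a browser more identifiable, not less.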
From Open Whisper Systems: Using Signal, you can communicate instantly while avoiding SMS fees, create groups so that you can chat in real time with all your friends at once, and share media all with complete privacy. The server never has access to any of your communication and never stores any of your data.
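The claim that “the server never has access to any of your communication” rests on end-to-end key agreement. A toy Diffie-Hellman exchange shows the principle: the server relays only public values, and the key both parties derive never exists server-side. This is a classroom sketch with a small demo prime and no authentication, not Signal’s actual X3DH/Double Ratchet protocol.

```python
import hashlib
import secrets

# Small Mersenne prime for the demo -- far too small for real use.
P = 2**127 - 1
G = 5

def keypair():
    """Generate a Diffie-Hellman (private, public) pair."""
    private = secrets.randbelow(P - 2) + 1
    return private, pow(G, private, P)

# Alice and Bob each generate a keypair; ONLY the public halves
# transit the server.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
server_sees = [a_pub, b_pub]

# Each side combines its own private key with the other's public key;
# both arrive at the same shared secret, hashed into a session key.
a_key = hashlib.sha256(str(pow(b_pub, a_priv, P)).encode()).hexdigest()
b_key = hashlib.sha256(str(pow(a_pub, b_priv, P)).encode()).hexdigest()

print(a_key == b_key)   # True: both sides derive the same key
```

Recovering the key from `server_sees` would require solving the discrete logarithm problem, which is what keeps the relay blind to message contents.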