Big Data, Privacy and Ethics
19th June, 2015 —
I hope you’ve recovered from last week’s mega roundup! This week’s roundup brings you the politics of privacy, the sharing economy, and how a societal shift could save us all.
Big Data
“Big Data” is the buzzphrase of the decade, but what does it mean? It’s been a few weeks since we shared a Do Not Track episode (we’ve got some catching up to do!). Episode 5, Big Data: Inside The Algorithm, explains “Big Data” in simple terms with another of their fantastic interactive documentaries.
The politics of privacy
In an article in The Telegraph, Jamie Bartlett looks at the politics of privacy and technology, alongside how our awareness of internet privacy has increased:
“Broadly speaking, the left are fearful that state surveillance might encroach on the individual’s right to privacy. The right worry about how much power will pass to state bureaucracy. At the radical libertarian fringes, many believe that anonymity online will eventually lead to a stateless utopia. There are many others who don’t fit anywhere, but simply believe society is better served if people have a protected sphere of personal and private life.”
Jamie is cynical about the ability of new technology to make a difference, but emphasises the importance of privacy for everyone:
“it’s not just in the more hostile parts of the world that privacy matters. In most democratic societies, privacy creates a sphere of freedom for the individual, which allows for political, social and personal expression.”
Babak Siavoshy argues that whether or not people care about their privacy, it should still be protected: “while privacy surveys can be very valuable, it’s not clear that they are relevant to key policy questions about whether and how we should protect privacy.” Babak makes the point that privacy is like voting: you don’t take away people’s right to vote just because they don’t vote. It’s a collective right that creates space for individual freedom. Viewing privacy as a collective right also emphasises that infringing on it doesn’t just harm an individual; it causes societal harm.
In a great article on ethics in technology, Rafaël Vinot identifies three types of people who are aware of the extent of our privacy and security problems:
“There [are] those who want to make money off the disasters and don't care, the governments and institutions who want to use these problems for their power, and, I hope, the largest but (for now) least powerful group: the techies who want to fix infrastructure and inform the public.”
Rafaël goes on to discuss the differences between the people who take advantage and the people who care. Big data is collected and used by big corporations and governments because the everyday person isn’t well-informed enough to understand the technology, or to stop it before it goes mainstream. And nobody in government or the corporations is held to account for infringing human rights, because we have no way to hold them responsible.
Edward Snowden brought awareness of privacy issues into the mainstream. DuckDuckGo’s founder Gabriel Weinberg told The Guardian this week that the use of DuckDuckGo’s search engine, a non-tracking alternative to Google search, has soared 600% since the Snowden revelations.
In revealing the extent of government mass surveillance, Snowden has made many enemies in government. Over the weekend the Sunday Times published anti-Snowden UK government propaganda under the guise of journalism. Glenn Greenwald went on multiple news programmes, pointing out the flaws in the Sunday Times’ piece and the irresponsibility of journalists who disseminate this kind of unsubstantiated and unattributed information.
The reason the UK is attacking Snowden? The UK wants to keep its mass surveillance powers. David Anderson QC, the UK’s independent reviewer of terrorism legislation, published a report this week, stating the “United Kingdom’s surveillance regulations are obscure, incomprehensible, unnecessary, and ‘undemocratic’.” One of the report’s many recommendations is that warrants to intercept communications should have judicial authorisation, rather than the current system where warrants are issued by government ministers. Home Secretary Theresa May said that she would consider the report, but given her stance on the Snoopers’ Charter, we’re not hopeful.
In fact, Theresa May seems to have looked at the secrecy around trade agreements, and decided that the UK government would benefit from secrecy around the Snoopers’ Charter too. She has refused to share the full details of the charter with law enforcement agencies or communications companies “raising fresh fears that she is seeking to limit dissent in order to steamroller the controversial laws through parliament.”
In response to this kind of government behaviour, more than 25 civil society organisations made a joint statement at the 29th session of the UN’s Human Rights Council this week, calling on all governments to promote the use of strong encryption technologies, and to protect the right to seek, receive, and impart information anonymously online. They laid out specific recommendations for both governments and technology companies, which would go a long way to protecting human rights in a digital era.
Protecting yourself online
Jon Alexander explains why he is more scared of Silicon Valley than of the Islamic State. He argues that a Marc Andreessen interview shows that we, as humans and individuals, are treated as mere consumers (and products) by Silicon Valley.
A fantastic essay by Umair Haque points out that the “sharing economy” services coming out of Silicon Valley are actually creating a “Servitude Bubble”, in which a privileged few use such services “like neofeudal masters, lords with a cornucopia of on-demand just-in-time luxury services at their fingertips,” while a very large number of people serve as “glorified neo-servants…butlers, maids, chauffeurs, waiters, etcetera.” The sharing economy may create jobs, but they are low-end, deskilled service jobs with no way out or up. Umair explains that these businesses do nothing to enhance society or human potential, or to create any real meaning. In fact, they take away from projects that could benefit society, because they deplete the available finances and the people with the skills to create such products and services.
As Jon Alexander points out, the danger is that we’re letting Silicon Valley treat us this way, becoming our own worst enemies:
“We need to move on, stepping up to the opportunity to explore our agency and identity not just as Consumers, but as something more like Citizens – active participants in shaping the context of our lives and explorers of the collective interest, not just choosers acting in immediate, narrow, material self-interest.
We need to seize the moment to structure and regulate these organisations as what they are – public infrastructure – not let their exploitation of us gradually increase, like the proverbial frog in the saucepan.”
As a society
Rafaël Vinot’s article also emphasises the need for a societal shift: “It is going to be hard, take time, and require a lot of talking.” But it’s also the responsibility of those of us who work with technology: “the ones behind computers digging into the li[v]es of others [should be] getting more conscious of what they are doing.” Certification and monitoring won’t make us better people; the only way we can be sure we won’t take advantage of what technology provides us is to be more ethical. Rafaël suggests we need some kind of code of conduct. Not necessarily a legally enforceable one, which could have loopholes, but something more like the Hippocratic Oath: “something that can evolve fast enough so that we can call out the ones behaving unacceptably without having to wait for a legal framework to bootstrap every time someone is being a dick.” We need to become a real community of technologists who care about people:
“Ultimately, what we need is to raise the consciousness of our friends and peers. Let's use the nasty words: we need to be more political, more responsible, and to use our power for good.”
Shouldn’t we just read the small print?
Reading all the small print won’t make us more aware of what’s going on with the products and services we use. Alex Hern spent a week not doing anything without first reading the terms and conditions, discovering that many were out of date and generally (often intentionally) hard to read. The key feature of all the agreements is that if you don’t agree with them, you don’t get to use that product or service. There’s no negotiation. It’s their way or the highway.
What are those cheeky corporations up to this week?
We missed our weekly catch-up on dodgy Silicon Valley last week. So this week we have a bumper crop of threatening behaviour and human rights infringements from our favourite corporations:
Amazon and Google
Amazon and Google are racing to get our DNA data into their cloud storage. Academic institutions and healthcare companies find the fast, cheap, and easily-shared cloud services much more efficient than their in-house analysis and storage solutions. While these may seem like savvy investments on the part of the institutions and healthcare companies, there’s no mention of the privacy implications of Amazon and Google having access to this data, let alone the fact that DNA data is particularly sensitive.
Twitter
Last November, it was revealed that Twitter would be collecting data on the other apps that users download on to their phones. This week, Twitter announced that advertisers will now serve ads to Twitter users based on those apps, allowing Twitter to make money from that data. It’s pretty scary that Twitter has access to that kind of information, but what makes it worse is that everyone is opted in to this tracking by default. You can opt out, but only by deselecting the “Tailor Twitter based on my apps” option in settings.
Google
Google has released an internet-streaming home security camera called Nest Cam, the product of its acquisitions of both Nest and Dropcam. With its “Nest Aware” technology, Google is also offering to record up to 30 days of video, with audio, to the cloud, constantly analysing the information. While Nest states that it protects your privacy and won’t sell your data or use it to sell you ads, Nest users can share their data with Google, whether they understand the implications or not.
Time for some good news
Bad news from big corporations doesn’t really give us many reasons to be optimistic, so I’m hoping that this new segment I’m calling “Time for some good news” will have more cheerful updates in the future…
Uber
The California Labor Commission has found that a driver for Uber in San Francisco is an employee of the company. This sets a huge precedent that could lead to Uber drivers being given their due employment rights, such as employee benefits and labour protections. Currently, Uber drivers are treated as independent contractors, which means they have to pay all operating costs themselves, including petrol, insurance, car cleaning and maintenance. These drivers are part of the “Servitude Bubble” mentioned above. Here’s hoping this ruling leads to positive change for Uber drivers, and for other people working in the “sharing economy.”
Apple
In a video at a conference in Barcelona, Edward Snowden said that Apple’s business model and Tim Cook’s stance on privacy are “a good thing for privacy” and “a good thing for customers.” He went on to say that we need to support, incentivise, and emulate organisations that are willing to innovate and take strong stances in favour of privacy. Snowden warned that we should be wary of companies potentially reversing their stances on privacy and betraying our trust, but said that he “would like to think that based on the leadership that Tim Cook has shown on this position so far, he’s spoken very passionately about private issues, that we’re going to see that continue and he’ll keep those promises.”
If you know of any more positive news in privacy and ethical technology, please let me know. I’d love for this segment to grow!