Brave New World
Three years ago, we started Schnailmail with an important, important but simple goal: to bring the world together, one letter, one postcard to a loved one, one package at a time. And boy, have we delivered, a million times over. Today, we mark the one billionth letter sent, and I want to share it with you right now. It's from Susan Bennett to her mum. Let's see what Susan wrote. It says, "Mummy". That's so cute! "John and I are loving Barbados". Disposable income. Guys, disposable income. "I hope you're feeling better." Can we find out what's wrong with Susan's mum and just make sure her insurance company knows too? "See you soon," it says. Let's put some coupons for travel in there as well. "Love, Susan."
This is why I love doing what we do. It allows us to share in such intimate moments in your lives, and the more that we get to know you, the more we realise just how much more we want to get to know you.
And that's why last year we launched Schnailbox. Your real mailbox connected to the Cloud. Because, who has time to read mail? Let us read it for you. And you loved it. I love hearing from people every day who tell me how much they enjoy us doing the thinking for them. And we and our partners love doing that for you. It just makes us happy. And together, we have disrupted the dinosaurs of the old world. When the last Post Office closed its doors this year, barely anyone noticed. When FedEx and UPS joined the swathes of little independent couriers on the bankruptcy pile, nary a tear was shed. Where were the human rights terrorists then? We don't know.
And this has been a most amazing story, and I'm so glad that you've been part of our story, and today the Schnailmail family is excited to announce the next chapter in the story of the Schnail. Today, we're excited to announce that we've been acquired by Google.
(clock ticking; alarm bell rings).
Good morning. Boy, that was a nightmare, huh? Welcome to the Indie Tech Summit. In 2010, the New Yorker ran a piece about Mark Zuckerberg in which he admitted to an IM conversation he'd had with a friend at Harvard when he first started Facebook. This was when there were about four thousand people on what was then The Facebook. This was that conversation. So, Zuck says, "If you ever need info about anyone at Harvard, just ask. I have over four thousand emails, pictures, addresses, SNS"…I think he meant SMS; it was a thing back then. And his friend says, "What? How did you manage that one?" And he says, "People just submitted it. I don't know why. They trust me. Dumb fucks." Profound words, dumb fucks, and of course this was when Zuck was young.
I'm sure that the intervening years of astronomic wealth and power have only made him more reflective in his views. Although, I don't think that's what's happened, if the latest news about Facebook is anything to go by. Just this week, we found out that Facebook ran a study a few years ago, which they have now published, on emotional contagion. They used about eight hundred thousand unwitting, unknowing Facebook users for this experiment, showing them either only positive posts from their friends or only negative ones, to see how it would influence their emotional state.
Now this is a big deal, right? The manipulation of emotion is no small thing. An estimated sixty per cent of suicides are preceded by a mood disorder, according to a New York Times article. And the author of the study actually said that he was sorry. But here's the apology: "My co-authors and I are very sorry for the way the paper described the research and any anxiety caused." Now, there's a reason it's worded that way: they can't be sorry about the thing itself, because that's their business model. So Sheryl Sandberg said this week that the "Facebook study was poorly communicated." But she was quite transparent about it; she said this was part of the ongoing research companies do. This was just the way things are. We apologise, right? But it was poorly communicated; that's what we're apologising for. We never meant to upset you. She could've concluded…dumb fucks.
And this is not an anomaly. We have to understand that this is not an anomaly. Michael Novak, a product manager at Facebook, recently said, "Now we're thinking about privacy as a set of experiences that help people feel comfortable." Comfortable with what? With the fact that you don't actually have privacy. And still, our mainstream media are asking questions like: should Facebook manipulate users?
Let me think about that for a while. Maybe not. Maybe we shouldn't be doing that. But it's not just Facebook. Let's look at Nest, who make these beautiful, beautiful home appliances that know when you're home and adjust the temperature and settings accordingly, and know when you're away from home. Nest got bought by Google in January of this year, and when that happened, Nest said that their customer data would not be shared with Google; it would only be used for Nest products. That was on January the thirteenth.
The problem is that we keep getting shocked; we keep getting surprised. We keep thinking this is a bug, and it's because we don't understand the problem. Some of us still harbour the misunderstanding that Web 2.0, for example, was about freedom. Remember open APIs? That if we all just used open APIs, we could build an open web? I fell for it. I started creating Twitter clients. But what I was doing was adding value to a closed silo, which is what Twitter is. An open API key is a key to a lock that you don't own, a lock that can be changed at any time. So what we were doing in Web 2.0 was adding value to closed silos, to the same corporate surveillers that run the internet today, but we were told we were doing it in the name of openness.
I have an issue with that. Today, we're seeing the same people, the O'Reillys and so on, telling us that we should embrace the Internet of Things as long as it has open APIs. So today, with the Internet of Things, we have Things that feed into closed silos, into the companies that practise corporate surveillance. Without independent alternatives, the Internet of Things as it stands today is just an evolution of Web 2.0. It is the Internet of Things that spy on you. The difference is, these are things that we wear on our persons and that we invite into our homes. What's happened here is that with Web 2.0, we started tapping the arteries of data, and life was good. But we've got that now.
So what's the next step? We need more data. We need to start tapping the capillaries. The capillaries are the Internet of Things. We have to understand that this is not a bug. Corporate surveillance is a business model. It is the business model of spying on you. This is why they never have a direct response: they cannot deny their own business model. You can look into their financial documents and see what it is.
So, we look for the bug. We look for the bug in software because that's what we do, right? And we have to find the bug before we can fix it and finding the bug is ninety per cent of the work, isn't it? We look for the bug in the software and we write lines of code. But no matter how many billion lines of code we write, we cannot fix this bug, because it's not in the software, so let's look in the hardware. Is it in the hardware? Can we fix the bug with hardware? No, because that's not where the bug is.
So let's look elsewhere in the stack. Is the bug in the network? What if we just distributed the network, would that work? The bug's not in the network; there are things we can do to make it better, but if we're really going to address the bug, if we are going to address this bug, we need to find it, and we need to look outside the stack, because the bug is outside the stack. The bug starts at the business model. Until we understand this, we're going to be fixing the symptoms, not the bug itself.
The business of corporate surveillance feeds on data. We have to understand how that business model works, so let's take Google, for example. Google started out as a service, right? Just search. They weren't even keeping your data; they weren't keeping tabs on you; they weren't necessarily spying on you. They fell into that business model later, after 9/11. But today, Google is a very different beast. It has a plethora of services. Do you want somewhere to put all your files? Put them on Google Drive. Will Google look through all of your files to get to know you better? Yes. Why? That's how they make money; that's their business model. There's nothing more sinister than that at work here. It is simply a business model, not some conspiracy theory. Do you want to put your pictures somewhere? Put them on Picasa. Will they run facial recognition, try to understand who you are, who your friends are? Why not? Data is what they feed on.
When you use GMail to send emails to your friends, will they read your emails? Yes, of course. That's how they make their money. And you might say that's fine, I'm happy with that. I'm willing to make that exchange of my privacy for this service, but what you have to understand, and this is essential, is that you're not just making a personal decision there. You're also making a decision on behalf of everyone that writes to you, that sends an email to you, because you're saying it's OK for Google to read their emails also, and if you have a custom domain, they may not even know that they're sending that email through Google and that it will be read. It's like second-hand smoke; it also affects other people.
And Google has games that they use to get data, like Recaptcha. Recaptcha's great, isn't it? It teaches them so much about the world in exchange for giving you some bit of functionality. It's these little captchas on forms: they might show you a street sign that they haven't been able to read, and so it protects the form from spam, but at the same time, you're teaching Google Street View how to read street signs.
They have actual games like Ingress, where you walk around town and you play in teams and you're trying to find landmarks and hack into them in the game world. It sounds awesome, and it's free, of course. But what they're really doing is getting very hard-to-come-by data on pedestrian walking patterns, right? So, you are the lab rat. We want to find out how people walk in town and how they get from place to place, so let's incentivise them to do that.
And you might say, OK, well, their services are evil; I'm not going to use their services. And then they've lost, right? So that's why they have devices, beautiful, beautiful devices like this Nexus phone, which is half the price of an iPhone. Have you ever asked yourselves why? Do they have twice the economies of scale of Apple? Tim Cook is a supply chain guy. Is he asleep at the wheel? No. This is a subsidised device. It's a beautiful data entry device saying, buy this and let us get more of your data, because even if you don't use Google's services, they will still get some data, because they've made your sign-in to your device your Google sign-in. Same with their tablets. Same with their Chromebooks, which they're trying to get into schools and which we will talk about today.
OK, let's not use their devices, and Google's lost. They don't want to lose. What's the next step? What's the end game? If your business feeds on data, what's your end game? Your end game is to provide connectivity for people. If you can make my sign-in to the internet my Google username and password, like they're trying to do with Google Fiber, then it doesn't matter what device I use; you will still get valuable data from me. That's the end game, and they're not just trying to connect us; they're trying to connect the next billion people in India and the next five billion people around the world, with projects like Google Loon, using balloons or using drones. Why? Well, in the future, there might be a whole nation whose only notion of the internet is something you sign into with your Google username and password.
Facebook's trying to do the same thing with internet.org, for the same reasons. So that's them getting to know about you, because that's valuable data; that's what they feed on; that's their food.
What about the world? They also need to know as much about the world as they can, and of course all of that knowledge that they gather together is proprietary. So, we have Google Maps, right, which we love; it's great. We've got Google Earth and Google Satellite View. And then we've got Google Street View, and that's just here, right now. And you've all seen the Google Street View Car that gathers data by driving, but there are places they can't go with the Car, so there is the Google Street View Trike. And there are places that that can't go. So there's the Google Street View Snowmobile. Why? Because they need data. They're trying to get the data that's there.
What about indoors? They can't go in there with a Snowmobile. They need a trolley, and they have it. What if you can't go there with the trolley? What if it's rough terrain? Well, you know what? There's a Backpack. The Google Street View Backpack. They will get that data somehow. Why? They feed on it; it's their food. But there are still a few places where, even if Google Street View showed up with their Backpack, you probably wouldn't let them in. One is probably your office; the other is probably your home. And that's why they need you. That's why one of their latest projects, Project Tango, is a phone with a depth sensor and a motion tracker that, as you walk around with it in your own home, maps your home in three dimensions. It can recognise objects in your home and, of course, share that information with Google. But you can also do cool things like play games in your own home, which is really cool.
You're the lab rats. They can't get into your home, but they can sell you a device that gets them into your home. If you want to understand the Internet of Things, that's all you need to know. Why? For a very simple reason. Your data is not the most valuable thing; we keep looking at symptoms. Your data is just raw materials.
But when you put all that data together, all of those individual data points, you start creating a profile; you start creating a digital self that's apart from your corporeal self. You start creating a simulation of the person. And even though I can't take you and lock you away in my lab and study you twenty-four hours a day, because we have laws that protect your corporeal selves, I can actually take your simulation. Imagine it as everything but your body. I can take that and lock it away in a lab, and I can psychoanalyse it and prod it to see how it works, twenty-four hours a day, because your simulation, your digital self, has no laws protecting it. And that's something that we need to change as well. Our corporeal selves have human rights; our digital selves do not.
So, free, and the ecosystem around it, is a lie. But it's more than a lie; it's a con. It is a textbook definition of a confidence trick. Who are the people that get had by confidence tricks? Who gets conned? Dumb fucks, right? That would be us. It's a concealed barter. We are exchanging something of value, but that is not made apparent to us, and we do not understand the extent of what it is we're giving up. And more than that, they will tell you, but you have a choice; don't use Google, it's fine. Use Yahoo! OK, what's their business model?
Oh, it's the same. So this business model has become a monopoly on the internet, and the lack of alternatives is hugely, hugely worrying. This business model is fuelled by a cycle of venture capital and exits. When you take venture capital, when you take equity investment, you have to have an exit; that's how the people who give you the money make their money back ten-fold or more. You either exit to the public or you exit to a larger company.
So, how many of you have seen this? Spritz. It's a new way of reading. Anyone seen this? Who thinks this is cool? I'll put my hand up, because I think it's cool. When I saw it, I was like…wow! It's awesome: you look at the screen and read one word at a time, positioned so that you can read really, really quickly. The CEO of the company was at a conference I was speaking at recently, and in his talk he said (they don't have any scientific evidence to back this up), "We discovered that people with dyslexia find this really easy to read, because there are no distractions and they can just focus." And then, at the dinner, I asked him what his business model was, and he was like, well, we can see what you're reading, right? And this is an SDK; they want people to put this into their own applications. And he said during his talk, "We'd really like to see some email apps using this." I wonder why that was?
But what's worse is the choice you're being given here, especially if what they say is true and you have dyslexia. What's the choice you're being given? Either read better, and let us spy on you, or read worse. I believe that humanity deserves better than this myopic, myopic business model and its ramifications. This is an unacceptable, unacceptable decision to be forced to make.
So, corporate surveillance is the business model of spying on you. It's the business model of studying you, understanding you, to predict your behaviour and to manipulate you. So let's at least stop getting surprised every time they act in line with their business model. It's not helping. In this business model, you are the quarry being mined. You're the livestock being farmed. You are the lab rat, and so am I. You are, in the eternal words of Mark Zuckerberg, the dumb fuck.
Well, I'm here to say today with you: fuck that. We're not going to take that any more. And if we're going to change things, we need to change the way that we do things and the way we think. To paraphrase Einstein, past thinking and methods did not prevent corporate surveillance; future thinking and methods must. We must change our thinking and our methods.
And although we might be a tiny group, I believe we have our hearts in the right place and I believe that we have the right intent, and that is usually the only thing that matters. And we have you, and we have your support. And I believe that will be the difference that we need to make. There is always hope, and we need to embrace the hope that we can have a different future. But we can do better than hope; we can start to influence that future. The time for inaction is over; it is time for us to start designing hope, and that's what we're going to be doing with Indie.
Today, we are launching Indie: a core around which we will define a spot on the map. This is what Indie is, and that's what the manifesto is for. Agree with us; come and join us. Disagree with us; that's fine. We know where we are, and we will create a spot in the debate. That spot needs to exist. And to support those of you who want to build Indie Tech, we are looking into setting up a Foundation. This is where we need to go. This is more than any one product. We need to educate; we need to get involved in the conversations; and we need to support those of you who want to make a difference in this world. A viable, true, sustainable difference. One that breaks from the old world.
And of course, we need to actually make something to show people that this is what we mean, that we're not just talking about it, and that's the phone; I'll have a few more words to say about that at the end of today. But first, the Indie Tech manifesto: let me quickly go through it.
Indie Tech is independent. We reject venture capital. We reject equity investment. We reject being sponsored by corporate surveillers. That is our spot on the map. We are independent and we value it. That doesn't mean that we reject all investment. There are other ways which don't involve exits. But we want them to be sustainable, and that's really, really important.
My criterion for success is whether I'm still working on this in twenty years' time. Whether my children inherit what I do is my criterion for success. And in order to create viable alternatives, we need to create organisations that can meet that demand, and the demand is a consumer one. We need design-led organisations in order to create products that can actually compete. And because this is a societal problem, and society is diverse, if our organisations are not diverse, we cannot tackle this problem. Young white guys alone are not going to fix this issue. Those are the kind of organisations we need to build.
What about the products? Well, the products have to be beautiful; and by beautiful, I don't mean aesthetics, I mean design. Holistically beautiful. We need to start thinking about design holistically in what we do, because we compete on experience in the consumer space, and if our products don't, we will not be able to compete. They have to be free, as in liberty, so that we protect the freedom of our work and, even more importantly, the freedom of the individuals who use it. We need to wean people off of the current networks, the current social networks that they're part of. We can't just cut them off; we need to be social, and we need to be accessible. We need to reach a mainstream audience, and these all go together.
We of course need to be secure in the products that we make; security must be a core design tenet of what we do. And yes, part of the bug is in the network, and the solutions that we build to address it have to be distributed. So those are the products. And that is Indie. We launch it here today, and I hope you will join us to help build it, because I believe in a very different future: a future where we own and control our tools and our own data, and, through this, a future where we enjoy fundamental freedoms and democracy. I hope you'll join us on that journey.