Jaron Lanier knows the terrain of modern digital technology about as well as anybody. He is considered a pioneer in the field of virtual reality – a term he coined – having, among other things, co-founded VPL Research, Inc., the first company to sell VR goggles and gloves. He has worked for Microsoft since 2006 and has authored several books, including Ten Arguments for Deleting Your Social Media Accounts Right Now, an international bestseller.
In 2018, he sat for an extended interview with Krishnan Guru-Murthy on Channel 4 News (in Britain), which I recently came across. Much of what he says is not going to be “new” per se to many readers, but I found it compelling enough that I wanted to transcribe and share some highlights.
I know that social media has gained a new importance in our lives since the pandemic began. It’s obviously hard to have a social life when you have to socially distance, and many tech platforms have helped provide a little bit of socializing to make up for the painful absence of the usual in-person sort. But Lanier does a really good job of explaining how social media actually manipulate us – like a kind of giant behaviorist experiment. And he covers a lot of other interesting ground, such as why social media is bad for politics, why Trump is something of a “victim,” and how we might “fix” social media. Here are the highlights, with questions included from Guru-Murthy (KGM):
KGM: Is there a principal reason why I should delete my social media and if so what is it?
There are two. One of them is for your own good and the other is for society’s good. For your own good it’s because you’re being subtly manipulated by algorithms that are watching everything you do constantly and then sending you changes in your media feed, in your diet, that are calculated to adjust you slightly to the liking of some unseen advertiser. So if you get off that you can have a chance to experience a clearer view of yourself and your life.
But then the reason for society might be even more important. Society has been gradually darkened by this scheme in which everyone is under surveillance all the time and everyone is under this mild version of behavior modification all the time. It’s made people jittery and cranky. It’s made teens especially depressed… But it’s made our politics kind of unreal and strange where we’re not sure if elections are real anymore. We’re not sure how much the Russians [are interfering]…
KGM: You say it’s bad for me as an individual. Is it bad for me because I’m addicted? Have I become chemically hooked?
You have. The founders of the great Silicon Valley spying empires like Facebook have publicly declared that they intentionally included addictive schemes in their designs. Now, we have to say this is what I would call almost a stealthy addiction. It’s a statistical addiction. What it says is: we will get the broad population to use the services a lot, we’ll get them hooked through a scheme of rewards and punishment – and the rewards are when you’re retweeted, the punishment is when you’re treated badly by others online – and then within that we will very gradually start to leverage that to change them. So it’s this very kind of stealthy manipulation of the population. It’s not as dramatic as a heroin addict or a gambling addict but it is the same principle.
KGM: But who’s doing the manipulating…?
Well, this is the peculiarity of the situation. The people who run the tech companies like Google and Facebook are not doing the manipulating, they’re doing the addicting. But the manipulating which rides on the back of the addicting is the paying customer of such a company, and many of those customers are not at all bad influences – they might simply be trying to promote their cars or their perfumes or whatever…
KGM: How is it different to just television advertising or billboard advertising or anything else?
The difference is the constant feedback loop. So when you watch the television the television isn’t watching you. When you see the billboard the billboard isn’t seeing you. And vast numbers of people see the same thing on television and see the same billboard. When you use these new designs – social media, search, YouTube – when you see these things, you’re being observed constantly and algorithms are taking that information and changing what you see next… They’re searching and searching and searching – and they’re just blind robots, there’s no evil genius here – until they find those patterns, those little tricks that get you and make you change your behavior.
KGM: In terms of society… [you mentioned] it’s making people depressed, but is there any actual evidence for that?
Yeah, unfortunately there’s a vast amount of evidence. There have been dozens of studies at this point including studies released by Facebook scientists – so this is something we can call a consensus – and when Facebook releases such things they say “oh, but we do all these good things too that balance it.” But there’s a general acknowledgement that depression correlates. The scariest example is a correlation between rises in teen suicide and the rise in use of social media. And so, yes, unfortunately this is real.
KGM: Are you sure you can blame it on social media? Is it not just that those two things may have happened at the same time for other reasons?
Well, here’s a distinction we have to make. It’s very similar to the problem of global climate change. We can say statistically – over the whole population – yes, the correlation is real. [For] any particular person, of course, we can’t. Just as we can’t blame any particular storm on global warming.
KGM: Why have you, sort of, turned on your own kind?
I love Silicon Valley and I do not at all feel that I’ve turned on my own kind. Just to be clear, I’m very much a part of this: I’ve sold a company to Google. I’m not in any sense an outsider. I believe that what we’re doing is not in our own self-interest. Business interests are a part of society; if they destroy society, they destroy themselves. I believe it’s very clear that we could offer all of the good things – and there are many, many good things in these services, and social media in particular – I’m convinced we can offer them without this manipulation engine in the background. There’s a world of other business plans and I think they’d be better for us. I don’t think we’re being evil so much as we’re being stupid.
KGM: When it comes to Facebook, has Facebook made itself safe yet in terms of data harvesting and scraping and all of that?
Well, Facebook’s fundamental design is one in which the business model is to addict you, and then offer third parties a channel to you so they can take advantage of that to change you in some way without you realizing it’s happening. I mean, that’s what it does, so I don’t think any amount of tweaking can fully heal it. I think it needs a different business plan…
KGM: So when Mark Zuckerberg says he’s taking action and… he regrets what’s happened and all the rest of it, you’re saying he can’t make his own product a safe or desirable product?
I believe that as long as his business incentives are contrary to the interests of the people who use it – who are different from the customers – then no matter how sincere he is, and I believe he’s sincere, and no matter how clever he is, he can’t undo that problem. He has to go back to the basics and change the nature of the business model.
Bad actors are able to use Facebook in ways that Facebook can’t understand, because the way the service is designed is fundamentally to be manipulative. So I think the data protection idea is a sincere and good idea but it’s certainly not adequate. It doesn’t address the core problem, which is the manipulation engine…
KGM: Do you think they’re all as bad as each other? I mean… why is something like YouTube, which is basically just a way of watching video, bad for you?
YouTube is not “necessarily” bad for you. Remember, this is a statistical distribution, so for some percentage of people it’ll have an effect of making them crankier around election time and feeling needier… and so forth. The way it works is that all the data Google can get on you – much of which comes from just your email or whatever else it might be – is fed into an engine that compares you with other people who share some similar traits. YouTube’s ordering of videos that are presented to you is designed to, on the one hand, maximize your engagement so you won’t stop watching – but that’s achieved not just by observing you, but a multitude of people who are similar to you – and then [on the other hand] when you do get an ad, it’s contextualized in a way that has been shown to be effective not only for you but for this whole population.
So… it’s bad for you because it leeches your free will. It makes you cranky. It makes the world a little darker because you’re not perceiving reality clearly anymore. It’s being manipulated… And the people who are paying – or maybe not paying, just using the system in a clever way – to get at you, are not necessarily pleasant people. They’re sort of the worst actors…
KGM: But don’t some users think, “Look, I can handle advertising… I know what I’m doing here, I’m getting a free service… They think they’re manipulating me, but I know what I’m doing”?
The problem is that behaviorist techniques are often invisible to the person who’s being manipulated. This has a long history – this has been done for a long time. It used to be that the only way to be subjected to continuous observation and modification was to either be in an experiment – you could be in the basement of a psychology building and have students tweaking you for their projects – or you could join a cult or… be in an abusive relationship. I mean, this has been done before, and often the people who are in these situations do not realize what’s happening to them. In fact, the whole point is that it’s sneaky, it’s a mechanical approach to manipulating people. And because it’s so algorithmic, it doesn’t involve direct communication and people don’t get the cues to understand what’s happening with them.
KGM: Why do you think social media has had the effect on politics that it has? You know, is it because of the way people respond to things on social media?
Well, I’d like to give you a slightly detailed answer as quickly as I can… In traditional behaviorism you would give an animal or a person a little treat – like candy – or maybe an electric shock, and you’d go back and forth between positive and negative feedback. And when researchers try to determine whether positivity or negativity is more powerful, they’re roughly at parity. They’re both important.
But the difference with social media is that the algorithms that are following you respond very quickly. They’re looking for the quick responses. And the negative responses like getting startled or scared or irritated or angry tend to rise faster than the positive responses like building trust or feeling good – those things rise more slowly. So the algorithms naturally catch the negativity and amplify it and introduce negative people to each other… And so what this does is it means that the algorithms discovered there’s more engagement possible, say, by promoting Isis than promoting the Arab Spring, and so Isis gets more mileage. Or promoting the Ku Klux Klan than Black Lives Matter.
Now, in the big picture, it’s not true that negativity is more powerful. But if you’re doing this very rapid measurement of human impulses instead of accumulated human behavior, then it’s the negativity that gets amplified. So you tend to have elections that are more driven by rancor and abuse…
If you’re an algorithm that’s just looking at instant responses… you’ll find that engagement more often by irritating people than by educating them.
KGM: And so, is that how you create Trump or Duterte or, you know, any of the other populist leaders who are doing very well at the moment partly from the internet?
I have never known Trump well, but I have met him a few times over a fairly long period – over thirty years actually, through different circumstances – and I will say that, while I never would have voted for him as president…, he never lost himself and became so strangely insecure and so weirdly irritable until he had his own addiction, in this case to Twitter. It’s really damaged him. I mean, I view Trump in a way as a victim… His character has been really damaged by his Twitter addiction.
KGM: Because of the reaction he gets from each tweet?
Yeah, so, you know what happens in addiction is that the addict becomes hooked not just on the good part of the addiction experience but on the whole cycle. So a gambler is not just addicted to winning but to this whole process where they mostly lose, and in the same way, the Twitter addict or the social media addict becomes addicted to this engagement which is often unpleasant where they’re engaged in these, you know, really abusive exchanges with other human beings and only once in a while… – you can watch Trump – like every once in a while there will be this tweet where somebody likes him and that’s when… he gets his little dopamine hit, and then he dives in for more negativity and… then he gets it again, and you can see the addiction playing out.
KGM: Do you think it’s possible to create a do-gooding social network?
Yes, I’m absolutely positive, and the way to do it is to have a different business model… So, right now, we’ve created this bizarre society… where if any two people wish to communicate over the Internet the only way that can happen – the only way it’s financed – is through a third party who believes that those two can be manipulated in a sneaky way. It’s an insane way to structure civilization.
So, [I believe] we can keep all the good stuff – and there is good stuff on social media, of course – we can keep all that and just throw away the manipulation business model and substitute in a different business model. There are many alternatives that would be better, they just have to be honest. It could be a paid service like a Netflix where you’re paying for it – you’re the genuine customer – it has to keep your interest. It could be like a public library… a public thing that isn’t commercial at all, that’s an option.
But what we did in Silicon Valley is that we wanted it both ways. We wanted everything open and free but we wanted hero entrepreneurs and hackers, and so the only way to get that was this advertising thing that gradually turned into the manipulation engine… and this weird business plan. Once you can see that there are alternatives, you realize how strange it is and how unsustainable it is. This is the thing we must get rid of. We don’t have to get rid of the smartphone. We don’t have to get rid of the idea of social media. We just have to get rid of the manipulation machine that’s in the background.
KGM: Your advice tonight to everyone watching this is: delete all your accounts?
I would like to make two very quick pitches on that account. One, if you’re a young person and you’ve only lived with social media, your first duty is to yourself. You have to know yourself. You should experience travel, you should experience challenge… You need to know yourself. And you can’t know yourself without perspective. So at least give it six months without social media, and really quit them. Don’t, like, quit Facebook but keep… WhatsApp, because then it’ll still be spying and manipulating [you]. Get rid of the whole thing for six months and know yourself and then you can decide. I can’t tell you what’s right. You have to decide, but you can’t until you know yourself.
Then, for the rest of society, I’d say as long as we can have some small percentage of people who are off it, then the society can have voices to give perspective. If everybody’s universally part of this thing we cannot have perspective. We cannot have a real conversation. And it’s too lonely right now. You know, we need more people who are just outside of that loop, who are thinking without the manipulation, and I think we’ll find it extraordinarily valuable to have them.
Lots of good stuff there. Makes me want to delete my social media accounts again. If you want to watch the whole video, you can check it out below.