Be Aware, Be Very Aware

“Let’s reclaim our minds from being hijacked by technology” —Time Well Spent

Sam Harris hosts the podcast Waking Up. He holds degrees in philosophy and neuroscience and has written on many topics, including neuroscience, moral philosophy, politics, religion, rationality, free will, and ethics. In general he explores “how a growing understanding of ourselves and the world is changing our sense of how we should live.” In the episode “What is Technology Doing to Us?” he talks with Tristan Harris, who describes himself as “an expert on how technology hijacks our psychological vulnerabilities.” Tristan Harris is a former “Design Ethicist” at Google and the founder of Time Well Spent, a “non-profit movement to reclaim our minds from the race for attention.”

It’s a wide-ranging, provocative and eye-opening conversation about “the arms race for human attention, the ethics of persuasion, the consequences of having an ad-based economy, the dynamics of regret, and other topics,” and it has been bouncing around in my own mind since I happened upon the episode several weeks ago.

At just over an hour and forty minutes, the episode represents a fair investment of time. But it’s an investment that could favorably influence how you engage with your technology and improve how you manage your most valuable and non-renewable resource: time.

The underlying premise of the conversation is that most of us feel technology is neutral and that how we use it is up to us. But during their discussion they raise questions like: Is this behaviour always a conscious choice? Do we know why we are clicking on “that”? We often think we do, but we are also often misinformed. For example, we probably don’t realize that “there are thousands of people on the other side of the screen whose job it was to get you to click on that.”

We humans are an easily gameable lot. The “choice architectures” that social media presents us with in what Tristan Harris calls the “attention economy” are designed to exploit the “backdoors into people’s minds.” In other words, we tend not to be aware of how calculated these “intrusions in our lives” actually are.

And we should acknowledge that the psychology of our minds works in specific, predictable and persuadable ways, with “big holes waiting for things to pop in.” This presents a kind of existential problem, because all of us are trapped inside the same psychological architecture and vulnerable to the techniques of persuasion.

“… persuasion is kind of like that. There is something that can subvert my architecture. I can’t close the holes that are in my brain, they are just there for someone to exploit. The best I can do is to become aware of some of them, but then I don’t want to walk around the world being just vigilant all the time of all the ways my buttons are being pressed.”

Tristan Harris says there’s a whole industry dedicated to this “dark art form” that people are not aware of. Consider, for example, that many people, when asked about the rise of big data, are not really all that alarmed that their personal data is out there. The familiar response is: “I’ve got nothing to hide.” But if they realized that this data is used to feed the attention economy and the underlying methods of persuasion that come with it they might be more concerned. What if this dedicated group of engineers develops a type of artificial intelligence that literally knows how to persuade you to do anything?

Tristan Harris elaborates on the potential of persuasion:

“When you really take that all the way, the idea that persuasion can undermine your beliefs, your attitudes, your behaviour, even some sense of identity … if you can shift all that? Then, where we put authority after the Enlightenment, which is to put authority in the individual feelings, and thoughts and beliefs of people, for elections, for markets. Both of these things are now subject to question. Because if you can persuade someone to have different beliefs about a product, or how much it will benefit them, or how much it will make them happy, and it doesn’t do any of those things; or persuade someone of a candidate with all these [persuasive] techniques and they don’t do any of those things … this is really, really dangerous. And it undermines where we put authority in our current age. And, of course, this opens up some very big questions about, ‘What is a kind of persuasive world that we want to live in?’ Because [these persuasive techniques are] only going to get better.”

And it’s very difficult to guard ourselves from these influences, to find somewhere we are not exposed to these persuasive elements.

They suggest we start to question the status quo of these social media platforms and begin envisioning the kinds of changes that would open up positive intellectual and social spaces instead of “maximizing clicks and attention until it turns the world into outrage.” Outrage? Nobody thinks that is what we want, but we certainly seem to act like it is.

At one point in the conversation they describe the current social media environment as a kind of “city of attention,” a city that was “[basically built] by three or four companies.” It’s a nice analogy but, as Tristan Harris points out,

“… we don’t have public representation in that city. You know, when we see a pothole or we see this sort of an intersection where people are getting into a car crash all the time, it’s like … the equivalent of that on Facebook [is] people are getting into weaponized arguments all the time. … [but] it’s not acknowledged to be a car crash, ‘We’re a neutral platform and people will do what they do.’ No that’s not true: the size of the textbox matters, whether the three actions being ‘like,’ ‘comment’ or ‘share’ versus something else, versus organize a dinner conversation: that one click button to get together.”

It’s fascinating to me that self-worth is now measured by how many “likes” or “followers” you have, which is to say it is measured against the design of a social media site. And that sense of self-worth is something that can be easily exploited. Consider this comment:

“People will post a photo on Instagram and if it doesn’t get enough likes they’ll actually take it down because they feel their self-worth has been threatened. That’s how seriously these things have mediated people’s values. … Their representation system of the ‘number of likes that I got’ became the ‘way that I value myself. I’ve been infected, I am a robot. They’ve just drilled a hole in the back and they have got me, I now value myself based on that.’ And this is why people … need to see first that this is in deliberate service of maximizing attention and engagement and it’s not all built to help us.”

But we could look at things differently. What if there were a way to recognize conversations that are “open to truth” on social media? Or an option to indicate that your mind has been changed about a particular issue? There could be a set of possible positive outcomes, like the “organizing a dinner” example touched on above: an unresolved discussion started online becomes a social gathering where the conversation can continue in person. What kinds of “choice architectures” would leave you with a sense that your time has been well spent, or that your technology is contributing to a “life worth living”?

What a fascinating conversation. If you’re interested in this topic but not ready to commit to listening to the entire episode, I’d also recommend Tristan Harris’ shorter fifteen-minute talk at the Personal Democracy Forum, where he describes “the game that we live in” and characterizes his phone as a “slot machine.”

Or consider his short article from a year or so ago called “How Technology Hijacks Your Mind.” In it he describes persuasion as operating very much like sleight-of-hand magic, which, he says, is “exactly what product designers do to your mind. They play your psychological vulnerabilities (consciously and unconsciously) against you in the race to grab your attention.” The article outlines ten persuasion hijacks we should all be aware of, including these three:

  • If You Control the Menu, You Control the Choices
  • Fear of Missing Something Important, and
  • Bottomless Bowls, Infinite Feeds, and Autoplay.

All of which reminds me of that particularly disturbing episode of Black Mirror called “Nosedive,” and the protagonist’s quest for self-worth and the social media jackpot.
