For the last two decades, I’ve watched the tech industry win the battle for our consciousness. The emergence of artificial intelligence and XR technology could signal the endgame. But there’s still hope…
The concept of attention economics was first articulated in 1971 by psychologist and economist Herbert A. Simon, who wrote:
In an information-rich world, the wealth of information means a dearth of something else…information consumes the attention of its recipients. Hence, a wealth of information creates a poverty of attention.
There’s an underlying reality behind this statement: we have a fixed amount of attention to give. Over the last 50 years, the tech industry has developed tools that force-feed us mass quantities of information, whether we need it or not.
You could say this began with the invention of the Gutenberg press in the mid-15th century. The ability to reproduce books inexpensively led to an explosion of knowledge throughout Europe and, eventually, the world. Newspapers followed in the 17th century, and radio and television further increased access in the 20th. But it’s the invention of the internet (thanks to Al Gore) and the World Wide Web that set us on our current crash course with disaster.
After the mass adoption of the web in the mid-1990s, the concept of attention as a currency became a topic among academics. Maybe the clue was there all along – who among us wasn’t admonished to “pay attention” in school or at home? Psychologists, cognitive scientists, neuroscientists, and anyone who practices meditation will tell you that our ability to focus on more than one thing at a time is limited. Our attention constantly shifts from one thing to another, over and over, throughout our days. Most of this happens without us noticing.
Meditation, or mindfulness as it’s now called, is the practice of observing this phenomenon. Sitting still for 10 minutes and just watching your thoughts is direct experiential evidence of our inability to focus on more than one thing at a time. Beginning meditators are taught to focus on their breath. It typically takes only a few seconds before a thought steals their attention. Then the meditation guide reminds them of what they’re doing, redirecting them back to the breath. This process repeats endlessly.
I’ve been meditating (pretty much) daily for over a decade. In a half-hour session, if I go more than 30 seconds without a thought popping up, it’s unusual. I’ve become much better at noticing distracting thoughts and gently returning to watching. That experience has taught me that I am not in control of my thoughts. They come without being invited. They stick around for a while, and then they go. This might be the most important thing I have ever learned.
Alan Watts described awareness as a spotlight. We shine our attention on an object, and everything outside of that beam of awareness fades into the background. Advanced meditators learn to widen their awareness, like a “floodlight,” so they can take in more sensory information at once. Instead of focusing solely on the breath, they take in sounds, sensations, feelings, and even sights. This tends to quiet the mind, calm the nervous system, and shift consciousness into a state of bliss or flow, at least for a few moments.
If you’re reading this, I have captured your attention. Once you’re done reading (hopefully until the end), your attention will be captured by something else until that’s interrupted by the next thing. Perhaps it will be a phone call, an urge to use the bathroom, or a craving for a cup of coffee. Maybe it will be your task list, or a cat video on TikTok. But it will be something.
Before we allowed the internet into our pockets, it was easier to control when and how our attention was interrupted. I remember having a “Do Not Disturb” button on my landline office phone. I had an executive assistant who ran interference at my office door. My parents used to take the phone off the hook so they wouldn’t be disturbed by telemarketers while they watched television (it’s hard to imagine a time when we couldn’t pause TV).
Back in the early 2000s, I was dead set against getting a Blackberry, the first device that put email into your pocket. It was like a pager on steroids. There’s a great spoof commercial from the Mercer Report that foretells the risks of putting the internet in a mobile device.
I saw how my coworkers had become nearly Pavlovian in their response to email notifications. I had an early smartphone – the Samsung Blackjack II, which offered an almost painful internet experience. It was so hard to use that I didn’t use it, which was perfect for me. I could look modern and tech-forward without sacrificing what little agency I had over my attention. But the iPhone changed everything. Smartphones went from productivity devices to lifestyle tools. It was irresistible to me and hundreds of millions of others.
There’s “nothing that I can be more preoccupied or bothered by” than the potentially adverse effects smartphones have on their users. – Jony Ive, lead designer of the iPhone
When the iPhone introduced the first 500 mobile applications via the App Store in 2008, Facebook was among them. But it wasn’t until Mark Zuckerberg prioritized mobile development in 2012 that its popularity exploded. By the end of that year, Facebook had over 600 million monthly active users. He also acquired Instagram, and soon after, WhatsApp, cementing Facebook as a mobile powerhouse.
By 2018, the majority of Facebook’s revenue came from mobile advertising, which required the attention of its audience. The more views, the more money from advertisers, and the more profit for Facebook and its investors. Today, the three apps owned by Meta (formerly Facebook) account for 3.4 billion daily active users. That’s roughly 40% of the Earth’s population using one of its apps every single day.
Many people point to social media as the root cause of many of society’s current ills. But if you go back a couple of decades to the early days of the web, you’ll find the problem predates smartphones and social media. The problem is the economic underpinnings of the web itself and people’s unwillingness to pay for content.
It became evident in the early days of the web that people expected free access to content. Sites that tried to paywall content struggled as there were too many free options, so technology companies invested heavily in the systems required to extract money from clicks and views.
While early systems were simple, over time the ability to programmatically serve ads based on someone’s interests emerged. These algorithms soon evolved from simply ensuring the ad someone viewed was relevant to powering the entire content delivery system of social media, keeping those 3.4 billion people glued to their screens every day.
So with algorithms pulling our attention to an endless stream of information, what impact is that having on our society? I’ll delve further into that in the next Dropping In…