It’s not surprising to most that technology can be addictive. Not so long ago, conversations on the subject circled around TV or the telephone, then electronic games, and more recently social networks. But there are now many other categories: mobile apps, chat rooms, even electronic gambling. Some of these are technological advances useful in their own right that happen to be attractive, or even addictive, to some people almost unintentionally. Other products and services, however, are intended and designed to hook people’s attention. The New School’s Heather Chaplin organized an excellent panel in February 2018 in New York that discussed how and why companies use persuasive design elements in many of the products that litter our field of attention today, and the price society and individuals pay for some of their more unfortunate product “features”.
The success of products is often measured by their popularity, as seen in indicators like number of downloads, items sold, reviews, or revenue. Many electronic games even announce their value through the number of hours a player can expect to spend on them; Steam (currently the top digital distributor of computer games) prominently displays a chart of simultaneously active players as one way to entice gamers to make a purchase.
The pressure and competition are obvious, but when innovation and useful design ideas are scarce, many product designers are willing to do almost anything to grab and hold more users and extract from them as much usage time as possible.
These psychological hacks, as Nir Eyal explained, are design elements born of years of research, intended to get people to devote attention to products and technologies. They go beyond making electronic games more entertaining (some modern games feel a lot like work) or apps more useful. Persuasion turns into coercion when people get trapped and start wasting their energy on activities that are not in their own best interest: they are enticed to do things they do not really want to do, or would not do otherwise.
While procrastinating between paragraphs of this story, I discovered one small form of coercion on YouTube: by default, it auto-starts the next video when the one you’re watching ends. There is a way to turn this off, but I did not find it intuitive, and I assume that is by design. (No, this is not the same thing as auto-play, which is another annoying “feature”.) To rid my web experience of it, I first had to click four times and log in, only to discover that it’s not under settings. After a short while I gave up trying to figure it out myself and, as with many other intentionally counter-intuitive apps and websites out there, had to actually Google the darn thing. By this time I was already on edge, and when the page I clicked with a potential solution started blasting audio from one of four auto-playing videos, I found myself scrambling to figure out which one it was before I could stop it. At least it did explain how to stop auto-play-next on YouTube. Talk about getting stuck in unwanted territory! Ultimately the setting was on the main page, right there in front of my eyes, just lost among the many other elements of a busy web page. Anyway, I hope the setting sticks, and I won’t have to reset it every time!
The above is an extremely annoying but mostly harmless example of the Machiavellian design philosophies that plague many of the products we depend on for our daily electronic entertainment. But others have much darker intents and consequences.
When the panel discussed the petty, low-level goals common to many social networks and other popular products today, I agreed completely. Many companies’ products seek to entangle people for the sake of popularity, to sell more ads or microtransactions, literally exploiting people’s attention through reward pathways and peer recognition. One consequence is that this robs people of real-world experiences, replacing the enjoyment of moments and places, and of interactions with family or friends, with the useless joy of the next “share” or selfie.
Roger McNamee later explained in the panel that the profit model of these companies mandates these dark features. Services are under incredible pressure to monetize their user base. Facebook, for example, faces the additional challenge of countering the dwindling popularity of its service in some demographics, such as younger people. The intention is to keep users reading, viewing, or browsing by playing on their interests and emotions. As we have seen recently, this has sometimes been exploited by third parties for their own dark purposes. It often succeeds because of the disparity in understanding between the designers and the users of an application’s model and algorithms: providers are in a position of power, and they actively use it to their own advantage.
Tristan Harris put it brilliantly when he explained how Google, Twitter, Facebook, and most (if not all) other social networks engineer your “updates” scroll with the kind of closed-loop reinforcement that tries to keep people hooked ad infinitum. This “adversarial exponential tech”, or “game of chess against your own brain”, is the most destructive of all, I think. As he explained, it creates an imbalance of negative over neutral feelings in many users, which is the perfect recipe for things like addiction and fake news. Tristan went on to describe Snapchat’s Streak as another of those darker features, aligned only with keeping young people hooked to the application, since it provides no higher-level benefit.
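The “closed-loop reinforcement” Tristan describes is, at its core, the variable-ratio reward schedule from behavioral psychology: payoffs arrive unpredictably, which is known to be more habit-forming than a predictable schedule. Here is a toy sketch of that idea only; it is not any real network’s algorithm, and the function name and reward probability are illustrative assumptions:

```python
import random

def variable_ratio_feed(items, reward_prob=0.3, seed=42):
    """Toy illustration of variable-ratio reinforcement in a feed:
    some posts are "engaging" (the reward), most are "filler", and
    the user cannot predict which is coming next."""
    rng = random.Random(seed)  # fixed seed keeps the sketch reproducible
    feed = []
    for item in items:
        rewarding = rng.random() < reward_prob  # unpredictable payoff
        feed.append((item, "engaging" if rewarding else "filler"))
    return feed

# Each "pull" of the scroll may or may not pay off; the uncertainty
# itself is what keeps people pulling.
for item, kind in variable_ratio_feed(range(10)):
    print(item, kind)
```

The point of the sketch is that the schedule, not the content, does the hooking: the same items interleaved on a fixed, predictable pattern would be far easier to put down.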
One very interesting question was brought up by Alexis Lloyd: what exactly is the problem these products and networks are trying to solve? Are these companies inserting these features just because they’re exploring the limits of technology (not a particularly high goal)? Or because it brings more clicks? People need to identify these petty-goal apps by asking themselves whether there is any real benefit in them.
Nir then stepped back a bit and reminded us that we are talking about behaviors embedded in human nature, that individuals should hold themselves accountable for their own well-being, and that we should see ourselves as always able to break the entrapment of social media. Addictive technologies, and the allegedly evil corporations behind them, are most of the time “hijacking our brains” only because we allow them to.
On this same subject of being careful not to demonize the time people invest in gaming, social networks, and the like, Alexis underlined the reality that each generation’s values differ, and each will defend its own. I almost jumped when she quoted Douglas Adams: “Anything that is in the world when you’re born is normal and ordinary.” We cannot forget that each generation’s contribution to the world and society is built on top of the previous one; it remains to be seen what the future outcome of these technologies will be.
Now, trying to avoid doomsday conclusions: there will be no solution without a change in the profit model of the social networks, and current tendencies will continue unless there is also a change in the way products benefit from sheer popularity. Alexis chipped in on this too, and again I agree fully, when she underlined that “the pressure is not only because it’s ad supported, but that the expectation to maximize at a scale that is really unfeasible”.
For the moment I will remain content with the fact that the relentless advance of addictive technologies is counter-pressured by talks like this one, where awareness is the most important outcome.