Speaking at an event in 2017, Sean Parker warned about the unknown societal consequences of widespread social media use, saying he had become “something of a conscientious objector” to social platforms.
“The thought process that went into building these applications, Facebook being the first of them, … was all about: ‘How do we consume as much of your time and conscious attention as possible?’”
“And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content, and that’s going to get you … more likes and comments.”
“It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology. The inventors, creators — it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people — understood this consciously. And we did it anyway.”
The proliferation of ‘retention’, ‘engagement’ and ‘time spent in product’ as valid business metrics is not in any way helping us to provide better experiences. Tying ourselves to these Pavlovian metrics is sailing digital products in the wrong direction, and it’s obvious that what is best at capturing our attention isn’t best for our well-being. ‘Happiness’ and ‘task success’ may be harder to measure, but they are infinitely more powerful metrics for building successful brands than engagement.
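To make the distinction concrete, here is a minimal, hypothetical sketch contrasting the two kinds of metric. The session records and field names are invented for illustration; the point is only that optimising for time in product rewards the long, aimless session, while optimising for task success rewards the quick, effective ones.

```python
# Hypothetical session records: how long a user stayed, and whether
# they actually accomplished what they came to do.
sessions = [
    {"minutes": 42, "task_completed": False},  # long, aimless scrolling
    {"minutes": 3,  "task_completed": True},   # quick, successful visit
    {"minutes": 5,  "task_completed": True},
]

# An engagement metric: average time spent in the product.
avg_time = sum(s["minutes"] for s in sessions) / len(sessions)

# A task-success metric: share of sessions where the user's goal was met.
task_success_rate = sum(s["task_completed"] for s in sessions) / len(sessions)

# The 42-minute scroll session inflates the first metric and
# drags down the second.
print(f"avg time in product: {avg_time:.1f} min")   # 16.7 min
print(f"task success rate: {task_success_rate:.0%}")  # 67%
```

Any real analytics pipeline would be far richer than this, but the tension between the two numbers is the same: a product can maximise one while actively harming the other.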
The race to keep us on screen 24/7 makes it harder to disconnect, increasing stress and anxiety and reducing sleep. Our children are caught in an endless cycle of checking their phones in case they miss out. We are replacing relationships with ‘like streaks’. YouTube and Netflix autoplay the next algorithmically crafted video, even if it eats into our sleep. And that’s before we even go down the rabbit hole of algorithmic content creation, explored in James Bridle’s article earlier this month, which caught significant attention.
In fact, these sorts of engagement metrics are downright dangerous when abused, and the Silicon Valley giants have no interest in changing their approach. Algorithmic timelines designed to keep us glued to our devices have us smacked to the eyeballs on content consumption in a zero-sum game for attention, whether it is healthy or not. Not only that, but these algorithms are also significantly impacting society.
How many of you have turned off notifications on the various social products that you use in order to sleep better, to concentrate, or to stop yourself from picking up your phone whilst driving? Twitter’s notification timeline is a great example of how desperate these apps are for your ‘engagement’. The following is a quick list of the scenarios in which Twitter sends a notification:
- When someone you follow tweets after a while
- When someone you have engaged with frequently tweets something
- When someone you follow retweets someone else
- When someone likes a tweet you were mentioned in
- When two or more of your followers follow someone else
- When someone retweets you
- When someone likes something you tweeted
- When someone @replies you
- When someone follows you
- When there are new ‘Highlight tweets’ from people you follow
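A list like the one above is, in effect, a rule engine. The toy sketch below is not Twitter’s actual logic — the event types, fields, and rules are invented for illustration — but it shows how few of these triggers require any direct involvement from the recipient at all.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str                # e.g. "like", "retweet", "mention", "tweet"
    actor_followed: bool     # does the recipient follow the person acting?
    targets_recipient: bool  # is the recipient's own content involved?

def should_notify(event: Event) -> bool:
    """Return True if this (hypothetical) event triggers a notification."""
    direct = {"like", "retweet", "mention", "reply", "follow"}
    if event.targets_recipient and event.kind in direct:
        return True   # someone acted on *your* content
    if event.actor_followed and event.kind in {"tweet", "retweet"}:
        return True   # mere activity by accounts you follow
    return False

# Even this toy model fires for events the user never initiated:
print(should_notify(Event("retweet", actor_followed=True,
                          targets_recipient=False)))  # True
```

The second rule is the telling one: it notifies the user about other people’s activity, which is exactly the dark-pattern smell discussed below.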
Facebook's list of notification settings also speaks volumes. When a product sends notifications for external activity that you aren’t responsible for in any way, it is a pretty good indicator of a dark pattern in UX design. As gatekeepers during product development, it is our responsibility to think deeply about the experiences we create and to consider whether they are positively impacting our end users’ lives.
Anyone building digital products today should take a responsible stance towards building potentially addictive experiences. We need products built on true value and the genuine success of customers, not low-barrier behavioural growth hacks that are damaging at best.