The general ethos of UX design is to provide a user experience that is both pleasant and intuitive, one that fosters a positive perception of the brand and service. In essence, UX is for the users, hence the name! But a set of malicious practices intended to confuse and manipulate the user looms like an ominous dark cloud.
As part of our UX month at MadeFor., this is the first in a series of three articles on our site concerning dark UX and its moral and ethical implications. This article will focus on the ways in which dark UX rears its head in our daily lives. The second will focus on the financial implications of dark UX’s integration into e-commerce. And finally, the third will open a moral and ethical discussion on dark UX’s place within our modern lives.
In our modern digital landscape, a strong case could be made that UX designers wield some of the greatest influence in the sector. User experience is an inherently malleable concept in the hands of its designers. At its best, it can be moulded into an efficient, intuitive and genuinely beneficial experience for anyone interacting with the service.
Conversely, lurking in the shadows are UX practices that amount to digital malpractice. Swapping simplicity and efficiency for misdirection and frustration, dark UX is exactly what it sounds like: dodgy tactics designed to confuse, exasperate and manipulate the userbase. The contrast between the beneficial intentions of UX and these dark practices is stark and disheartening!
Harry Brignull defined this nefarious concept as ‘dark patterns’ in 2010. On his website, deceptive.design, he defines them as such:
“Deceptive design patterns (also known as "dark patterns") are tricks used in websites and apps that make you do things that you didn't mean to, like buying or signing up for something. […] When you use websites and apps, you don’t read every word on every page - you skim read and make assumptions. If a company wants to trick you into doing something, they can take advantage of this by making a page look like it is saying one thing when it is in fact saying another.”
Dark UX has evolved over the last 15 years alongside the development of positive and enjoyable UX design, and now rears its head in almost every aspect of online life.
When was the last time you killed time playing a free mobile game without being goaded into spending money on extra lives? Or how about trying to close a pop-up ad, only to be misdirected by a fake or hidden opt-out button?
The ways in which companies and brands attempt to manipulate and take advantage of your online presence for their own benefit in our present day seem almost limitless: such is the world we live in! We can guarantee that you’ve experienced countless instances of dark UX in your lifetime, but here are a few specific examples of these nefarious practices.
We witness the power of persuasion every single day – from advertising to our daily human interactions, those who possess the art of persuasive language have the skill to convince others to do exactly what they want them to do.
Of course, this skill transfers easily to the field of dark UX, as that’s exactly the plan at the end of the day – getting users to interact with a service the exact way the company desires, by any means necessary.
The legal grey areas surrounding UX design effectively allow for any sort of language to be used. This enables language that aims to guilt or goad users into complying with the company’s mission – and this is where the morally dubious practice of confirmshaming creeps in.
Confirmshaming is the act of guilt-tripping a user through calculated language that nudges them into unwitting compliance with the brand’s intentions. Oftentimes, such language intentionally preys on the FOMO (fear of missing out) that marketing tactics and social media have helped instil in the human psyche.
Imagine that your computer notifies you with a pop-up that your operating system has just received a new update that is immediately available for download. The intention is obviously for you to update immediately. So, the option to update will be designed to be inherently more attractive than not updating. The options could be a “Yes” and a “No, I don’t want to experience new features”. Or even worse, there could be no option to decline at all, and instead a “Remind Me Later” option for the rest of time!
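For illustration only, here’s a minimal TypeScript sketch of how such a dialog’s copy might be modelled and flagged. The labels and the flagging heuristic are entirely invented, not taken from any real product:

```typescript
// Hypothetical dialog shapes illustrating confirmshaming copy.
interface UpdateDialog {
  prompt: string;
  confirmLabel: string;
  declineLabel: string | null; // null = no way to decline at all
}

const confirmshamed: UpdateDialog = {
  prompt: "A new update is available.",
  confirmLabel: "Yes, update now",
  declineLabel: "No, I don't want to experience new features",
};

const neutral: UpdateDialog = {
  prompt: "A new update is available.",
  confirmLabel: "Update now",
  declineLabel: "Not now",
};

// Naive heuristic: a decline option is suspect if it is missing
// entirely, or if it editorialises about what the user will lose.
function looksLikeConfirmshaming(d: UpdateDialog): boolean {
  if (d.declineLabel === null) return true;
  return /don't want|miss out|remind me later/i.test(d.declineLabel);
}

console.log(looksLikeConfirmshaming(confirmshamed)); // true
console.log(looksLikeConfirmshaming(neutral));       // false
```

The tell is always in the decline path: a neutral “Not now” gives the user a genuine exit, while a shamed or absent decline option does not.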
If user conversion or retention is on the agenda, you can guarantee that confirmshaming is the tool for the job in the dark UX wheelhouse. However, the brazen nature of the tactic means that many users see straight through it and view the companies that use it negatively.
In a recent study, Ekroth and Sandqvist note that “users understand it is not directed towards them personally, that it is an obvious, see-through tactic that the users will not fall for, and because they have seen it too many times before to be affected”.
However, this dark UX tactic would not still be employed if it did not achieve a certain degree of success: users must still stay vigilant against these guilt trips!
Privacy Zuckering is a term that is honestly a little bit funny when you first hear it: what on earth is zuckering and what could such a silly looking word mean? We’re afraid that the true definition is disappointingly devious, but this is dark UX design after all!
Privacy Zuckering is the term for a user interface that intentionally deceives you into offering up more of your personal data than you originally intended. It’s named after Meta’s Mark Zuckerberg for good reason: Facebook and its obtuse privacy settings are the poster boy for this type of infringement. Brignull states that:
“In its early days, Facebook had a reputation for making it difficult for users to control their privacy settings, and generally making it very easy to "overshare" by mistake. In response to feedback from consumers and privacy groups, Facebook has created a clearer, easier to use privacy settings area.”
This was zuckering in its earliest form, but the term has since evolved far past the annoyance of accidentally displaying your personal details to those outside your Facebook friend circle. Today it extends much wider, encompassing the problematic selling of personal information by data brokers to third-party companies.
Services like Facebook derive the legitimacy to collect and use your personal data in these ways from Terms & Conditions flows that, much like confirmshaming, intentionally goad users into handing over their personal data for third-party usage.
Here, you can see Facebook requesting permission to use its facial recognition features. The “Accept” button, already highlighted in blue, encourages the user to press it without considering the terms and conditions relating to the facial recognition data being gathered.
As seen in the second image, the comparatively less flashy option of “Manage Data Setting” leads to another choice of two options – a window that could have been avoided entirely by including an option to decline in the first place. Companies like Facebook seize opportunities like this to ensure they can collect their users’ data, by doing their very best to make the procedure opt-out rather than opt-in.
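The opt-out/opt-in distinction boils down to default state. A minimal sketch, assuming a hypothetical settings shape (none of these names come from Facebook’s actual interface or API):

```typescript
// Hypothetical consent-setting shape, invented for illustration.
interface ConsentSetting {
  name: string;
  enabledByDefault: boolean;
}

// Opt-out flow: collection is on before the user does anything.
const optOutFlow: ConsentSetting[] = [
  { name: "facialRecognition", enabledByDefault: true },
  { name: "adPersonalisation", enabledByDefault: true },
];

// Opt-in flow: nothing is collected until the user explicitly agrees.
const optInFlow: ConsentSetting[] = [
  { name: "facialRecognition", enabledByDefault: false },
  { name: "adPersonalisation", enabledByDefault: false },
];

// A flow respects opt-in consent only if every setting starts off.
const respectsOptIn = (settings: ConsentSetting[]): boolean =>
  settings.every((s) => !s.enabledByDefault);

console.log(respectsOptIn(optOutFlow)); // false
console.log(respectsOptIn(optInFlow));  // true
```

In an opt-out flow, doing nothing means consenting – which is exactly why companies prefer it.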
It’s undeniable that doing well in a gaming scenario – whether it be sports, video games, or gambling – produces a dopamine rush in our brain. As many of us understand, it’s undeniably easy to get addicted to this quick hit to the synapses.
So perhaps one of the most alarming tactics of dark UX is the gamification of interfaces. Gamification harnesses these positive reactions in our brains, encouraging users to chase that dopamine boost and commit to actions that serve the company’s wider goals.
For example, you’re bored, scrolling through your social media of choice. You feel that you’ve exhausted all the content available on your timeline, news feed, what have you, so you pull up on your phone screen and refresh the page. Suddenly, a delightful wealth of ‘new’ content becomes available to you! Then you exhaust that content, and the process repeats itself.
This is gamification in practice; akin to hitting the slots at a casino over and over hoping for something more substantial, this is how UX interfaces can keep you constantly glued to and interacting with them.
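The slot-machine comparison can be sketched as a toy variable-reward loop. Every number and name here is invented purely to illustrate intermittent reinforcement:

```typescript
// Toy model of a pull-to-refresh feed with a variable reward.
// The point is the unpredictable payout schedule, the same
// schedule slot machines use.
function makeFeed(seed: number) {
  let state = seed;
  // Small deterministic pseudo-random generator (MINSTD-style),
  // used so the sketch behaves the same on every run.
  const next = (): number => {
    state = (state * 48271) % 2147483647;
    return state / 2147483647;
  };
  return {
    refresh(): number {
      // Sometimes nothing new, sometimes a big batch of posts:
      // the uncertainty itself is what keeps users pulling.
      return next() < 0.3 ? 0 : Math.floor(next() * 10) + 1;
    },
  };
}

const feed = makeFeed(42);
const batches = Array.from({ length: 5 }, () => feed.refresh());
console.log(batches); // variable batch sizes, some possibly 0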
Gamification is everywhere in modern UX, and whilst adding rewards and progression to things like fitness and to-do list apps is undeniably fun, it has a darker side. Ilia Markov notes:
“The biggest challenge with gamification is that it creates a level of micro-stress that adds to an already hectic and stress-inducing daily routine. […] Because our brains respond so well to the rewards of gamification, they also respond negatively when we fail to get the reward.”
The integration of gamification into modern UX is fully intended to evoke addiction and compulsive behaviour, creating an attachment to the service that can be extremely difficult for users to break. The adverse effects can be so detrimental that it may be worth forgoing the gamification elements within apps and services entirely, in order to avoid the associated negative stimuli.
In our present day, it’s clear that dark UX practices have become integrated into countless aspects of our daily lives. And from a corporate perspective, why not? It gets results, after all.
The act of manipulating and tricking users to directly benefit the wider goals of the company without them realising is ethically dubious at best, and these three methods of confirmshaming, privacy zuckering and gamification are leading the charge in dark UX’s army of malpractice.
The moral questions surrounding dark UX trickery only deepen once real financial implications for users become involved. E-commerce is where the truly devilish aspects of dark UX come to light, as the next article will discuss in depth. Keep an eye out for it on the MadeFor. website!