In our last two articles on dark UX practices, we’ve seen the variety of methods deployed across today’s digital landscape to trick, manipulate and misdirect users into doing things they never intended. Essentially, forget ‘honesty is the best policy’ whenever these practices are in use!
From confirmshaming to roach motels, it’s clear that there’s some serious shadiness in how the companies that incorporate these practices treat their users. There’s also huge uncertainty about how to deal with them: no legislation is currently in force to officially declare whether these processes are ethically or morally justifiable. The answer remains steeped in confusion, hidden in muddy waters and legal grey areas.
It’s difficult to come to terms with the idea that these practices are ethically unjust and inherently damaging when you consider how deeply intertwined they have become with UX design and the creation of digital services. They’ve effectively become part of the playbook, with even smaller-scale companies putting them into practice, perhaps without realising!
A 2019 study from Princeton University on the modern prevalence of dark patterns reinforced these thoughts to a startling extent. Of the roughly 11,000 websites in its data set, over 11% included some form of dark UX that relied on user deception. The researchers also found that the more popular ecommerce websites were the most likely to feature dark UX.
This normalisation of dark UX practices is troubling when you take a step back and view it from an ethical standpoint. It is simply wrong to misdirect users into actions they didn’t intend in order to put the company’s interests first.
Using these tactics is also ethically unjust because of the negative effects they have on users and the standard they set for the wider industry. They undoubtedly portray the company using them in a poor light: a business that feels comfortable with underhanded tactics is the antithesis of ‘user-friendly’ design.
As we noted in our last dark UX article, their usage in an ecommerce setting is especially troubling because of the financial implications for the user. A reliance on user deception and manipulation for short-term gain isn’t ethically sustainable: it shatters brand loyalty and customer trust.
So, we’ve established that these dark UX patterns are ethically shady at best and immoral at their very worst. And yet they’re still employed far and wide across our modern digital landscape, particularly ingrained into the fabric of larger companies. What can be done to prevent this, or at the very least make known the effects of these patterns?
On an individual level, Harry Brignull (the original anti-dark UX brigadier!) states that the best way to avoid these dark tactics is to remain keenly aware of the tricks and to shame the companies who use them in public online forums.
On a wider industry level, the hope for change lies with governing legal bodies, and there have been significant steps in the right direction to disallow these dark patterns and regulate the companies making use of them.
In an interview with Marketplace, Brignull notes that in the last few years, the conversation around dark UX has thankfully exploded, with legal action being taken against its many forms. For example, the California Privacy Rights Act takes aim at companies who manipulate and trick users into providing private data with the intention of selling it to third parties.
In Europe, the Digital Services Act aims to forbid multiple forms of dark UX. It’s not yet been introduced – as Brignull says, helping users without overly damaging business is a very fine line to walk – but the Act’s existence signals an overt desire for a digital future where dark UX is less prevalent and less damaging to the average user. On these impending laws, Brignull stated:
“Well, I think there’s going to be a bit more regulation. And there’s gonna be some more laws. And they’ll probably be some good ones written and some bad ones written. […] And I think, obviously, when the laws change, and they know that they’re up for really big fines or big problems, they’re going to pay attention. If there’s one language that businesses speak, it’s the language of money. And I think that’ll be very good for consumers who will then have a safer worldwide web and safer app stores to use.”
In our eyes here at MadeFor., it’s clear that these black hat tactics are backhanded, deceptive and outdated. They were conceived for a digital climate that was not user-centric, one that placed short-term profit over long-term user happiness, brand reputation and loyalty. The user experience landscape has shifted dramatically since then, and users genuinely deserve better from the products and services they use.
The future is well and truly open-ended for legislation surrounding dark UX to come into effect. The hopeful outcome is indeed a safer digital environment for users, but if there’s one thing we’ve learned, it’s that these practices are evasive and chameleonic.
Regulations will have to be suitably airtight to close off loopholes ready to be abused, but will that rigidity end up becoming overly detrimental to user and company alike? Only time will tell, but the future looks bright for UX design to finally shun these dark tactics.