The dark side of UX design uses design choices to steer people into decisions they wouldn’t make under normal circumstances. Subscribing after a visual cue is straightforward; it’s opting out that hurts. The opt-out button is either hidden on pages people don’t read or made smaller than the subscribe button.
An example of this is when you think you’ve canceled a subscription, yet after some time you’re billed anyway. I remember canceling my Spotify subscription. On the website, there’s no way to reach their representatives; they urge you to contact them via their Twitter handle.
After canceling the subscription, I couldn’t find a way to remove my card from their file, in case they chose to bill me under a clause I hadn’t read (like I said, info hidden on pages people don’t read).
I messaged them on Twitter and called them out for using dark patterns in UX design. There’s a visible “Renew” subscription button popping out on the website, but no way to clear your card data. The representative said they were aware of it and would communicate my feedback to their UX design team.
I mean, the balls! Spotify uses deception on purpose, and it isn’t only them; many companies do it.
Types of dark patterns in UX design
There are various types of dark patterns in UX design. Some of the most-used fall under the following broad categories:

- Nagging
- Obstruction
- Sneaking
- Interface interference
- Forced action
1. Nagging

With nagging, you’re continually pestered about some functionality, and it isn’t limited to a single interaction. Google Location Services, Instagram, and Uber are all examples.
Google Location Services spams you by repeatedly asking for location permission. The only way out is to grant it: you can’t tick the “don’t show again” checkbox before you’ve granted the said permission.
The Instagram app asks you to turn on notifications, or it pesters you the next time you open it. The notification popup offers “Not now” and “OK” as its options, meaning you can’t actually refuse to let the app show you notifications.
Uber discourages you from stopping for the day by displaying some arbitrary income target, say $40, and presenting you with two options: “keep driving” or “go offline”. A driver may feel enticed to carry on driving.
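The Google Location Services flow above can be sketched as a tiny state model. This is a minimal, hypothetical TypeScript sketch, not Google’s actual API; all class and method names are made up for illustration:

```typescript
// Minimal sketch of a nagging permission prompt (hypothetical API).
type PromptResult = { granted: boolean; dontShowAgain: boolean };

class NaggingPrompt {
  private granted = false;

  // The "don't show again" checkbox is only interactive once permission is granted.
  get dontShowAgainEnabled(): boolean {
    return this.granted;
  }

  respond(grant: boolean, tickDontShowAgain: boolean): PromptResult {
    if (grant) this.granted = true;
    // Ticking a disabled checkbox is silently ignored.
    const dontShowAgain = tickDontShowAgain && this.dontShowAgainEnabled;
    return { granted: this.granted, dontShowAgain };
  }

  // The prompt reappears unless the user both granted permission and suppressed it.
  willNagAgain(last: PromptResult): boolean {
    return !(last.granted && last.dontShowAgain);
  }
}
```

The point of the sketch: declining never unlocks “don’t show again”, so the prompt is guaranteed to return until you give in.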
2. Obstruction

Obstruction comes down to making it difficult for the user to take an intended action; the design seeks to dissuade that action. You can further classify Obstruction into Roach Motel, Price Comparison Prevention, Intermediate Currency, and so on.
The Ladder: Premium Job Site asks users to sign up to browse jobs. But to apply to those jobs, you have to upgrade from the free (Basic) plan to a Premium one. This falls under the “Price Comparison Prevention” dark side of UX design, and also “Forced Continuity”: the user follows the whole procedure in the hope of applying for a job, only to be stopped by a paywall.
The 1-month Premium plan continues to bill you even after the month expires. That’s stated in tiny print, so you’d have to focus to read it.
A “Roach Motel” example is the Boston Globe, which makes it hard to cancel a subscription. A user has to dig through the FAQs only to find out that the way to cancel is to contact the Boston Globe over the phone.
Similarly, iOS hides the option to disable ad tracking in an unlikely corner of the Settings menu. The option itself uses the dark side of UX design: it asks whether you want to “Limit Ad Tracking” by turning a toggle on. This is known as “Trick Questions”, and either way you can’t opt out of ad tracking completely.
3. Sneaking

Sneaking, as practiced here, is the “Sneak into Basket” technique. For example, search for a domain name on Namecheap, say “floweracne” (it came to me randomly while writing this post).
As you can see, when I click Add to cart, the price displayed is $12.58, yet next to the Checkout button it shows $12.77.
Why? When you click View Item, Namecheap lets you know about the ICANN fee, which is compulsory; there’s nothing you can do about it.
It’s a meager amount, but what I’m trying to establish is how Sneak into Basket works.
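The cart behavior above can be modeled in a few lines. A minimal TypeScript sketch with hypothetical types; the fee value is simply the $0.19 difference between the two prices shown, standing in for the compulsory ICANN fee:

```typescript
// Minimal sketch of "Sneak into Basket": the listed price omits a
// mandatory fee that only surfaces near the Checkout button.
interface LineItem {
  name: string;
  listedPrice: number;   // what the Add to cart button shows
  mandatoryFees: number; // e.g. the compulsory ICANN fee
}

const displayedPrice = (item: LineItem): number => item.listedPrice;

const checkoutTotal = (cart: LineItem[]): number =>
  cart.reduce((sum, i) => sum + i.listedPrice + i.mandatoryFees, 0);

const cart: LineItem[] = [
  { name: "floweracne.com", listedPrice: 12.58, mandatoryFees: 0.19 },
];
// Product page shows 12.58; checkout shows 12.77.
```

The pattern lives entirely in the gap between `displayedPrice` and `checkoutTotal`: nothing is technically hidden, but the full number appears only at the last step.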
You can classify most services that use the freemium model under Forced Continuity. Apps everywhere, such as Spotify, tell you to try the first month free. And once the free month is over, the service charges you.
Why is it among dark UX pattern examples when it’s just how SaaS works? Because the banner draws all the attention to FREE!, while the fact that the service charges you at the end of the month is spelled out in small text. You can easily miss it if you aren’t paying attention. Pretty sneaky.
Another tactic is “Hidden Costs”. On next.co.uk, a radio button is marked by default and says your first directory is free. But if you read the fine print, you end up consenting to a credit check and to a credit account being opened. Next UK then sends you brochures at £3.75 each, four times a year.
“Trick Questions” is used brilliantly by Time Warner Cable’s chat feedback. The rating menu jumps to “10” when you type “1”. You’ll notice most of these techniques overlap; you could refer to this one as “Bait and Switch” too.
4. Interface Interference
As implied, the UI wants you to prefer one option over the other. “Audible” and the Spotify example above certainly fit the criteria. But let’s consider another example. Ever visited a website that asks you for cookie consent?
Of course you have. The popup normally offers “Accept cookies” and “Manage settings”. In your bid to deny consent, you click the latter. I have, God knows how many times. Each time, the menu shows the essential-cookies toggle “greyed out”, a.k.a. disabled; the other options you can switch off.
The point is, it gives you a false sense of achievement, as though you’ve denied cookie consent. Yet the cookies under the “essential” label are enough to follow you around the web, and that’s what you didn’t want in the first place.
Interface Interference is designed to confuse and complicate the steps so that the user can’t take the desired action.
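The cookie banner described above can be modeled as follows. This is a minimal, hypothetical TypeScript sketch (the category names are assumptions, not any real consent library):

```typescript
// Minimal sketch of a cookie-consent banner where the "essential"
// category's toggle is rendered disabled and cannot be switched off.
type Category = "essential" | "analytics" | "marketing";

class ConsentBanner {
  private prefs: Record<Category, boolean> = {
    essential: true,   // toggle shown greyed out / disabled
    analytics: true,
    marketing: true,
  };

  setConsent(category: Category, value: boolean): void {
    if (category === "essential") return; // silently ignored: toggle is disabled
    this.prefs[category] = value;
  }

  consentFor(category: Category): boolean {
    return this.prefs[category];
  }
}
```

Whatever the user does in “Manage settings”, `consentFor("essential")` stays `true`, which is exactly the false sense of control the pattern relies on.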
Another technique is called “Toying With Emotion”. You guessed it: it preys on your emotions. Let’s say you’re deleting an iOS app. The delete prompt reads something along the following lines:
“Deleting this app will also delete its data, but any documents or data stored in the Cloud will not be deleted.”
And you’re presented with the following two options:
- Cancel
- Delete (in red)
Look how the “Cancel” button comes first. The human eye reads from left to right, and that’s one way dark patterns get introduced into UX design.
5. Forced Action
This technique is best understood through an app on your phone or a website. A particular kind of ad appears to the user as a game: when you hover over it, the target moves with the cursor or your finger. When you try to stomp the spider, you’re taken to a different app on the store, or to a website.
You’re also forced to wait for the ad to finish. A small 20-second timer, sometimes longer, displays at the top right or left of your screen. Only once it finishes are you given the “X” option to close the ad. Most users end up clicking the ad at least once along the way; 90% of the time, I’d say.
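The timed close button can be sketched like this. A minimal, hypothetical TypeScript model, using the 20-second countdown from the example:

```typescript
// Minimal sketch of a forced ad overlay: the close ("X") button
// only becomes available once the countdown has elapsed.
class AdOverlay {
  constructor(private readonly countdownSeconds: number) {}

  closeButtonVisible(elapsedSeconds: number): boolean {
    return elapsedSeconds >= this.countdownSeconds;
  }

  tryClose(elapsedSeconds: number): "closed" | "still-playing" {
    return this.closeButtonVisible(elapsedSeconds) ? "closed" : "still-playing";
  }
}
```

Until the timer runs out, every tap lands on the ad itself rather than on a close control, which is where the accidental clicks come from.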
Privacy Zuckering also takes place. Named after the king of Meta himself, it’s when you’re deceived into sharing more information than you originally intended. For example, marking the “store my card info” checkbox on a website can permit it to share your data with third parties. That’s how online browsing profiles are created and how retargeting thrives.
When did you consent, you ask? In the fine print of their Terms & Conditions.
Check out this exercise to learn about dark UX pattern examples and whether businesses are using ethical design.
How much deception is enough and where do you draw the line?
I mean, you can’t complain; it’s the way of the world today. But if you’re a designer or a retailer, you must ask yourself whether this is how you want to generate sales or drive clicks. Ethical companies aren’t likely to do business with you if you answered “yes”.
Furthermore, the internet has made users smart, for the most part. They seek transparency and loathe treachery (for lack of a better word). GDPR and CCPA are in place to protect users against unauthorized use of their data, or use of data gained without their explicit consent.
In such an online atmosphere, relying on the dark side of UX design will only damage your reputation, and you’ll end up losing customers. Ideally, you should strive for organic means of conversion, and that applies to both business owners and designers. From the looks of it, there isn’t much you can do about dark patterns in UX design, nor is there an alternative; you have to traverse your way through them.
Be conscious when making decisions online (visiting a website or clicking something). Such caution can help you step around dark patterns. Designers build them in the name of creativity, and it’s their responsibility to say “no” to manipulative designs when they’re asked.
Additional reading🔖: Learn to put your writing prowess to use writing text for buttons and tooltips.