We visit different websites every day, navigating them in an intuitive way. Teams of designers work to make websites user-friendly and simple. But if the X icon on an ad is so small that clicking it opens the ad instead of closing it, or if deleting an account is far more difficult than creating one, you are dealing with a dark pattern. Such tricks pressure or mislead users into decisions such as renewing unwanted subscriptions or agreeing to share personal data.
Digital tricks
When you go to the website of a given company, it informs you about the use of cookies and then asks you to “accept” them, which can usually be done by clicking on a big, prominent button. But if you want to reject cookies, you will have to go to the Settings menu and turn them off manually. Most people don’t have the time or desire to do that for every single website they visit.
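The asymmetry described above can be sketched as a simple model. The function name and click counts below are illustrative assumptions for the sake of the example, not taken from any real consent banner or library:

```typescript
// Hedged sketch of cookie-banner friction: accepting takes one click on a
// prominent button, while rejecting means opening settings, switching each
// cookie category off, and saving. All numbers here are assumptions.

type ConsentAction = "accept" | "reject";

function clicksToComplete(action: ConsentAction, cookieCategories: number): number {
  if (action === "accept") {
    return 1; // one big, prominent "Accept" button
  }
  // open the settings menu + toggle every category off + save
  return 1 + cookieCategories + 1;
}

console.log(clicksToComplete("accept", 5)); // 1
console.log(clicksToComplete("reject", 5)); // 7
```

The point of the sketch is that the cost of refusing grows with every extra toggle, while the cost of consenting stays constant, which is exactly what nudges most visitors toward the big button.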
This is an example of a dark pattern, or deceptive design pattern: an interface designed to influence users’ decisions. Researcher Harry Brignull coined the term “dark patterns” in 2010 and has been keeping tabs on them ever since on his website. Manipulation techniques existed long before the Internet, but with its growth and the e-commerce boom, dark patterns have become ubiquitous. Mr. Brignull collects examples in a section called “Hall of Shame”; among the companies named are Google, Apple, Reddit, and Instagram (owned by Meta Platforms, which is designated as extremist in Russia).
Dark patterns are everywhere. When you watch a movie on a video piracy website, a small pop-up ad appears at the bottom of the screen. A bright X icon appears on it; a little later, a similar but smaller, nondescript one. Trying to close the annoying pop-up quickly, you click the bright X, but it turns out to be a link to the advertised product. To actually close the window, you should have clicked the nondescript icon.
When analyzing dark patterns, it is worth introducing the concept of affordance: the ability of an object to hint at how it should be used. For example, the two bows of dressmaking shears, one big and one smaller, make it obvious how the shears should be held. A long bag strap is clearly meant to go across the shoulder. Clues can be obvious or hidden. Affordances in web design let us navigate pages intuitively. Another matter is that this automatic behavior plays into the hands of unscrupulous companies: a bright Accept button does not necessarily mean that accepting is the right choice.
Rejection troubles
The interface is not the only place where dark patterns hide. The term also covers the substitution of concepts. Because social networks use the words “activity” and “personalized” instead of “surveillance” and “targeting,” users are often not fully aware of what they are giving permission for.
Dark patterns often cost users money directly. A user who buys a cheap short-term subscription to watch a movie or read an article is unlikely to be notified that the subscription has expired or will renew next month.
The same goes for claims of high demand or limited stock. “We recommend that you hurry up and place your order. Only a few items left in stock,” warns an online store. Dark patterns create an artificial frenzy and push customers into wasting money on products they do not need.
Think of airline websites that offer travel insurance or extra baggage allowance even when the customer has already chosen a ticket that does not provide for such options.
The so-called Privacy Zuckering technique is typically used by services to collect personal data. The pattern is named after Mark Zuckerberg, founder of Facebook (owned by Meta Platforms, which is designated as extremist in Russia), because of the sheer number of dark patterns in the company’s app. In 2019, Facebook was fined $5 billion in connection with such manipulation techniques. The interface appeared to let users restrict access to their personal information, but the feature turned out not to work. The investigation began after the personal data of millions of users leaked online.
In some cases, the obstacles designed to steer people toward the option that suits the company are extremely hard to overcome. To delete your Amazon account, for example, you have to contact the marketplace directly: fill in several forms, send an email requesting closure of the account, respond to a confirmation notification, and read through the arguments for why you should not do it.
Dark patterns are an unethical way of interacting with users. Lior Jacob Strahilevitz and Jamie Luguri of the University of Chicago Law School studied their effect. They asked members of two groups to sign up for an insurance plan: some were shown an “honest” interface, others dark patterns. The manipulative options won. The researchers then examined how the users who had purchased the insurance under pressure felt. They turned out to be angry and annoyed, and some preferred to abandon the interface even though it meant forfeiting money they could have earned for completing the survey.
How to eliminate dark patterns?
There is now a growing movement to ban dark patterns. Since March 15, 2021, the California Consumer Privacy Act, for instance, has been prohibiting companies from using dark patterns that can mislead users and force them into sharing personal information.
Earlier, the UK Competition and Markets Authority banned property rental websites from using dark patterns that create an illusion of high demand.
In Russia, there are no such regulations today, but broadly speaking, the law on “Consumer Rights Protection,” for instance, prohibits hard selling, which can cover manipulating people into buying extra products. Customers also have the right to refuse to share their personal information when purchasing goods or services, which hampers data collection. In practice, however, few companies comply with these rules.
Beyond regulation, the best defense is attentiveness and consumer literacy. “If you know what cognitive biases are and the kind of tricks that can be used to change your mind to persuade you to do things, then you’re less likely to have them trick you,” Mr. Brignull believes.
The researcher also recommends calling out companies publicly. If a site is widely criticized for misleading customers, developers might take steps to correct the design in order to keep customers happy.