Policy Briefing: UX Dark Patterns in Consent and Data Collection

Advertising standards traditionally focused on what companies can and can’t claim about their offerings. Now, thanks to the ubiquity of omnichannel commerce and the many ways companies collect personal information, regulators are just as focused on privacy as a consumer right.

While consumers are generally aware when they’re being sold to, online and off, they don’t always know what they’re really paying for or signing up for. That has made user experience (UX) design trickier for them to navigate.

Conceptually, UX is meant to focus on the user’s needs.

As UX pioneer Peter Morville explained 20 years ago with his User Experience Design Honeycomb, good UX should make products, services, and systems useful, usable, desirable, findable, accessible, credible, and – most importantly – valuable. His thinking was that when companies address most or all of those needs, they’ll win and keep more customers.

But these days it seems many companies are focusing more of their UX efforts on ‘dark patterns’ designed to generate quick wins for themselves, rather than addressing consumer needs.

A Reuters report on July 30, 2021, flagged dark patterns as a new frontier in privacy regulation, noting: “In the tech industry, it has become commonplace to measure product success through user engagement,” with the reporter arguing this “led to a singular business focus on growth at all costs, which as a result may gloss over or even incentivize use of manipulative practices in such pursuit”.

This kind of mainstream media reporting on dark patterns in recent years shows just how prevalent dark patterns in UX have become – but in some places at least, the law is beginning to catch up.

Dark pattern examples

Manipulative UX practices – dark patterns – typically keep users in the dark about what’s happening during online interactions. So, consumers might not be aware their privacy, online safety and/or consumer rights have been violated until after they experience harm, such as financial losses.

Common examples of dark patterns identified by Australian consumer advocacy organization Choice include:

  • Hidden costs – pre-selected add-ons (for example, extended warranties) automatically added to a user’s online shopping cart along with their purchase choice/s, which aren’t revealed until checkout. The user must then identify and remove any add-on costs they didn’t choose earlier in the transaction.
  • Confirmshaming – using coercive words to shame people into confirming actions they might otherwise feel go against their own preferences or interests. For example, a large email subscription pop-up with a discount offer if the user agrees to subscribe and share more personal information, versus a small opt-out message like “No thanks, I prefer to pay more for…(product or service type)”.
  • Forced continuity, roach motels, and ‘Hotel California’ tactics – complex and confusing navigation processes that make it very difficult to opt out of marketing or collection of personal data, or very challenging to cancel an automatically charged paid subscription after a ‘free trial’. (‘You can check out any time you like but you can never leave.’)
  • Trick questions – messages using double negatives and other confusing words designed to trick users into confirming a choice they might not otherwise want to make. This trick often also makes the alternative options unclear or hard to find. For example, making the choice to consent to collection of personal information a simple ‘accept all,’ while other options are confusing or require reading through convoluted forms to unselect multiple options.
  • Scarcity cues and deception – displaying supposedly real-time messages generating a sense of urgency to pay for items before they run out (or disappear). For example, countdown timers showing a sale will end soon, or messages about scarce online game items that will help players level up (or survive a round) only available for a limited time.
  • Data grabs and default permissions – pre-setting privacy controls to be more permissive or potentially less safe options by default; or forcing users to share more personal information upfront (such as completing a detailed customer profile) before they can access a website, game, or service.
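The ‘hidden costs’ pattern above can be made concrete with a small sketch. Everything here is illustrative: the item names, prices, and the `LineItem` shape are invented for this example, not any real storefront’s API.

```typescript
// Hypothetical cart model illustrating the "hidden costs" dark pattern.
// All names and prices are invented for illustration.
interface LineItem {
  name: string;
  price: number;         // in cents
  userSelected: boolean; // true only if the user explicitly added the item
}

// A dark-pattern cart pre-selects an add-on the user never chose.
const darkPatternCart: LineItem[] = [
  { name: "Headphones", price: 9900, userSelected: true },
  { name: "Extended warranty", price: 1999, userSelected: false }, // sneaked in
];

// Total as charged at checkout vs. total the user actually chose.
function checkoutTotal(cart: LineItem[]): number {
  return cart.reduce((sum, item) => sum + item.price, 0);
}

function userChosenTotal(cart: LineItem[]): number {
  return cart
    .filter((item) => item.userSelected)
    .reduce((sum, item) => sum + item.price, 0);
}

// The gap between the two totals is the hidden cost the user must
// notice and remove before completing the transaction.
const hiddenCost = checkoutTotal(darkPatternCart) - userChosenTotal(darkPatternCart);
console.log(hiddenCost); // 1999 cents: the warranty the user never asked for
```

A transparent design would simply start every add-on with `userSelected: false` and exclude unselected items from the charge, so the two totals never diverge.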

Key privacy and consumer rights laws prohibiting dark patterns

The Federal Trade Commission (FTC) Act

The FTC frequently alerts consumers when consumer protection rules are at risk. On September 15, 2022, it raised the alarm about dark patterns in a staff report titled Bringing Dark Patterns to Light.

The report reiterated the FTC’s commitment to combatting “tactics designed to trick and trap consumers,” including digital design features and functions that can “trick or manipulate consumers into buying products or services or giving up their privacy.”

The Federal Trade Commission Act prohibits unfair or deceptive acts and practices in commerce, including ecommerce. Under the Act, acts or practices are unfair if they “cause or are likely to cause substantial injury to consumers that consumers cannot reasonably avoid themselves and that is not outweighed by countervailing benefits to consumers or competition.”

The Restore Online Shoppers’ Confidence Act prohibits any post-transaction online third-party seller from “charging any financial account in an Internet transaction unless it has disclosed clearly all material terms of the transaction and obtained the consumer’s express informed consent to the charge. The seller must obtain the number of the account to be charged directly from the consumer.”

California Privacy Rights Act (CPRA)

The California Privacy Rights Act amendments to the California Consumer Privacy Act (CCPA) introduced several obligations for businesses, service providers, contractors, and third parties when they collect, manage and/or disclose personal information (including sensitive personal information).

The CPRA became effective on January 1, 2023, and is enforceable by the California Privacy Protection Agency from July 1, 2023.

The CPRA prohibits any “user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation.”

Consent mechanisms must ensure users can make informed choices about exercising their privacy rights, such as opting in to or out of the sharing or sale of their personal data. The CPRA section on consent explicitly states: “Agreement obtained through use of dark patterns does not constitute consent.”
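That rule suggests a defensive check on stored consent records: consent gathered through a pre-checked box or without an affirmative user action should be treated as no consent at all. This is an illustrative sketch only; the `ConsentRecord` fields are invented for the example, not part of any CPRA-mandated schema.

```typescript
// Hypothetical consent record; field names are invented for illustration
// and are not part of any CPRA-mandated schema.
interface ConsentRecord {
  granted: boolean;            // what the stored flag says
  preChecked: boolean;         // was the choice pre-selected for the user?
  explicitUserAction: boolean; // did the user actively make the choice?
}

// Treat consent as valid only when it reflects an affirmative,
// un-defaulted user choice, mirroring the CPRA rule that agreement
// obtained through dark patterns does not constitute consent.
function isValidConsent(record: ConsentRecord): boolean {
  return record.granted && record.explicitUserAction && !record.preChecked;
}

// A pre-checked box the user never touched is not consent:
console.log(isValidConsent({ granted: true, preChecked: true, explicitUserAction: false })); // false
// An explicit, un-defaulted opt-in is:
console.log(isValidConsent({ granted: true, preChecked: false, explicitUserAction: true })); // true
```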

Colorado Privacy Act (CPA)

The Colorado Privacy Act, which is effective from July 1, 2023, delivers many of the same personal data privacy rights as the CPRA and places similar privacy protection and user consent obligations on businesses that collect, manage and/or disclose the personal information of people in Colorado.

Similarly, it prohibits dark patterns, which are defined in the CPA as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.”

How the FTC is combatting UX dark patterns in the U.S.

The FTC’s press release accompanying its report Bringing Dark Patterns to Light mentions several cases it has brought against companies for using dark pattern tactics.

The FTC’s biggest dark patterns case so far: Fortnite/Epic Games

Soon after publishing the Bringing Dark Patterns to Light report, the FTC pursued a massive case against Epic Games. It charged the maker of popular online game Fortnite with using “dark patterns to trick players into making unwanted purchases and let children rack up unauthorized charges without any parental involvement.”

The FTC reported Epic used a variety of dark patterns in the UX design of Fortnite to drive unintended and unauthorized charges including:

  • Designing a counterintuitive, inconsistent, and confusing button configuration which led players to incur unwanted charges based on the press of a single button.
  • Designing an in-game item purchase system which made it easy for children to buy items while playing, and without requiring any parental consent.

The FTC also found Epic punished users who disputed unauthorized charges with their credit card companies by locking them out of their accounts.

And in a separate case, the FTC alleged Epic Games violated the Children’s Online Privacy Protection Act (COPPA) Rule by collecting personal information from Fortnite players under 13 years of age without notifying their parents or gaining parental consent.

The FTC also alleged Epic Games did not protect children from potential privacy invasions and harm while they played Fortnite because:

  • Voice and text communications were live and on by default.
  • Children and teens were allowed to be matched with strangers when playing Fortnite.

The FTC alleged these default privacy settings in Fortnite caused children and teens to be “bullied, threatened, harassed, and exposed to dangerous and psychologically traumatizing issues such as suicide” while playing matches.

And it also alleged some parents’ requests for their children’s personal information to be deleted were significantly delayed (further exposing children to privacy risks) or, in some cases, not honored at all.

On March 14, 2023, the FTC announced both cases against Epic Games were finalized, with orders against the game maker totaling $520 million in penalties and refunds, along with enforced actions including:

  1. $245 million refund settlement to be distributed by the FTC to Fortnite users tricked by dark patterns into making unintended charges.
  2. Ban on dark patterns or other methods of charging consumers for purchases without first getting their affirmative consent.
  3. Ban from blocking consumers’ access to their accounts when they dispute unauthorized charges.
  4. $275 million penalty to settle FTC allegations Epic Games had violated the COPPA Rule – the largest penalty to date for violating an FTC rule.
  5. Updates to privacy settings to comply with the COPPA Rule, including turning off voice and text communications by default for children and teens, with a ban on enabling those communications unless parents (for children under 13) or teenage users (or their parents) give affirmative consent through a privacy setting.
  6. Deletion of all personal information previously collected from Fortnite users in violation of COPPA (that is, without parental notification and consent).
  7. Establishment of a privacy program that meets COPPA compliance and prevents the privacy and consumer protection issues identified by the FTC.
  8. Regular, independent audits to verify that consumer protection and privacy requirements are being met.
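Item 5’s off-by-default requirement amounts to a privacy-by-default configuration: risky features start disabled for minors and are only enabled by an affirmative opt-in. A minimal sketch, assuming invented type and field names (this is not Epic’s actual settings code):

```typescript
// Hypothetical privacy-by-default settings; all names are invented
// for illustration and do not reflect Epic's real configuration.
type AgeBand = "child" | "teen" | "adult";

interface CommunicationDefaults {
  voiceChat: boolean;
  textChat: boolean;
}

// Voice and text chat start off for children and teens; they can only
// be enabled later through an affirmative opt-in (by a parent for
// users under 13, or by the teen or their parent otherwise).
function defaultsFor(age: AgeBand): CommunicationDefaults {
  const isMinor = age === "child" || age === "teen";
  return { voiceChat: !isMinor, textChat: !isMinor };
}

console.log(defaultsFor("child")); // { voiceChat: false, textChat: false }
```

The key design point is that the safe state is the zero-action state: a user who never touches the settings screen is never exposed by default.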

TrustArc helps make your online offerings ‘dark pattern proof’

We expect many more jurisdictions will soon introduce and enforce new privacy regulations explicitly prohibiting dark patterns and other deceptive user experience design tactics.
