{{Incomplete}}


'''[[wikipedia:Dark_pattern|Dark patterns]]''' represent a growing concern in digital interfaces, referring to manipulative design practices that trick or influence users into making decisions that may not align with their true preferences or interests. These techniques exploit cognitive biases and behavioral psychology to benefit businesses, often at the expense of user autonomy. Initially coined by UX designer Harry Brignull in 2010, the concept has evolved into a significant focus of regulatory scrutiny and academic research.<ref name=":0">{{cite web |title=Bringing Dark Patterns to Light |url=https://www.ftc.gov/reports/bringing-dark-patterns-light |archive-date=September 16, 2025 |archive-url=https://archive.ph/TZ5v3 |publisher=Federal Trade Commission |date=September 2022}}</ref><ref name=":1">{{cite web |last1=Brignull |first1=Harry |title=Dark Patterns: inside the interfaces designed to trick you |url=https://www.deceptive.design/ |website=Deceptive.Design}}</ref>


The prevalence of dark patterns is remarkably widespread. A 2019 study examining roughly 11,000 e-commerce websites found that approximately one in ten employed deceptive practices, while a 2022 European Commission report indicated that 97% of the most popular websites and apps used by EU consumers deployed at least one dark pattern.


== Definition and terminology ==
The term ''dark patterns'' was originally defined by Harry Brignull as ''"design tricks that manipulate users into taking actions they didn't intend to."'' The Federal Trade Commission (FTC) describes them as ''"design practices that trick or manipulate users into making choices they would not otherwise have made and that may cause harm."''<ref name=":0"/><ref name=":1"/>


There is ongoing discussion regarding the most appropriate terminology. Alternative labels include ''deceptive design'', ''manipulative UX'', ''coercive design'', and ''anti-patterns''. Some advocates argue for terms like ''deceptive patterns'' to more accurately describe the intentional nature of these designs and to avoid potential racial connotations. Brignull himself has adopted ''deceptive patterns'', rebranding his reference site as deceptive.design.<ref name=":1"/>


What distinguishes dark patterns from merely persuasive design is their exploitative nature – they are not about creating value for users but about benefiting the service provider through manipulation and deception.


== Common types and examples ==
Research has identified numerous specific dark patterns, with one comprehensive study proposing a taxonomy comprising 68 distinct types. These manifest across various industries and digital contexts.


=== Obstruction patterns ===
These designs make desired actions (like rejecting tracking) significantly more difficult than accepting alternatives. A classic example is the ''Roach Motel'' pattern, where signing up for a service is straightforward but cancellation is excessively difficult. The FTC highlighted this pattern in its case against ABCmouse, where cancellation was made "extremely difficult" despite promises of "Easy Cancellation."


=== Interface interference ===
This category includes designs that manipulate interface elements to steer user behavior. Misdirection focuses user attention on one element to obscure another critical detail. Disguised ads blend advertisements with genuine interface elements, like fake "Download" buttons on software websites.<ref name=":0"/>
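
How thin the line between a disguised ad and a genuine control can be is easy to show in code. The following is a hedged sketch; the class names, file path, and ad URL are hypothetical, not taken from any cited case.

<syntaxhighlight lang="typescript">
// Hypothetical disguised ad: the sponsored link reuses the site's real button
// styling, so visitors cannot visually tell it apart from the true download.
function makeButton(label: string, href: string, isAd: boolean): HTMLAnchorElement {
  const a = document.createElement("a");
  a.textContent = label;
  a.href = href;
  a.className = "btn btn-primary"; // identical styling for ad and real control
  if (isAd) a.rel = "sponsored";   // machine-readable hint, invisible to users
  return a;
}

const realDownload = makeButton("Download", "/files/installer.exe", false);
const disguisedAd = makeButton("Download", "https://ads.example.com/click?id=123", true);
</syntaxhighlight>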


=== Forced action ===
These patterns require users to complete unnecessary actions to access desired functionality. Forced registration demands that users create an account to complete a task. Forced continuity involves automatically transitioning users from free trials to paid subscriptions without adequate notification. The FTC alleged that Adobe violated regulations by ''"tricking customers into enrolling in subscription plans without proper disclosure."''<ref name=":0"/><ref name=":8">{{cite web |title=FTC Charges Adobe |url=https://www.ftc.gov/news-events/news/press-releases/2024/06/ftc-charges-adobe-two-company-executives-hiding-early-termination-fees-making-it-difficult-cancel |publisher=Federal Trade Commission |date=June 17, 2024}}</ref>


=== Sneaking and information hiding ===
These practices involve concealing or obscuring material information from users. Hidden costs reveal unexpected fees only at checkout, a practice employed by ticketing platforms. Drip pricing advertises only part of a product's total price initially and then imposes other mandatory charges later.<ref name=":0"/>
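
The arithmetic behind drip pricing is simple to sketch. In the following illustration the fee names and amounts are invented, not drawn from any cited platform.

<syntaxhighlight lang="typescript">
// Hypothetical drip pricing: the listing advertises only the base price, and
// mandatory fees surface for the first time at checkout.
const advertisedPrice = 49.99;

function checkoutTotal(base: number): number {
  const serviceFee = base * 0.18; // revealed only at the final step
  const facilityCharge = 6.5;     // also mandatory, also hidden until checkout
  const processingFee = 4.25;
  return base + serviceFee + facilityCharge + processingFee;
}

// Prints "69.74" against an advertised price of 49.99.
console.log(checkoutTotal(advertisedPrice).toFixed(2));
</syntaxhighlight>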


=== Social proof and urgency ===
These patterns exploit social influence and time pressure to manipulate decisions. False activity messages misrepresent site activity or product popularity. False scarcity creates pressure to buy immediately by claiming limited inventory. Baseless countdown timers display fake countdown clocks that reset when expired.
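
A baseless countdown timer takes only a few lines to implement, which is part of why the pattern is so common. The sketch below is a generic illustration (the function and messaging are illustrative, not from any audited site): the deadline exists only in the visitor's browser and silently restarts whenever it expires.

<syntaxhighlight lang="typescript">
// Hypothetical "baseless countdown": the urgency is fabricated client-side,
// and the timer quietly resets itself once it reaches zero.
function startFakeCountdown(el: HTMLElement, minutes: number = 10): void {
  let deadline = Date.now() + minutes * 60_000;
  setInterval(() => {
    let remaining = deadline - Date.now();
    if (remaining <= 0) {
      deadline = Date.now() + minutes * 60_000; // restart the "expiring" offer
      remaining = minutes * 60_000;
    }
    const m = Math.floor(remaining / 60_000);
    const s = Math.floor((remaining % 60_000) / 1_000);
    el.textContent = `Offer ends in ${m}:${String(s).padStart(2, "0")}`;
  }, 1_000);
}
</syntaxhighlight>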


== Mind tricks and business incentives ==
=== Cognitive biases exploitation ===
Dark patterns effectively manipulate users by leveraging well-established cognitive biases. Default bias describes the tendency to stick with pre-selected options, exploited through pre-ticked checkboxes. Inertia makes users more likely to choose the path of least resistance. Loss aversion, the tendency to weigh potential losses more heavily than equivalent gains, is triggered through messages suggesting users ''may lose functionality'' if they decline certain options.
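
The pre-ticked checkbox, the canonical exploitation of default bias, is equally compact. This is a hedged sketch with an invented field name.

<syntaxhighlight lang="typescript">
// Hypothetical pre-ticked opt-in: inertia converts a user's silence into "consent".
const marketingOptIn = document.createElement("input");
marketingOptIn.type = "checkbox";
marketingOptIn.name = "marketing-consent";
marketingOptIn.checked = true; // dark pattern: agreement is the default state
// A loss-aversion variant pairs unticking with copy such as
// "You may lose access to member discounts" to discourage opting out.
</syntaxhighlight>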


Their effectiveness is enhanced through A/B testing and data analytics, which allow companies to refine dark patterns based on actual user behavior. This data-driven approach represents a significant evolution from earlier deceptive practices.
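
To make this data-driven loop concrete, here is a minimal sketch of an A/B split applied to a confirmshaming decline label. The hashing scheme and metrics hook are assumptions for illustration, not a description of any particular vendor's tooling.

<syntaxhighlight lang="typescript">
// Hypothetical A/B test of a decline-button label, judged purely on conversion
// rate with no measure of whether users understood what they chose.
function variantFor(userId: string): "A" | "B" {
  let hash = 0;
  for (const ch of userId) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  return hash % 2 === 0 ? "A" : "B";
}

const declineLabels = {
  A: "No thanks, I don't want to save money", // confirmshaming variant
  B: "Decline offer",                         // neutral variant
} as const;

function renderDecline(userId: string, logImpression: (variant: string) => void): string {
  const variant = variantFor(userId);
  logImpression(variant); // aggregated conversions decide which label ships
  return declineLabels[variant];
}
</syntaxhighlight>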


=== Incentives and short-term gains ===
The persistence of dark patterns is driven by their effectiveness in achieving short-term business objectives like increased conversion rates. Additionally, the competitive landscape fosters copycat behavior, as companies mimic their rivals' strategies.  


Research suggests these short-term gains often come with long-term consequences. Studies indicate that "once users feel manipulated, they don't just avoid your settings—they avoid your brand." The erosion of trust can have significant business implications.


== Legal and regulatory landscape ==
=== United States framework ===
In the United States, regulation occurs primarily through existing consumer protection statutes. The FTC Act empowers the Federal Trade Commission to take action against "unfair or deceptive acts or practices in or affecting commerce."<ref name=":9">{{cite web |title=FTC Act |url=https://www.ftc.gov/legal-library/browse/statutes/federal-trade-commission-act |publisher=Federal Trade Commission}}</ref>


In October 2024, the FTC amended its Negative Option Rule to include specific requirements for cancellation mechanisms, implementing a "Click-to-Cancel" provision.<ref name=":10">{{cite web |title=FTC Strengthens Negative Option Rule |url=https://www.ftc.gov/news-events/news/press-releases/2024/10/ftc-strengthens-rule-protect-consumers-deceptive-subscription-practices |publisher=Federal Trade Commission |date=October 11, 2024}}</ref>


=== European Union's approach ===
The European approach combines general consumer protection laws with data privacy-specific regulations. While the General Data Protection Regulation (GDPR) doesn't explicitly mention dark patterns, its requirements for valid consent effectively prohibit many deceptive designs.<ref name=":11">{{cite web |title=Guidelines on Dark Patterns in Social Media Platform Interfaces |url=https://edpb.europa.eu/our-work-tools/documents/public-consultations/2022/guidelines-32022-dark-patterns-social-media_en |publisher=European Data Protection Board |date=2022}}</ref>


The Digital Services Act (DSA) and Digital Markets Act (DMA) further address dark patterns by prohibiting practices that "deceive or manipulate" users.<ref name=":12">{{cite web |title=Digital Services Act |url=https://digital-strategy.ec.europa.eu/en/policies/digital-services-act |publisher=European Commission}}</ref>


=== Enforcement cases and penalties ===
Recent years have seen significant enforcement actions:
* Epic Games paid $245 million to settle charges related to deceptive patterns in Fortnite.<ref name=":13">{{cite web |title=Epic Games to Pay $245 Million |url=https://www.ftc.gov/news-events/news/press-releases/2022/12/epic-games-pay-245-million-ftc-refund-consumers-accused-tricking-users-making-unauthorized-charges |publisher=Federal Trade Commission |date=December 19, 2022}}</ref>
* Noom paid $62 million to settle charges regarding deceptive subscription practices.<ref name=":14">{{cite web |title=Noom to Pay $62 Million |url=https://www.ftc.gov/news-events/news/press-releases/2024/03/noom-pay-62-million-settle-ftc-charges-it-misled-consumers-about-its-diet-programs-use-consumer-data |publisher=Federal Trade Commission |date=March 7, 2024}}</ref>
* TikTok received multimillion-euro fines for failing to protect children's data through manipulative consent practices.


== Impact on consumers and businesses ==
=== Consumer harms ===
Dark patterns create multiple forms of harm for consumers, ranging from financial losses to privacy violations and emotional distress. Privacy harms occur when users are manipulated into sharing more personal data than they intended. Emotional and psychological harms include frustration, stress, and feelings of betrayal.<ref name=":0"/><ref name=":11"/>


Vulnerable groups are disproportionately affected. "People with low digital literacy, cognitive impairments, or disabilities often struggle to recognize manipulative designs."


=== Business implications ===
While dark patterns may deliver short-term benefits, they often create long-term risks for businesses. The erosion of consumer trust can have lasting negative impacts on customer retention and brand reputation. Businesses also face increasing regulatory risks as enforcement actions become more common and severe.<ref name=":0"/>
 
== Detection, avoidance, and mitigation ==
=== Technical detection and tools ===
Efforts to automatically detect dark patterns are evolving but face significant challenges. A comprehensive study found that existing tools could identify only 31 of the 68 documented dark pattern types, a coverage rate of roughly 45.6%. The study proposed a Dark Pattern Analysis Framework (DPAF) to address existing gaps.
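
As a rough illustration of what automated detection involves, the heuristic sketch below (an illustration only, not the DPAF tooling itself) flags two of the more mechanically detectable patterns: pre-ticked consent boxes and guilt-laden decline copy.

<syntaxhighlight lang="typescript">
// Hypothetical heuristic scanner for two easily detectable dark patterns.
const CONFIRMSHAMING_CUES = [/no thanks, i/i, /don'?t want to save/i];

function scanForDarkPatterns(doc: Document): string[] {
  const findings: string[] = [];
  // Pre-ticked checkboxes suggest default-bias exploitation.
  doc.querySelectorAll<HTMLInputElement>('input[type="checkbox"]').forEach(box => {
    if (box.checked) findings.push(`Pre-ticked checkbox: ${box.name || box.id || "(unnamed)"}`);
  });
  // Guilt-laden link or button text suggests confirmshaming.
  doc.querySelectorAll<HTMLElement>("a, button").forEach(el => {
    const text = (el.textContent ?? "").trim();
    if (CONFIRMSHAMING_CUES.some(re => re.test(text))) {
      findings.push(`Possible confirmshaming copy: "${text}"`);
    }
  });
  return findings;
}
</syntaxhighlight>

Heuristics of this kind cover only the most mechanical patterns, which is consistent with the limited coverage rates reported in the study.
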
=== Ethical design alternatives ===
Companies can implement ethical alternatives that respect user autonomy. Balanced choice architecture, in which users can decline as easily as they accept, is the ethical counterpart to obstruction patterns. Designers should also implement neutral default settings that do not assume consent.
 
Transparency and clear communication are essential. Companies should provide honest explanations of data practices and costs in clear, understandable language.
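
The following sketch shows what balanced choice architecture can look like in practice; the option names and styling class are illustrative assumptions rather than a prescribed implementation.

<syntaxhighlight lang="typescript">
// Hypothetical balanced consent UI: nothing is pre-selected, and accepting and
// declining carry identical visual weight.
interface ConsentChoice {
  analytics: boolean | null; // null = the user has not decided yet
  marketing: boolean | null;
}

function neutralDefaults(): ConsentChoice {
  return { analytics: null, marketing: null }; // no assumed consent
}

function renderConsentButtons(container: HTMLElement, onChoice: (accepted: boolean) => void): void {
  const options: Array<[string, boolean]> = [["Accept all", true], ["Decline all", false]];
  for (const [label, accepted] of options) {
    const btn = document.createElement("button");
    btn.textContent = label;
    btn.className = "consent-btn"; // same class: equal size, color, and placement
    btn.addEventListener("click", () => onChoice(accepted));
    container.appendChild(btn);
  }
}
</syntaxhighlight>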
 
=== Consumer protection and advocacy ===
Consumer education plays a crucial role. Initiatives like the Dark Patterns Tip Line allow users to report deceptive designs they encounter. Advocacy organizations provide resources to help identify and avoid dark patterns.<ref name=":1"/>
==References==
{{Reflist}}


[[Category:Anti-Consumer_Practices]]
[[Category:Common terms]]
