Introduction
In today’s digital economy, user interface (UI) and user experience (UX) design play a pivotal role in influencing consumer decisions. However, not all designs are user-friendly or consumer-centric. Some are deliberately manipulative — subtly nudging users into actions they did not intend. These manipulative designs are commonly known as Dark Patterns.
Recognising the increasing threat posed by such deceptive practices, the Central Consumer Protection Authority (CCPA) under the Consumer Protection Act, 2019, issued the “Guidelines for Prevention and Regulation of Dark Patterns, 2023”. These guidelines represent a landmark step in regulating misleading digital practices in India.
What Are Dark Patterns?
According to the guidelines, dark patterns refer to:
“Any practices or deceptive design pattern using user interface or user experience interactions on any platform that is designed to mislead or trick users to do something they originally did not intend or want to do, by subverting or impairing the consumer autonomy, decision making or choice.”
In essence, they are designed to exploit psychological biases, reduce transparency, and manipulate consumer behaviour for commercial gain — amounting to misleading advertisements, unfair trade practices, or violation of consumer rights.
Applicability of the Guidelines
These guidelines apply to:
- All digital platforms offering goods or services in India,
- Advertisers, and
- Sellers operating through such platforms.
No person, including any platform, is permitted to engage in any dark pattern practice.
Types of Specified Dark Patterns
The Annexure to the 2023 Guidelines lists 13 types of dark patterns, along with practical illustrations:
1. False Urgency
Creating a false sense of urgency or scarcity to pressurise consumers into making immediate decisions.
Example:
- Showing messages like “Only 2 rooms left!” or “30 people are viewing this product right now!” when such data is either false or misleading.
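To make the mechanics concrete, here is a minimal TypeScript sketch — all names and data shapes are hypothetical, not drawn from the Guidelines or any platform — contrasting a scarcity banner driven by real inventory with one that simply fabricates the numbers:

```typescript
// Illustrative sketch only: Listing and the banner functions are hypothetical.

interface Listing {
  roomsAvailable: number; // actual stock from the inventory system
}

// Dark pattern: the banner is hard-coded regardless of real availability.
function fakeUrgencyBanner(): string {
  return "Only 2 rooms left! 30 people are viewing this right now!";
}

// Compliant alternative: show scarcity only when it reflects real inventory.
function honestUrgencyBanner(listing: Listing): string | null {
  if (listing.roomsAvailable > 0 && listing.roomsAvailable <= 3) {
    return `Only ${listing.roomsAvailable} rooms left at this price.`;
  }
  return null; // no banner when there is nothing genuinely scarce to report
}
```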
2. Basket Sneaking
Adding extra items (such as paid services or donations) to the cart during checkout without the consumer’s explicit consent.
Example:
- Auto-adding insurance, subscriptions, or charitable donations at checkout unless the user manually deselects them.
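A minimal sketch of how this plays out in checkout logic — the item names and prices are hypothetical — contrasting silent addition of a paid add-on with explicit opt-in:

```typescript
// Illustrative sketch only: CartItem and the add-on are hypothetical.

interface CartItem { name: string; price: number; }

const TRAVEL_INSURANCE: CartItem = { name: "Travel insurance", price: 249 };

// Dark pattern: the paid add-on is appended at checkout without being asked for,
// leaving the consumer to notice and remove it.
function checkoutWithBasketSneaking(cart: CartItem[]): CartItem[] {
  return [...cart, TRAVEL_INSURANCE];
}

// Compliant alternative: the add-on enters the cart only on explicit consent.
function checkoutWithConsent(cart: CartItem[], addInsurance: boolean): CartItem[] {
  return addInsurance ? [...cart, TRAVEL_INSURANCE] : cart;
}
```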
3. Confirm Shaming
Using language that induces guilt, shame, or fear to discourage users from opting out.
Example:
- Displaying a message like “I don’t care about helping children” if the user opts out of a charity.
4. Forced Action
Forcing consumers to perform an unrelated action to access a product or service.
Example:
- Requiring users to share their phone contacts or download another app to proceed with the original service.
5. Subscription Trap
Creating barriers to cancel paid subscriptions or hiding cancellation options.
Example:
- Making the unsubscribe option difficult to locate or involving a complex process to cancel auto-renewals.
6. Interface Interference
Designing the UI in a way that prioritises certain actions or conceals others to manipulate decisions.
Example:
- Fading or hiding the “Cancel” button, or redirecting the close (“X”) icon to another advertisement.
7. Bait and Switch
Promoting one product or outcome and delivering another.
Example:
- Advertising a high-quality item at a low price but, at the checkout stage, offering a costlier or inferior alternative citing stock unavailability.
8. Drip Pricing
Hiding elements of cost and disclosing them only at the final stages of the transaction.
Example:
- Advertising a gym membership but revealing mandatory purchase of gloves or shoes only after payment.
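A small illustrative sketch, with hypothetical line items and amounts, showing how the payable total diverges from the advertised price when unavoidable charges are held back until the final step, versus an itemised quote disclosed upfront:

```typescript
// Illustrative sketch only: the line items and amounts are hypothetical.

const LINE_ITEMS = [
  { label: "Gym membership (advertised)", amount: 1999 },
  { label: "Mandatory gloves and shoes", amount: 1200 },
  { label: "Processing fee", amount: 150 },
];

// Dark pattern: only the first component is shown until the final payment step.
function advertisedPrice(): number {
  return LINE_ITEMS[0].amount;
}

// Compliant alternative: every unavoidable charge is itemised before checkout.
function upfrontQuote(): { breakdown: string[]; total: number } {
  return {
    breakdown: LINE_ITEMS.map(i => `${i.label}: ₹${i.amount}`),
    total: LINE_ITEMS.reduce((sum, i) => sum + i.amount, 0),
  };
}
```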
9. Disguised Advertisement
Masquerading ads as genuine content such as news articles or user reviews.
Note: Such advertisements also fall under the 2022 Guidelines for Prevention of Misleading Advertisements.
10. Nagging
Repeatedly interrupting the user with prompts, pop-ups, or requests without their consent.
Example:
- Constantly asking users to download the app, enable notifications, or share personal information.
11. Trick Questions
Using confusing language (e.g., double negatives) to mislead users into making unintended choices.
Example:
- Using phrasing like “Yes, I don’t want to receive updates” which creates confusion during opt-out processes.
12. SaaS Billing
Exploiting subscription models in Software-as-a-Service (SaaS) to generate recurring charges without clear user consent.
Example:
- Automatically converting free trials into paid plans without notifying the user.
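A minimal sketch using a hypothetical subscription model — the fields and helper functions are illustrative assumptions, not any provider’s API — contrasting a trial that silently converts on expiry with one that bills only after advance notice and explicit consent:

```typescript
// Illustrative sketch only: Subscription and its fields are hypothetical.

type PlanState = "trial" | "paid" | "lapsed";

interface Subscription {
  trialEndsOn: Date;
  notifiedBeforeCharge: boolean; // advance reminder actually sent to the user
  consentToCharge: boolean;      // explicit consent captured from the user
}

// Dark pattern: the free trial silently rolls into a paid, recurring charge.
function silentAutoConvert(sub: Subscription, today: Date): PlanState {
  return today >= sub.trialEndsOn ? "paid" : "trial";
}

// Compliant alternative: bill only after notice and explicit consent;
// otherwise the account lapses rather than being charged.
function convertWithConsent(sub: Subscription, today: Date): PlanState {
  if (today < sub.trialEndsOn) return "trial";
  return sub.notifiedBeforeCharge && sub.consentToCharge ? "paid" : "lapsed";
}
```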
13. Rogue Malwares
Using fake security alerts or ransomware to trick users into downloading malicious software or paying for fraudulent tools.
Example:
- A site offering pirated content that pushes malware onto the user’s device, or locks the user’s data and demands a ransom to release it.
Legal Basis and Enforcement
These guidelines are issued under Section 18 of the Consumer Protection Act, 2019, and supplement existing laws including:
- Information Technology Act, 2000
- Consumer Protection (E-Commerce) Rules, 2020
- Guidelines for Prevention of Misleading Advertisements and Endorsements for Misleading Advertisements, 2022
Violators may face investigation and penal action by the CCPA. Moreover, consumers retain the right to file complaints before appropriate Consumer Dispute Redressal Commissions.
Conclusion
The Dark Patterns Guidelines, 2023, represent a major step in securing digital consumer rights in India. As businesses continue to evolve online, it is essential that consumer autonomy is respected and transparency is maintained in all user interactions.
For consumers, being vigilant is key — always review terms, validate charges at checkout, and question unnecessary data requests.
For businesses, compliance is not just a legal obligation but also a mark of ethical digital conduct.
Have You Experienced a Dark Pattern?
If you believe you’ve been misled online, you can file a complaint by calling the National Consumer Helpline (NCH) or online at https://consumerhelpline.gov.in.