Regulating Dark Patterns in E-Commerce: A Comparative Legal Analysis of India and the European Union
Abstract
This research paper critically examines the growing phenomenon of deceptive design interfaces, known as “Dark Patterns,” which deliberately exploit cognitive biases to influence consumer behaviour and undermine autonomy, transparency and informed decision-making in digital transactions. The paper explores how the European Union has created a thorough regulatory response by enacting laws such as the Unfair Commercial Practices Directive and the Digital Services Act, which specifically address deceptive design practices and impose accountability requirements on online platforms. India’s approach, on the other hand, remains ambiguous and fragmented, relying primarily on general provisions found in the Consumer Protection Act, 2019, the E-Commerce Rules, 2020 and the Information Technology Act, 2000; it lacks a specific legal framework to effectively identify, classify or punish dark patterns. The research notes that comparative studies evaluating how different legal systems interpret and regulate such misleading online tactics are conspicuously lacking. Using a doctrinal methodology and the comparative method, which examines the roles of state and non-state actors, the functions of the legal systems and the regulatory mechanisms employed, the study methodically examines and contrasts the effectiveness and enforcement of current frameworks in India and the EU. Through this comparative analysis, the study aims to give India a rational and adaptable regulatory roadmap that integrates global best practices, ensures better consumer protection and balances market innovation with ethically sound digital design standards in the ever-changing e-commerce ecosystem.
Key words: Dark Patterns, E-Commerce, Consumer Protection, Digital Regulation.
Introduction
The relationship between businesses and consumers has been fundamentally transformed by the digital economy. Although e-commerce platforms provide unprecedented convenience and choice, they have also developed sophisticated techniques aimed at maximizing profit at the expense of consumer welfare. At the core of this problem is the use of “Dark Patterns”: user interface designs intended to deceive or influence users into making choices they otherwise would not have made. These patterns exploit consumers’ cognitive biases and bounded rationality to produce unwanted actions, such as unnecessary purchases or the involuntary disclosure of personal data[1]. Going beyond conventional aggressive marketing, the concept of “dark patterns” draws on behavioural economics, specifically the distinction between “System 1” (rapid, intuitive) and “System 2” (slow, deliberative) thinking, to tempt users into rapid, ill-considered choices. Common examples include “sneak into basket,” “forced continuity,” “confirm-shaming,” and “roach motels”[2]. While pre-checked checkboxes on travel websites remain a prevalent nuisance, deceptive currency displays have been documented as empirical instances in the European Union.
The regulatory response to this phenomenon has varied globally. The EU has emerged as a worldwide leader, establishing a comprehensive legal net that intertwines consumer protection, data privacy and competition law to safeguard “consumer autonomy”. In contrast, sectoral regulators in India, a country with a fast-expanding digital market, are struggling with an “uncoordinated regulatory response” that fails to harmonize properly.
Methodology
The objective of this research paper is to present a detailed comparative study of India and the European Union. The main goals are to analyse the regulatory frameworks by breaking down the legal architectures in the EU (the Digital Services Act, the Unfair Commercial Practices Directive (UCPD) and the GDPR) and India (the Consumer Protection Act, 2019 and the CCPA Guidelines), as well as to investigate dark patterns in order to understand their theoretical foundations and manifestations in digital interfaces. To assess the efficacy of the proactive EU approach versus the reactive Indian one, the study compares jurisdictional outcomes while pointing out important differences and gaps, such as the absence of definitions and empirical data that impede enforcement in India. It concludes with policy recommendations offering practical methods for strengthening the Indian framework. To accomplish these goals, the study uses a doctrinal research technique that relies on a methodical examination of key legal texts and legislation. This is complemented by an examination of secondary sources, such as reports, legal commentary and scholarly literature, to offer a thorough assessment of regulatory effectiveness in both jurisdictions.
Literature Review
Understanding the nature of the harm is a prerequisite to comprehending the regulatory difficulty. The literature frames the dark pattern issue with a complex tapestry of definitions, taxonomies and criticisms. It is widely acknowledged in the literature that dark patterns operate by taking advantage of cognitive weaknesses. Given the susceptibility of low-income and low-literacy users in emerging nations like India, Thukral et al. stress that market competition alone is insufficient to stop these tendencies. They advocate for “culturally contextualized control” as a means of separating benign customization from manipulative design.[3]
Brenncke advances the conversation by defining “autonomy” as the normative basis for regulation. He contends that rather than merely stopping dishonesty, the main objective of regulation should be to support informed and independent decision-making, and he offers a taxonomy of six autonomy violations: undercutting required knowledge, manipulation, deceit, construction of unreflective agreements, negative frictions and non-neutral presentation of choice. This taxonomy is important because it shifts the legal test towards “regulating for autonomy” rather than the conventional “average consumer” standard, which frequently fails to protect biased or disadvantaged customers.[4]
Research on the Indian landscape reveals an alarming lack of empirical depth. Thukral et al. point out that there has been little empirical research on the incidence and effects of dark patterns in India, especially in banking and e-commerce[5]. India’s framework is “evolving and fragmented,” based on a patchwork of the 2023 CCPA Guidelines and the Consumer Protection Act, 2019.[6] The research identifies a particular “normative gap” in India: the legal system prioritizes “unfairness and deception” over “consumer autonomy” or cognitive vulnerability. This is supported by Viswanathan et al. (2023), who note that although the CCPA Guidelines are a step in the right direction, they have unclear repercussions.[7] Despite their noble intentions, Raj and Gupta contend, the rules fail to evaluate institutional preparedness and lack “legislative coherence and enforcement clarity.”[8]
In contrast, the EU literature depicts a robust, multi-layered defence structure. Santos et al. describe a “legal patchwork” of the GDPR, the UCPD, the DSA and the Digital Markets Act (DMA).[9] Although this is often a strength, Santos points out that it sometimes causes conceptual disputes because the various instruments define harm differently, such as “damage” in the GDPR and “impairment of autonomy” in the DSA.[10] The EU specifically prohibits manipulative designs that breach free consent, emphasizing prevention. According to the literature, the EU’s strategy essentially “rephrases” consumer protection as “regulating for autonomy,” a notion that is largely lacking in Indian law.[11]
Regulatory Frameworks: A Deep Dive
A fundamental change in legal paradigms is necessary to regulate dark patterns. Traditional contract law, which frequently assumes equal negotiating power and rational agency, must give way to a more sophisticated behavioural approach that recognizes the structural power imbalance present in digital ecosystems. In addition to outlining forbidden behaviour, a strong regulatory framework must set up enforcement procedures that can keep up with quickly changing A/B testing algorithms and interface designs. This part critically examines the legal frameworks of the European Union and India and evaluates their differing abilities to prevent manipulative design. It compares their normative underpinnings, examining whether they place more emphasis on ex-post enforcement (corrective action taken after consumer damage has occurred) or ex-ante regulation (preventative measures akin to “safety by design”). By breaking down specific legislation and recommendations, this research attempts to show how the different legal philosophies, focusing on “consumer autonomy” in the EU versus “prevention of unfairness” in India, translate into concrete regulatory results and differing degrees of consumer protection.
The European Union: The Autonomy Model
The “proactive” nature of the EU’s regulatory system and its clear acknowledgement of consumer autonomy as a protected legal right are its defining characteristics. The pinnacle of the EU’s recent initiatives is the Digital Services Act (DSA), which specifically addresses misleading design techniques and imposes accountability standards on online platforms. It expressly forbids providers from designing or operating interfaces in a way that misleads users or significantly impairs their capacity to make free and informed decisions. This is complemented by the Unfair Commercial Practices Directive (UCPD), a cornerstone law criticized by academics for depending too much on the “average consumer” test, a standard that presumes a degree of rationality that dark patterns are designed to undermine.[12] Additionally, the GDPR establishes that dark patterns that deceive users into giving up their data violate the standards for “freely given, specific, informed and unambiguous” consent, thereby integrating data privacy directly into consumer protection. Future laws such as the Data Act and the AI Act, which seek to further tighten the net around deceptive algorithmic techniques, will develop the framework further.
India: The Deception Model
India’s regulatory strategy is now shifting from a “reactive” stance to a more controlled one, although it remains primarily focused on stopping “unfair trade practices” rather than preserving autonomy. The broad legal foundation for consumer rights is provided by the Consumer Protection Act, 2019, which defines “unfair trade practices,” even though digital behavioural manipulation was not initially contemplated. India’s most direct attempt to control the problem is the 2023 CCPA Guidelines on Dark Patterns, which specify and outlaw particular practices including basket sneaking and false urgency. However, Mamidwar and Bhutkar point out that these rules lack strong enforcement mechanisms and are enforced less frequently than in Western systems.[13] With regulatory authority divided between the Competition Commission of India, MeitY and the CCPA, the situation is made even more complex by fragmentation. This leads to an incoherent regulatory response in which sectoral regulators interact without successfully harmonising, exposing the absence of a cohesive approach.[14]
Comparative Jurisdictional Outcomes
When the regulatory environments of the European Union and India are critically compared, significant differences emerge in institutional preparedness, legal maturity and enforcement philosophy. Although the ultimate goal of both jurisdictions is to protect consumer welfare in the digital era, their approaches are very different. The European approach, which views dark patterns as a systemic threat to market integrity and democratic ideals, exemplifies a unified, proactive policy that combines data protection with consumer rights. The Indian example, on the contrary, illustrates how an emerging digital economy struggles to strike a balance between the need for consumer protections and the quick adoption of new technologies, frequently leading to a patchwork of laws that treats symptoms rather than the underlying problems. This comparative study demonstrates how ingrained normative differences result in observable variations in consumer protection outcomes, ultimately establishing whether the legislation protects consumers or merely acts as a corporate compliance checklist.
Regulatory Philosophy: Autonomy vs. Fairness: The most striking distinction between the two systems lies in their normative foundations, which influence the definition of a violation itself. The EU framework is clearly “autonomy-driven,” and EU consumer law has evolved to protect “independent consumers” rather than only the theoretical “rational actor.”[15] This approach gives priority to the user’s freedom of choice, recognizing that digital designs can bypass rational safeguards. As a result, European jurisprudence lowers the bar for intervention by permitting the regulation of design decisions that may not be factually false but are behaviourally manipulative. The Indian perspective, however, places more emphasis on “unfairness and deception.” India lacks a clear and explicit autonomy-based framework and instead focuses on whether the consumer was deceived rather than whether their will was purposefully overridden.[16] The EU model focuses on the broader “impairment of decision-making,” which enables regulators to target subtle nudges that undermine consumer agency without necessarily lying to them. In India, however, the burden of proof frequently rests on proving that a specific “unfair trade practice” occurred.
Enforcement and Efficacy: The practical impact of the legislation is determined by the major differences in enforcement capacities and effectiveness between the two jurisdictions. The EU has robust enforcement mechanisms characterised by unambiguous restrictions and the ability to impose large fines of up to 6% of worldwide turnover for Very Large Online Platforms (VLOPs), especially under the DSA’s mechanism for non-compliance. This credible deterrent compels companies to adopt “safety by design” concepts. In contrast, India’s approach is described as complaint-driven and reactive. Insufficient institutional capacity and digital literacy have resulted in inefficient enforcement tactics, with regulators frequently waiting for a build-up of customer complaints before taking action.[17] Furthermore, there is a dearth of factual evidence demonstrating the scope of dark patterns in India, in contrast to the EU, where case studies and enforcement precedents are more established through organisations like the European Commission and national data protection agencies. This lack of data makes it difficult to pinpoint the precise extent of infringement or to support proactive audits, leaving Indian customers exposed to frequent, deceptive manipulation techniques that are missed by the monitoring mechanisms in place.
Scope of Protection: Ultimately, due to the way their legal systems are structured, the two models provide wildly disparate degrees of protection. The EU employs a comprehensive framework that combines horizontal regulation (consumer protection under the UCPD) with vertical regulation (data privacy under the GDPR and digital services under the DSA) to create a strict regulatory net. This all-encompassing strategy guarantees that if a deceptive practice manages to evade consumer legislation, it is frequently caught by data protection law, resulting in a “pincer movement” on malicious practices. India’s response is more disjointed; while trade practices are covered by the Consumer Protection Act and data issues are addressed by the Digital Personal Data Protection Act, there is still a significant regulatory gap between the two laws. Because dark patterns frequently emerge at the intersection of consumer fraud and privacy violation, this fragmentation makes a holistic strategy for digital consumer protection impossible. Due to a lack of inter-agency cooperation, a user interface design in India may be legally permissible under the IT Act yet violate the spirit of consumer protection, opening doors for IT companies to exploit.
Identifying Gaps and Divergences
Comparing the regulatory environments of the EU and India reveals theoretical and structural flaws that extend beyond simple disparities in legislation. India’s framework is characterised by “regulatory silos” that frequently fail to interact, while the EU functions within a “holistic ecosystem” where data protection, competition legislation and consumer rights mutually reinforce one another. This discrepancy goes to the heart of legal efficacy rather than being merely an administrative efficiency issue. The lack of a cohesive theory of damage in India, particularly one that distinguishes classic fraud from cognitive manipulation, fosters an atmosphere favourable to IT companies. Moreover, due to a lack of regional study, Indian policy is frequently imported from the West without taking into consideration the unique digital literacy levels and cultural particularities of the Indian population. As a result, the gaps listed below are current weaknesses that jeopardize the implementation of the 2023 Guidelines and the larger consumer protection mandate, rather than merely academic issues.
The Empirical Gap
The glaring disparity in empirical evidence between the two jurisdictions is a recurrent and important theme in the literature. Strong, government-funded Human-Computer Interaction (HCI) research and market studies that quantify consumer harm are often the driving forces behind legislation in the European Union. For example, research on “cookie banners” and “consent fatigue” directly influenced the DSA and GDPR. The presence and effects of dark patterns in India, on the contrary, have not been “extensively empirically studied,” creating a gap in localized evidence[18]. India does not have a “systematic database or quantitative evaluation of infractions”. Instead of extensive, long-term surveys or well-controlled A/B testing procedures, the majority of Indian research depends on “descriptive case studies and secondary sources” or anecdotal information from social media[19]. Regulatory capacity is severely hampered by the lack of independent data; authorities are unable to determine suitable fines or demonstrate “material distortion” of consumer behaviour without knowing the exact conversion rates of manipulative designs versus neutral ones in the Indian environment. Because Indian digital customers, who are frequently mobile-first and linguistically varied, may be more susceptible to certain patterns like “false urgency” or “forced action” than their European counterparts, the dependence on Western research is problematic.
The Definitional and Taxonomy Gap
The Indian judicial system has a significant “conceptual deficit” in differentiating between aggressive marketing (persuasion) and manipulative design (dark patterns). India has not yet fully addressed these subtleties, even as the EU is now engaged in complex discussions on definitions of “harm,” differentiating, for instance, between the concrete “damage” necessary for GDPR compensation and the abstract “loss of autonomy” forbidden by the DSA. The 2023 CCPA Guidelines are criticised for depending on “ambiguous definitions” that would not withstand close legal examination[20]. The guidelines, for instance, list some patterns (such as “basket sneaking”) but do not provide a strong, principle-based definition of “manipulation” that can identify new, unlisted tactics. Even within the EU there are “terminological disagreements,” but the EU’s shift to an “autonomy-based” norm offers a flexible safety net[21]. India’s existing system is “list-based” and static; even though a dark pattern is damaging, it can be acceptable if it does not cleanly fit into one of the 13 designated categories. This gap is made worse by the lack of “culturally contextualised control,” which makes it hard to distinguish between “benign customization” (assisting a user) and “manipulation” (exploiting a user) in Indian law[22].
The Institutional Gap
The “uncoordinated” character of India’s regulatory response, in contrast to the EU’s increasingly centralised enforcement, is perhaps the most crippling disparity. To ensure that data protection authorities and consumer protection agencies work together, the EU has set up clear hierarchies and cooperative structures, such as the Digital Services Coordinators under the DSA. India, however, has disjointed authority. The CCPA, the CCI and MeitY are India’s sectoral regulators, and they currently function independently, despite the support for regulatory frameworks akin to those of the EU and FTC[23]. A single occurrence of a dark pattern, such as a difficult cancellation procedure, may simultaneously constitute an abuse of market dominance (CCI), a data privacy violation (DPDPA) and a violation of consumer rights (CCPA). In the absence of an “interagency coordination” system, this results in regulatory arbitrage, where businesses take advantage of jurisdictional ambiguity to postpone compliance. “Weak enforcement mechanisms,” where legitimate complaints essentially slip through the cracks between ministries, are the result of this lack of institutional capacity and clearly defined authority[24].
Policy Recommendations
In order to improve consumer protection in India, this paper makes the following policy proposals based on the lessons learned from the EU and the deficiencies found in the Indian framework.
Adopting an Autonomy-Based Standard: India ought to embrace “consumer autonomy” as a protected legal right in place of the conventional “unfair trade practice” criterion. Similar to the normative basis of the EU’s Digital Services Act, the Consumer Protection Act, 2019 should be amended, or construed by the judiciary, in a way that expressly recognizes “loss of autonomy” and “cognitive manipulation” as separate types of consumer damage. At the moment, deception, whether a customer was misled, is the main emphasis of Indian law. Under an autonomy-based norm, regulators could punish interface designs that are technically accurate but behaviourally manipulative, such as “nagging” or “confirm-shaming,” which weaken a user’s resistance without providing misleading information. Regulating for autonomy ensures that the law protects actual, biased people rather than hypothetical rational agents who are impervious to psychological prodding[25]. This links protection with empowerment. The change would provide a legal foundation for contesting sophisticated A/B-tested designs that take advantage of System 1 thinking, even if they do not precisely qualify as fraud.
Harmonization of Regulatory Bodies: The Central Consumer Protection Authority, the Ministry of Electronics and Information Technology and the CCI need to work much more closely together to address the “fragmented” character of Indian regulation. To guarantee a consistent regulatory approach, the study suggests creating a permanent “Digital Markets Task Force” or an inter-ministerial group with members from the CCPA, CCI and data protection agencies. It is emphasized that such harmonization would successfully avoid regulatory overlaps and unclear enforcement[26]. For instance, a single dark pattern such as “forced continuity” frequently breaches data consent requirements (MeitY), undermines market competition (CCI) and violates consumer rights (CCPA). Without a cohesive approach, digital companies may engage in regulatory arbitrage, taking advantage of the agencies’ lack of communication to postpone compliance or pay a small penalty in one silo while ignoring the others.
Evidence-Based Policymaking and Audits: India must ensure that rules are based on local facts rather than merely copied notions in order to narrow the current empirical gap. The government should commission large-scale surveys and controlled trials to precisely map the prevalence of dark patterns in the Indian fintech and e-commerce sectors and to understand how particular demographics, such as first-time internet users in rural areas, are especially vulnerable. Additionally, it is crucial to create mandatory “dark pattern audits” for major e-commerce platforms, comparable to the requirements for Very Large Online Platforms (VLOPs) under the EU’s DSA. These audits must be more than just checklists; they ought to be thorough analyses of user interface (UI) outcomes, evaluating the conversion rates of neutral versus manipulative designs. In the absence of hard data indicating user susceptibility and the real economic effect of these designs, enforcement actions will be oblivious to the true scope of the issue and lack the evidential weight to stand up in court[27].
Bright-Line Rules and “Safety by Design”: Recognizing that voluntary recommendations are frequently insufficient to stop profit-driven manipulation, India must create “bright-line standards”: unambiguous, non-negotiable regulations, instead of depending on nebulous notions of fairness. As demonstrated by the EU’s recent enforcement efforts, this means turning the 2023 Guidelines into legally enforceable rules with specific, forbidden UI features, such as a total prohibition on pre-checked boxes for additional services or “roach motel” cancellation procedures. Furthermore, before digital interfaces are made available to the public, they should be evaluated for non-manipulation in accordance with “Safety by Design” principles. This proactive approach radically changes the incentive structure for UI/UX designers by shifting the burden of proof from the user, who currently must prove they were duped, to the platform, which must prove its design is neutral and fair.
Global Cooperation: Lastly, India should not operate in isolation, given that the digital economy is global and that many of the offending platforms are international corporations. To standardise definitions and enforcement tactics across countries, the study suggests that India take an active role in developing a “worldwide framework” or “Digital Fairness Charter”. A global authority or standardised framework is necessary to stop multinational tech companies from engaging in regulatory arbitrage: offering EU citizens high-protection interfaces to comply with the DSA while using exploitative dark patterns in developing markets like India with lax enforcement[28]. By harmonizing its rules with international best practices, India can use worldwide evidence and pressure to hold multinational IT businesses accountable for their actions within its borders.
Conclusion
The comparative study shows that the maturity and philosophy of digital regulation differ significantly between the EU and India. With the help of a network of mutually reinforcing laws such as the DSA and GDPR, the EU has effectively constructed a “thorough regulatory response” that prioritizes consumer autonomy. India is still in a “catch-up” phase despite making impressive progress with the 2019 Consumer Protection Act and the 2023 Guidelines. This is due to a lack of empirical evidence, fragmented institutional authority, and a reluctance to completely adopt autonomy-based jurisprudence.
The “dark pattern” is more than just a design annoyance: it is a structural danger to the digital economy that challenges the idea of free markets by eliminating customer choice. The literature suggests that these tactics reduce consumer autonomy and justice, particularly for underrepresented populations. India has to change from a reactive to a proactive approach in order to adequately protect its digital citizens. This calls for a paradigm change as well as new legislation, seeing the consumer not just as a target of deceit but also as an independent actor whose freedom of choice must be forcefully defended against the sophisticated computational manipulations of the twenty-first century.
India can close the gap with the EU and create a digital marketplace that is not only profitable but also moral, open and genuinely free by implementing the suggestions made in this paper, particularly the transition to autonomy-based regulation, the harmonization of agencies and the dedication to empirical auditing.
[1]Thukral, A., Aggarwal, N., Saini, C. P., & Kapoor, P., “Fake reviews as dark patterns: A nudge theory perspective using the SOR framework,” Interdisciplinary Journal of Information, Knowledge, and Management, Vol. 20, Article 33, 2025, pp. 1–17.
[2] Ibid.
[3] Ibid.
[4] Martin Brenncke, “Drafting Guidelines for Prevention and Regulation of Dark Patterns, Department of Consumer Affairs, Government of India – Written evidence submitted by Martin Brenncke” (2024) SSRN.
[5]Thukral, A., Aggarwal, N., Saini, C. P., & Kapoor, P., “Fake reviews as dark patterns: A nudge theory perspective using the SOR framework,” Interdisciplinary Journal of Information, Knowledge, and Management, Vol. 20, Article 33, 2025, pp. 1–17.
[6] James Baumeister, Ji-Young Park, Andrew Cunningham, Stewart Von Itzstein, Ian Gwilt, Aaron Davis and James Walsh, “Pattern in the Dark: A Report to the Data Standards Chair” (Australian Research Centre for Interactive and Virtual Environments, University of South Australia, 2024).
[7]Namita Viswanath, Sana Khan, Aditya G., and Himangini Mishra, “Shedding Light on Dark Pattern Regulations in India” Induslaw (December 2023).
[8]Akhil Raj & Ekta Gupta, “Illuminating the Shadows in India’s Dark Pattern Guidelines: A Flawed Regulatory Attempt,” (2024) National Law University Odisha.
[9]Cristiana Santos, Viktorija Morozovaite, and Silvia De Conca, “No harm, no foul: how harms caused by dark patterns are conceptualised and tackled under EU data protection, consumer and competition laws” Working Paper 2-2024, Department of International and European Law, Utrecht University and Vrije Universiteit Amsterdam (2024).
[10] Ibid.
[11]Johanna Gunawan, Cristiana Santos & Irene Kamara, “Redress for Dark Patterns Privacy Harms? A Case Study on Consent Interactions,” CSLAW ’22 Proceedings of the 2022 Symposium on Computer Science and Law, ACM, Washington DC, 2022.
[12] Martin Brenncke, “Drafting Guidelines for Prevention and Regulation of Dark Patterns, Department of Consumer Affairs, Government of India – Written evidence submitted by Martin Brenncke” (2024) SSRN.
[13]Aryan Mamidwar and Ganesh Bhutkar, “An Overview of Guidelines on Dark Patterns” CEUR Workshop Proceedings, CHI24 Mobilizing Research and Regulatory Action on Dark Patterns and Deceptive Design Practices, Honolulu, HI, USA, May 2024.
[14]Beni Chugh and Pranjal Jain, “Unpacking Dark Patterns: understanding Dark Patterns and their Implications for Consumer Protection in the Digital Economy” 7(1) RGNUL Student Research Review 2021.
[15] Martin Brenncke, “Drafting Guidelines for Prevention and Regulation of Dark Patterns, Department of Consumer Affairs, Government of India – Written evidence submitted by Martin Brenncke” (2024) SSRN.
[16] James Baumeister, Ji-Young Park, Andrew Cunningham, Stewart Von Itzstein, Ian Gwilt, Aaron Davis and James Walsh, “Pattern in the Dark: A Report to the Data Standards Chair” (Australian Research Centre for Interactive and Virtual Environments, University of South Australia, 2024).
[17]Prithivi Raj, Soubhagya Sundar Nanda, and Murtaza S. Noorani, “Safeguarding the Digital Consumer: A Comparative Legal and Psychological Analysis of Dark Patterns in E-Commerce” 2(4) Advances in Consumer Research 3567-3574 (2025).
[18]Thukral, A., Aggarwal, N., Saini, C. P., & Kapoor, P., “Fake reviews as dark patterns: A nudge theory perspective using the SOR framework,” Interdisciplinary Journal of Information, Knowledge, and Management, Vol. 20, Article 33, 2025, pp. 1–17.
[19] James Baumeister, Ji-Young Park, Andrew Cunningham, Stewart Von Itzstein, Ian Gwilt, Aaron Davis and James Walsh, “Pattern in the Dark: A Report to the Data Standards Chair” (Australian Research Centre for Interactive and Virtual Environments, University of South Australia, 2024).
[20]Namita Viswanath, Sana Khan, Aditya G., and Himangini Mishra, “Shedding Light on Dark Pattern Regulations in India” Induslaw (December 2023).
[21]Cristiana Santos, Viktorija Morozovaite, and Silvia De Conca, “No harm, no foul: how harms caused by dark patterns are conceptualised and tackled under EU data protection, consumer and competition laws” Working Paper 2-2024, Department of International and European Law, Utrecht University and Vrije Universiteit Amsterdam (2024).
[22]Thukral, A., Aggarwal, N., Saini, C. P., & Kapoor, P., “Fake reviews as dark patterns: A nudge theory perspective using the SOR framework,” Interdisciplinary Journal of Information, Knowledge, and Management, Vol. 20, Article 33, 2025, pp. 1–17.
[23] Ibid.
[24]Prithivi Raj, Soubhagya Sundar Nanda, and Murtaza S. Noorani, “Safeguarding the Digital Consumer: A Comparative Legal and Psychological Analysis of Dark Patterns in E-Commerce” 2(4) Advances in Consumer Research 3567-3574 (2025).
[25] Martin Brenncke, “Drafting Guidelines for Prevention and Regulation of Dark Patterns, Department of Consumer Affairs, Government of India – Written evidence submitted by Martin Brenncke” (2024).
[26] James Baumeister, Ji-Young Park, Andrew Cunningham, Stewart Von Itzstein, Ian Gwilt, Aaron Davis and James Walsh, “Pattern in the Dark: A Report to the Data Standards Chair” (Australian Research Centre for Interactive and Virtual Environments, University of South Australia, 2024).
[27]Thukral, A., Aggarwal, N., Saini, C. P., & Kapoor, P., “Fake reviews as dark patterns: A nudge theory perspective using the SOR framework,” Interdisciplinary Journal of Information, Knowledge, and Management, Vol. 20, Article 33, 2025, pp. 1–17.
[28]Aryan Mamidwar and Ganesh Bhutkar, “An Overview of Guidelines on Dark Patterns” CEUR Workshop Proceedings, CHI24 Mobilizing Research and Regulatory Action on Dark Patterns and Deceptive Design Practices, Honolulu, HI, USA, May 2024.


