Author : Tanusha Tyagi

Expert Speak Digital Frontiers
Published on Feb 28, 2025

The emerging tactics of digital manipulation in the mobility sector need to be addressed with concrete measures

Tackling dark patterns in mobility apps


With the advent of ride-hailing apps in India, urban mobility has transformed significantly. These apps provide on-demand services that connect riders with drivers who use their personal vehicles commercially. They have disrupted the traditional taxi industry by offering customers more convenient, efficient, and often cost-effective transportation alternatives. However, the growth trajectory of these apps is closely linked to their concerning use of 'dark patterns', or deceptive design strategies that manipulate user behaviour.

Reports allege that Uber manipulates driver behaviour to keep drivers engaged on the platform using psychological tricks such as presenting ‘unworked time’ as 'financial losses rather than potential gains.’ Similarly, these apps make cancelling their membership deliberately difficult, requiring users to navigate multiple steps and misleading prompts, discouraging them from opting out easily.


This article assesses how dark patterns are increasingly being deployed in ride-hailing apps, examines the legal framework governing them, and analyses their potential implications.

Growing use of dark patterns in mobility apps

Dark patterns are dishonest design practices that use User Interface (UI)/User Experience (UX) interactions on a platform to deceive users into doing something they did not intend or want to do. These practices undermine or impair the user's autonomy, decision-making, or choice. In recent times, there have been growing concerns about the use of dark patterns in mobility apps.

A report by the Advertising Standards Council of India (ASCI) analysed 53 popular Indian apps across various sectors, including mobility, and found that 98 percent of them employed at least one deceptive pattern, with an average of 2.7 such patterns per app. Ride-hailing apps averaged three instances of deceptive patterns each. Similarly, a recent survey of over 33,000 users pointed to alarming trends on ride-hailing platforms: 42 percent of users reported hidden charges, 84 percent faced forced cancellations, and 78 percent encountered misleading wait times.

Regulatory framework to tackle dark patterns

With the growing incidence of dark patterns in mobility apps, it becomes essential to examine the existing regulatory framework and understand how different jurisdictions approach the issue of dark patterns. In India, the Consumer Protection Act, 2019 (CPA), and the Guidelines for Prevention and Regulation of Dark Patterns, 2023 (the Guidelines) provide the legal framework to prevent the use of dark patterns in the digital economy. The Guidelines prohibit any person from engaging in the dark pattern practices listed in their Annexure, which identifies 13 dark patterns, subject to modification by the Central Consumer Protection Authority (CCPA). Amongst these, the most prevalent are basket sneaking, forced action, and false urgency.


However, the Guidelines are directives rather than legally binding regulations, which makes enforcement difficult. Without statutory penalties in place, platforms engaging in dark patterns cannot be penalised on the basis of the Guidelines alone.

On the other hand, the European Data Protection Board (EDPB) issued the Guidelines on Dark Patterns in Social Media Platform Interfaces in 2022, offering practical recommendations for assessing dark patterns on social media platforms. These guidelines identify existing dark patterns and provide recommendations for assessing and avoiding them. The recently implemented Digital Services Act (DSA) specifically targets dark patterns online: Article 25 of the DSA prohibits online platforms from using deceptive patterns.

Meanwhile, in the United States, the approach to dark patterns remains less comprehensive. Although no federal law currently targets dark patterns directly, in 2022 the Federal Trade Commission (FTC) published a detailed report titled Bringing Dark Patterns to Light, confirming that dark patterns are 'squarely on the FTC's radar'. Additionally, the California Privacy Rights Act (CPRA), the Colorado Privacy Act (CPA), and the Connecticut Data Privacy Act all exclude agreements obtained through dark patterns from the definition of valid consent. Failure to comply with these regulations carries severe penalties.

Understanding dark patterns in mobility apps

The rise in the usage of dark patterns in ride-hailing apps is making it increasingly difficult for users to navigate their services fairly. Table 1 analyses the practices adopted by mobility apps vis-à-vis the Guidelines to demonstrate the existence of dark patterns on such platforms.

Table 1: Dark patterns observed in mobility apps

Dark Pattern: Basket Sneaking
Description: Adding unwanted items to a user's online shopping cart without their explicit consent.
Applicability in mobility apps: When booking a ride on Uber, users may notice that a small insurance fee (around INR 3 per trip) is automatically added to the fare without explicit consent. While this charge is mentioned in Uber's policy, it is often pre-selected, meaning users must manually opt out if they do not wish to pay for it. This subtle addition of a cost without proactive user agreement is an example of basket sneaking, as users may unknowingly pay for a service they did not deliberately choose. Similarly, various Ola users have shared experiences online of the app enabling ride insurance by default without asking for their consent.

Dark Pattern: Subscription Trap
Description: Enrolling users into recurring services or ride passes with hidden terms and difficult cancellation processes.
Applicability in mobility apps: An Uber customer recently described in a post how the app made it complicated for him to cancel his Uber One membership. The cancellation process was deliberately made technical and cumbersome to lock users into the subscription. This is a classic subscription trap: a recurring service that users cannot easily exit.

Dark Pattern: Interface Interference
Description: Complex or vague user interface and design elements that steer the user away from critical information or toward undesired actions.
Applicability in mobility apps: Mobility apps like Ola and Uber make refunds and issue reporting complex through multiple steps, vague policies, and limited user control. Refund options are often buried behind multiple menu layers, making them inconvenient for users to access. Ola, for instance, does not refund the fare amount on cancellation but instead provides the user with a coupon code. Various users have also described on LinkedIn how cancelling a ride on these apps is hidden under three layers of options and is a cumbersome process.

Dark Pattern: Drip Pricing
Description: Extra costs such as convenience fees, service charges, or wait-time surcharges are gradually revealed, ultimately leading to a final fare that significantly exceeds the upfront price.
Applicability in mobility apps: A survey found that consumers reported this pattern across almost all mobility apps, including BluSmart, InDrive, Rapido, Ola, and Uber; around 40 percent of app-based taxi users have faced this issue. Several Ola customers have shared their ordeals online, explaining how the price displayed at the time of booking was much lower than the final amount they were charged.

Source: Compiled by the author.
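The mechanics behind two of the patterns above can be made concrete in code. The following sketch of a hypothetical booking flow (all function names, fees, and amounts are invented for illustration, not drawn from any real app) shows how a pre-selected add-on (basket sneaking) and surcharges revealed only after booking (drip pricing) push the final charge beyond the upfront quote:

```python
# Illustrative sketch of basket sneaking and drip pricing in a
# hypothetical ride-booking flow. All fees and names are invented.

def quoted_fare(base_fare: float) -> float:
    """The upfront price shown on the booking screen."""
    return base_fare

def final_charge(base_fare: float, insurance_opted_out: bool = False) -> float:
    """The amount actually billed at the end of the trip."""
    total = base_fare
    # Basket sneaking: the insurance add-on defaults to ON;
    # the user must actively opt out to avoid paying for it.
    if not insurance_opted_out:
        total += 3.0               # e.g. INR 3 ride insurance, pre-selected
    # Drip pricing: surcharges revealed only after the ride is booked.
    total += 0.05 * base_fare      # "convenience fee"
    total += 10.0                  # wait-time surcharge
    return total

shown = quoted_fare(200.0)    # the user sees INR 200 when booking
billed = final_charge(200.0)  # the user is ultimately billed INR 223
print(shown, billed)
```

A consent-respecting design would invert the default (insurance off unless the user selects it) and fold all mandatory fees into the fare quoted before booking.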

Aside from the aforementioned deceptive designs, further potential dark patterns have been identified. When mobility apps show different fares to different customers through different interfaces, the practice is termed differential pricing. This potentially deceptive mechanism surfaced recently when Ola and Uber were accused of it: commuters noticed that fares for identical routes and timings differed depending on whether the booking was made on an iPhone or an Android device.

Way forward

Recent regulatory actions, such as the CCPA's intervention against Ola and Uber, signal growing concern over emerging tactics of digital manipulation in the mobility sector. These cases show that new dark pattern designs can slip through the cracks of traditional definitions of deceptive design. Thus, while such practices might not fall under the classic list of dark patterns, they underscore the urgent need for stronger regulations.


The current guidelines lack the enforceability required to hold these platforms accountable, leaving consumers vulnerable to manipulative practices. For India to keep pace with evolving technology, it must embed adaptable and forward-looking principles into its existing laws. This will help build a regulatory framework that not only protects consumer rights today but also remains effective in the face of future digital challenges.

Additionally, it is equally crucial to conduct thorough market studies that shed light on these deceptive practices. By gathering and analysing real-time data on the prevalence and damage of such dark patterns—like drip pricing or subscription traps—researchers can provide the solid evidence policymakers need to take meaningful and targeted action. This evidence-based approach is essential for creating effective and fair regulations to foster a digital environment where users are respected and market practices are transparent.


Tanusha Tyagi is a Research Assistant with the Centre for Digital Societies at the Observer Research Foundation

The views expressed above belong to the author(s).