Deceptive Design Patterns (DDPs), or “dark patterns”, are often found on websites and apps. They “nudge” consumers into making decisions that may not be in their interests. Examples include additional charges appearing at the end of a transaction, after a person has already committed time to the purchase process, or users being forced to provide payment details for a free trial that converts to a paid subscription without clear reminders or consent.
Regulators around the world have been taking a keen interest in DDPs, and new laws have followed, including the UK’s Digital Markets, Competition and Consumers Act 2024 and the EU’s Digital Services Act. Last year, the UK’s Information Commissioner’s Office (ICO) and the Competition and Markets Authority (CMA) issued a joint report calling on businesses to stop using harmful website designs and warning that the two regulators were working together to “stop harmful design practices”. The CMA has also been carrying out its online choice architecture project, including enforcement action against Emma Sleep, Simba Sleep and Wowcher.
The Global Privacy Enforcement Network (GPEN) was established in 2010 following a recommendation by the Organisation for Economic Co-operation and Development (OECD). Its aim is to foster cross-border cooperation among privacy regulators, and its members work together to strengthen personal privacy protections in a global context. The informal network comprises over 80 privacy enforcement authorities from around the world.
GPEN recently carried out a study on the use of DDPs on regularly visited apps and websites and has now published its results.
Summary of findings:
The study aimed to replicate the user experience: GPEN “sweepers” engaged with websites and apps to assess how easily users could make privacy choices, obtain privacy information, and log out of or delete an account. They evaluated the sites and apps against five indicators identified by the OECD as characteristic of deceptive design patterns.
For each indicator, the GPEN report found:
- Complex and confusing language: More than 89% of privacy policies were found to be long or to use complex language suited to readers with a university education.
- Interface interference: 42% of the websites and apps swept used emotionally charged language to influence users’ privacy choices, while 57% made the least privacy-protective option the most obvious and easiest to select.
- Nagging: 35% of websites and apps repeatedly asked users to reconsider their intention to delete their account.
- Obstruction: In nearly 40% of cases, sweepers faced obstacles when making privacy choices or seeking privacy information, for example when trying to find privacy settings or delete an account.
- Forced action: 9% of websites and apps forced users to disclose more personal information when trying to delete their account than they had to provide when they opened it.
GPEN encourages organisations to design their platforms, including associated privacy communications and choices, in a way that supports users in making informed privacy choices that reflect their preferences. Good design includes: default settings that best protect privacy; an emphasis on privacy-protective options; neutral language and design that present privacy choices fairly and transparently; fewer clicks to find privacy information, log out, or delete an account; and ‘just-in-time’, contextually relevant consent options. GPEN says that by offering users online experiences free from influence, manipulation and coercion, organisations can build user trust and make privacy a competitive advantage.