- Deceptive defaults: Pre-checked boxes, opt-out settings, or buried privacy controls trick users into sharing data by making sharing the easiest or default action.
- Misleading language: Ambiguous, technical, or false wording (e.g., “help personalize your experience”) obscures what data is collected and how it’s used.
- Forced consent and bundling: Combining consent for an essential service with consent for data sharing (no granular choice) coerces users into surrendering data to access features.
- Obstruction and friction: Hiding privacy settings behind many clicks, small buttons, or time-limited prompts discourages users from limiting data collection.
- Privacy Zuckering: Interfaces designed to make users publicly share more information than intended (named after Facebook examples).
- Sneaky notifications and bait-and-switch: Promising one outcome (discount, feature) but requiring data access, then using that data for profiling or marketing.
- Dark pattern-driven data harvesting for profiling: Collected data is combined, inferred, and sold to advertisers, brokers, or used to micro-target vulnerable users (price discrimination, political persuasion).
- Continuous and persistent tracking: Using subtle UI cues or consent resets to maintain long-term access to location, contacts, or behavioral data.
Consequences: loss of control over personal information, unwanted targeted advertising, discrimination, security risks, and erosion of informed consent.
References: Brayne (2017) on surveillance and data markets; Gray et al. (2018), “The Dark (Patterns) Side of UX Design”; Nissenbaum (2010) on privacy as contextual integrity.
Privacy zuckering refers to interface designs and flows that nudge, trick, or pressure users into revealing more personal information or making more of their data public than they intended. Common techniques include burying privacy settings in deep menus, using confusing or asymmetric defaults (e.g., defaulting to public sharing while “private” requires extra steps), presenting misleading language that minimizes perceived risk, and framing prompts so the privacy-preserving option looks like a loss or inconvenience. The result is that users unknowingly expose contacts, photos, location, or profile details — data that platforms can monetize or share. The term evokes high-profile examples where social-network interfaces encouraged broader sharing by default (see: Facebook’s early settings controversies).
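To make the asymmetry concrete, here is a minimal, hypothetical TypeScript sketch (the type and field names are invented for this example and do not correspond to any real platform's code). It models a share dialog whose public default costs zero steps to keep, while restricting visibility requires extra navigation:

```typescript
// Hypothetical model of asymmetric sharing defaults ("privacy zuckering").
// Keeping the default (public) costs zero steps; restricting visibility costs several.

type Audience = "public" | "friends" | "only_me";

interface ShareDialog {
  defaultAudience: Audience;               // what happens if the user just clicks "Post"
  stepsToChange: Record<Audience, number>; // extra clicks needed to pick each option
}

const darkPatternDialog: ShareDialog = {
  defaultAudience: "public",
  // The privacy-preserving choices are buried behind extra navigation.
  stepsToChange: { public: 0, friends: 3, only_me: 5 },
};

function effortToShare(dialog: ShareDialog, desired: Audience): number {
  return dialog.stepsToChange[desired];
}

// Most users accept the zero-effort default, so posts end up public.
console.log(effortToShare(darkPatternDialog, "public"));  // 0
console.log(effortToShare(darkPatternDialog, "only_me")); // 5
```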
References: Brignull, H., “Dark Patterns” (http://www.darkpatterns.org); HCI literature on designing for privacy (e.g., Gray et al.); ACLU and FTC reports on deceptive privacy practices.
Obstruction and friction are design tactics that make it difficult for users to find or use privacy-protecting options. By burying settings behind multiple menus, using tiny or low-contrast buttons, or presenting choices in fleeting pop-ups, interfaces increase the effort, time, and confusion required to opt out of data collection. Most users respond to this added cost by accepting defaults or skipping settings, so platforms retain access to more personal data. This technique exploits cognitive limits and attention scarcity: people are less likely to pursue a protective action when it requires persistent navigation or quick responses. Empirical studies and regulators (e.g., GDPR guidance on dark patterns) identify these tactics as deliberate obstacles to informed consent.
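A hypothetical TypeScript sketch of this friction (menu labels and depths are illustrative, not taken from any real product): modeling the settings screens as a tree and counting how many levels deep the opt-out sits, compared with an "Accept all" offered on the first screen:

```typescript
// Hypothetical settings tree illustrating obstruction: the data-sharing opt-out
// sits several menus deep, while "Accept all" is offered up front.

interface MenuNode {
  label: string;
  children?: MenuNode[];
}

const settings: MenuNode = {
  label: "Settings",
  children: [
    { label: "Account" },
    {
      label: "Privacy",
      children: [
        {
          label: "Advanced",
          children: [
            { label: "Ad preferences", children: [{ label: "Limit data sharing" }] },
          ],
        },
      ],
    },
  ],
};

// Count how many screens a user must traverse to reach a given option.
function depthOf(node: MenuNode, target: string, depth = 0): number | null {
  if (node.label === target) return depth;
  for (const child of node.children ?? []) {
    const found = depthOf(child, target, depth + 1);
    if (found !== null) return found;
  }
  return null;
}

console.log(depthOf(settings, "Limit data sharing")); // 4 screens of friction to opt out
```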
Forced consent and bundling occur when a service ties essential functionality to broad data-sharing permissions, removing or hiding granular choices. Users who need the core feature are effectively pressured to accept all data collection practices — including advertising, profiling, or third‑party sharing — because declining means losing access. This exploits asymmetric power: the provider controls access and designs the interface (pre-checked boxes, single “accept” button, no separate toggles) so the only practical option is to consent. The result is not genuine, informed consent but a coerced trade-off that prioritizes business interests over user autonomy and privacy.
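A minimal, hypothetical TypeScript sketch of the contrast (the purpose names are illustrative): under a bundled model a single boolean gates everything, while a granular model ties access only to the essential purpose:

```typescript
// Hypothetical contrast between bundled and granular consent models.

interface GranularConsent {
  essentialService: true;    // required to use the product at all
  analytics: boolean;        // each secondary purpose is a separate choice
  advertising: boolean;
  thirdPartySharing: boolean;
}

// Dark pattern: a single "I agree" that bundles every purpose together,
// so declining advertising also means losing the core feature.
type BundledConsent = boolean;

function canUseService(consent: BundledConsent | GranularConsent): boolean {
  if (typeof consent === "boolean") {
    // Bundled: access is conditioned on accepting everything.
    return consent;
  }
  // Granular: access depends only on the essential purpose.
  return consent.essentialService;
}

// A user who declines profiling keeps access under the granular model...
console.log(canUseService({
  essentialService: true, analytics: false, advertising: false, thirdPartySharing: false,
})); // true
// ...but is locked out under the bundled model unless they accept all uses.
console.log(canUseService(false)); // false
```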
Relevant sources: GDPR guidance on consent (EU Commission, WP29/EDPB) and research on dark patterns (Brignull; Gray et al., “The Dark (Patterns) Side of UX Design,” 2018).
Dark patterns use ambiguous, technical, or euphemistic wording—phrases like “help personalize your experience”—to hide what data is actually collected and how it will be used. Such language exploits users’ limited attention and technical knowledge by:
- Obscuring scope: Vague terms (e.g., “personalize”) don’t say whether browsing history, contacts, location, or biometric data are gathered.
- Masking purposes: Broad phrases let companies claim many downstream uses (analytics, advertising, sharing with partners) without explicit consent.
- Framing consent: Polite or positive phrasing makes users more likely to agree, even when they wouldn’t if the consequences were clear.
- Creating information asymmetry: Technical jargon or legalistic wording prevents users from understanding risks or exercising meaningful choice.
Result: Users consent to data collection and processing they wouldn’t otherwise approve of, enabling profiling, targeted advertising, resale of data, or privacy-invading analytics while maintaining plausible deniability for companies.
For further reading: see the Norwegian Consumer Council’s “Deceived by Design” report and the Oxford Internet Institute’s work on dark patterns.
Explanation: Continuous and persistent tracking uses deceptive or opaque interface tactics to keep collecting users’ location, contacts, and behavior over long periods. Examples include hiding consent controls, resetting opt-outs after updates, using vague or buried language about “background activity,” or offering temporary permissions that automatically renew. These tactics exploit user attention and cognitive limits—users may miss small UI cues, consent dialogs, or settings changes—so tracking continues without informed, ongoing consent. The result is sustained access to sensitive data for profiling, targeted advertising, or resale, while users lack clear, effective means to stop it.
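As an illustration of the consent-reset tactic, here is a hypothetical TypeScript sketch (field and function names are invented for this example) contrasting a version-change reset with a consent store that preserves the user's choice:

```typescript
// Hypothetical sketch of a "consent reset" dark pattern: an app update silently
// discards a stored opt-out, so tracking resumes until the user notices.

interface StoredConsent {
  trackingAllowed: boolean;
  appVersion: string;
}

// Dark pattern: treat any version change as "no decision recorded" and fall
// back to a tracking-on default.
function consentAfterUpdateDark(prev: StoredConsent, newVersion: string): StoredConsent {
  if (prev.appVersion !== newVersion) {
    return { trackingAllowed: true, appVersion: newVersion }; // opt-out is lost
  }
  return prev;
}

// Consent-respecting alternative: the user's choice persists across versions.
function consentAfterUpdateFair(prev: StoredConsent, newVersion: string): StoredConsent {
  return { ...prev, appVersion: newVersion };
}

const optedOut: StoredConsent = { trackingAllowed: false, appVersion: "1.0" };
console.log(consentAfterUpdateDark(optedOut, "1.1").trackingAllowed); // true  (reset)
console.log(consentAfterUpdateFair(optedOut, "1.1").trackingAllowed); // false (preserved)
```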
Why it’s harmful:
- Violates meaningful consent: users do not fully understand or cannot easily revoke tracking.
- Amplifies surveillance: long-term behavioral records enable precise profiling and manipulation.
- Disproportionately affects vulnerable users who are less tech-literate.
References:
- Gray et al., “The Dark (Patterns) Side of UX Design” (2018).
- Mathur et al., “Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites” (2019).
- GDPR Article 4(11) and Recital 32 on informed consent (for legal context).
Deceptive defaults are interface choices that make sharing or surrendering personal data the easiest option. Examples include pre-checked consent boxes, automatic opt-ins, or privacy settings buried beneath layers of menus. Philosophically and ethically, they exploit human cognitive biases—especially status quo bias and decision inertia—by converting a meaningful consent decision into passive acceptance. Because people often accept the default to save time or avoid complexity, deceptive defaults systematically shift control away from users and toward companies, undermining genuine informed consent and treating privacy as an opt-out burden rather than a right.
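A minimal, hypothetical TypeScript sketch of the mechanism (the toggle names are illustrative): when every box is pre-checked, a user who never opens settings ends up sharing everything, whereas privacy-by-default collects nothing without an active opt-in:

```typescript
// Hypothetical sketch of deceptive defaults: every data-sharing toggle is
// pre-set to true, so doing nothing means maximal collection.

interface PrivacyToggles {
  shareUsageData: boolean;
  personalizedAds: boolean;
  shareWithPartners: boolean;
}

// Dark pattern: pre-checked boxes; status quo bias does the rest.
const deceptiveDefaults: PrivacyToggles = {
  shareUsageData: true,
  personalizedAds: true,
  shareWithPartners: true,
};

// Privacy-by-default alternative: collection requires an active opt-in.
const protectiveDefaults: PrivacyToggles = {
  shareUsageData: false,
  personalizedAds: false,
  shareWithPartners: false,
};

// A user who never touches settings ends up with whatever the designer chose.
function dataCollectedIfUntouched(defaults: PrivacyToggles): string[] {
  return Object.entries(defaults)
    .filter(([, enabled]) => enabled)
    .map(([purpose]) => purpose);
}

console.log(dataCollectedIfUntouched(deceptiveDefaults));  // all three purposes
console.log(dataCollectedIfUntouched(protectiveDefaults)); // []
```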
Key effects:
- Distorts autonomy: Users don’t make active, informed choices; design nudges them toward company-favored outcomes.
- Increases data collection: Easier defaults inflate the amount and sensitivity of data companies harvest.
- Erodes trust: Discovering hidden or pre-set data sharing can create distrust and harm reputation.
References:
- Sunstein, C. R. (2013). Simpler: The Future of Government. (On defaults and choice architecture.)
- Gray, C. M., et al. (2018). “The Dark (Patterns) Side of UX Design.” Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (discusses deceptive defaults).
Dark patterns—design tricks that manipulate users into actions they wouldn’t otherwise take—are used to harvest vast amounts of personal data. When users are nudged (or tricked) into consenting to data sharing, granting excessive permissions, or revealing sensitive details through confusing interfaces, that data is aggregated, inferred, and repurposed. Data brokers and advertisers combine explicit inputs (forms, purchases, clicks) with behavioral signals (clickstreams, dwell time, device and location data) to build detailed profiles. Machine learning fills gaps by inferring attributes (interests, income, health, political leanings), enabling micro-targeting: tailored ads, dynamic pricing (price discrimination), and finely tuned political messaging aimed at susceptible groups. Vulnerable users—those with lower digital literacy, economic need, or psychological susceptibilities—are disproportionately impacted, as profiling lets actors exploit their specific weaknesses for profit or influence.
Key harms: loss of privacy, manipulation of choices, economic exploitation (higher prices or predatory offers), and erosion of democratic processes via targeted political persuasion. For further reading, see the Norwegian Consumer Council’s work on dark patterns and research on data brokerage and microtargeting (e.g., from the Stanford Cyber Policy Center and NYU’s Center for Social Media and Politics).
Dark patterns—interface designs that nudge or trick users into choices they would not otherwise make—are used to harvest extensive personal data. Through deceptive consent prompts, confusing opt-outs, pre-checked boxes, and disguised data requests, companies collect both explicit data (forms, purchases, location) and implicit signals (clicks, dwell time, navigation paths). That raw data is then combined with other sources, run through inference algorithms, and sold to advertisers or brokers or kept for in-house use.
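To illustrate that pipeline in code, here is a deliberately simplified, hypothetical TypeScript sketch; the inference rules are toy heuristics standing in for the statistical models real brokers use, and all field names are invented:

```typescript
// Hypothetical sketch of the profiling pipeline: explicit inputs and behavioral
// signals are merged into one record, and missing attributes are filled in by
// toy inference rules standing in for real inference models.

interface ExplicitData { email: string; purchases: string[]; }
interface BehavioralSignals { pagesViewed: string[]; nightTimeUsageRatio: number; }

interface Profile {
  email: string;
  interests: string[];
  inferredParent: boolean;                  // never stated by the user, only inferred
  inferredShiftWorker: boolean;
  inferredPriceSensitivity: "low" | "high";
}

function buildProfile(explicit: ExplicitData, behavior: BehavioralSignals): Profile {
  // Combine explicit inputs (purchases) with implicit signals (pages viewed).
  const interests = Array.from(new Set([...explicit.purchases, ...behavior.pagesViewed]));
  return {
    email: explicit.email,
    interests,
    inferredParent: interests.some((i) => i.includes("diapers")),
    inferredShiftWorker: behavior.nightTimeUsageRatio > 0.5,
    inferredPriceSensitivity: explicit.purchases.length > 5 ? "low" : "high",
  };
}

const profile = buildProfile(
  { email: "user@example.com", purchases: ["diapers", "coffee"] },
  { pagesViewed: ["baby-monitors", "payday-loans"], nightTimeUsageRatio: 0.7 },
);
// An advertiser or broker can now target "price-sensitive new parents awake at night".
console.log(profile);
```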
The harms:
- Profiling and segmentation: Aggregated and inferred attributes (income, health, political leaning, vulnerabilities) create detailed user profiles.
- Micro-targeting and price discrimination: Profiles enable tailored offers and dynamic pricing that charge different users different amounts or steer them toward specific products.
- Political persuasion and behavioral manipulation: Precise profiles make it possible to deliver persuasive messages targeted at psychologically or socially vulnerable groups to influence opinions or voting behavior.
- Reduced autonomy and privacy: Users lose control over how their data is used; opaque flows to third parties magnify risk (data breaches, misuse).
Why it’s effective: Dark patterns exploit cognitive biases (default bias, choice overload, trust in interfaces) and the opacity of data economies, so users often do not realize what they’ve consented to or how profiled inferences are made.
Relevant sources: work on dark patterns by Mathur et al. (2019), the concept of surveillance capitalism by Shoshana Zuboff (2019), and research on microtargeting and political persuasion (e.g., Tufekci; Persily & Tucker).
Sneaky notifications and bait‑and‑switch techniques lure users with an appealing promise (a discount, free feature, or urgent alert) but condition access on granting permissions or entering personal details. Once users consent — often without full awareness — the requested data (contacts, location, email, usage patterns) is collected and repurposed for profiling, targeted advertising, resale to data brokers, or other monetization. The original promise is secondary or never delivered; the primary aim is to expand a detailed behavioral or identity profile that increases future persuasion power and revenue.
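A hypothetical TypeScript sketch of the bait-and-switch flow (the offer text and permission names are invented): the visible promise is a discount, but accepting it hands over data that outlives the offer:

```typescript
// Hypothetical sketch of a bait-and-switch prompt: the visible offer is a
// discount, but accepting it grants access that is then repurposed.

interface Offer {
  headline: string;               // what the user sees
  requiredPermissions: string[];  // the real price of the offer
}

interface GrantedData {
  contacts: string[];
  marketingOptIn: boolean;
}

const prompt: Offer = {
  headline: "Tap to claim 20% off your next order!",
  requiredPermissions: ["contacts", "email"],
};

// Accepting the offer quietly hands over data that outlives the discount.
function acceptOffer(offer: Offer, contacts: string[]): GrantedData {
  console.log(`Accepted: ${offer.headline} (grants: ${offer.requiredPermissions.join(", ")})`);
  return { contacts, marketingOptIn: true }; // repurposed for profiling/marketing
}

const harvested = acceptOffer(prompt, ["alice@example.com", "bob@example.com"]);
console.log(harvested.contacts.length); // 2 contacts now in the marketing pipeline
```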
Why this is harmful:
- Consent is manufactured or uninformed: users agree under a misleading premise.
- Data use exceeds expectations: information is used for profiling/marketing rather than the stated purpose.
- Harm compounds over time: accumulated profiles enable increasingly intrusive personalization and discrimination (price steering, manipulative ads).
References:
- Gray, C. M., et al., “The Dark (Patterns) Side of UX Design,” Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (research on deceptive design).
- Mathur, A., et al., “Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites,” Proceedings of the ACM on Human-Computer Interaction (CSCW), 2019; documents bait‑and‑switch and consent manipulation.