• Definition: Dark patterns are design choices that steer users toward decisions that benefit providers (insurers, hospitals, vendors) at the user’s expense by exploiting cognitive biases and information asymmetries. (Brignull; Acar et al.)

  • Common types used in healthcare:

    • Friction/Obstruction: Making complaint submission, appeals, or prior authorization difficult (long forms, obscure portals, repeated documentation requests) in order to deter claims (e.g., deliberately complex insurer denial appeals).
    • Hidden Costs & Fees: Concealing out‑of‑network charges, surprise billing, or extra facility fees until after services are rendered. (studies on surprise billing)
    • Misdirection/Visual Emphasis: Highlighting expensive “recommended” options or branded services while downplaying equally effective, cheaper alternatives (generic drugs, conservative treatment).
    • Forced Continuity & Sneak into Basket: Auto-enrolling patients into recurring services, subscriptions, or ancillary programs (wellness plans, premium support) and making cancellation hard.
    • Social Proof & Scarcity: Using “limited spots” or testimonials to pressure rapid consent to elective procedures, add‑ons, or high-margin treatments.
    • Privacy Dark Patterns: Defaulting to broad data sharing, burying consent in long EULAs, or obfuscating opt-out of secondary uses (research, marketing).
    • Confirmshaming & Nagging: Using guilt‑laden language or repeated prompts to push patients toward paid upgrades, tests, or products (e.g., “Don’t you want to protect your family?”).
    • Decoy Pricing & Anchoring: Presenting an inflated option to make another high‑margin option appear reasonable (expensive bundled tests vs. single test).
    • Opaque Algorithms: Deploying clinical decision support or triage bots that steer patients toward profitable services without disclosing financial incentives or the system’s limitations.
  • Harms:

    • Financial: Surprise bills, unnecessary procedures, higher out‑of‑pocket costs.
    • Clinical: Overtreatment, delayed appropriate care, lowered adherence.
    • Ethical/Trust: Erosion of patient autonomy, consent quality, and trust in providers.
    • Privacy: Unwitting data sharing and targeted marketing of sensitive health information.
  • Remedies / Safeguards:

    • Transparency: Clear pricing, plain‑language consent forms, and disclosure of algorithms and conflicts of interest.
    • Regulation: Enforceable rules against manipulative design in health tech, strict surprise-billing protections, privacy limits (HIPAA + extended consumer protections).
    • Defaults that favor patients: Opt‑in for data sharing, clear cancellation paths, minimal friction for appeals.
    • Oversight & Design Ethics: Patient-centered UX review, audits, and penalties for deceptive practices.
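As a minimal illustration of the "defaults that favor patients" safeguard above, the sketch below (a hypothetical `ConsentSettings` class, not drawn from any real system) models data‑sharing consent that starts opted out for every secondary use and changes only on an explicit, named patient action — the inverse of the privacy dark pattern of pre‑checked broad sharing:

```python
from dataclasses import dataclass

@dataclass
class ConsentSettings:
    """Hypothetical consent record with patient-favoring defaults:
    every secondary use of data begins opted OUT."""
    share_for_research: bool = False   # opt-in only, never pre-checked
    share_for_marketing: bool = False  # opt-in only, never pre-checked

    def opt_in(self, purpose: str) -> None:
        # Require an explicit, named action to enable a specific sharing purpose.
        if not hasattr(self, purpose):
            raise ValueError(f"Unknown purpose: {purpose}")
        setattr(self, purpose, True)

# A new patient shares nothing until they act:
settings = ConsentSettings()
assert not settings.share_for_research
settings.opt_in("share_for_research")
assert settings.share_for_research
assert not settings.share_for_marketing  # other purposes remain off
```

The design choice doing the work here is that the *default value* encodes the patient's interest; an interface built on it would render unchecked boxes and require a deliberate toggle per purpose, rather than burying a pre-enabled bundle in a long consent document.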

Key references: Harry Brignull, “Dark Patterns” (2010–); Acar et al., “Dark Patterns in the Design of Consumer Health Technologies” (CHI 2021); recent policy analyses on surprise billing and digital health ethics.
