create a series of research dissertation ideas for interaction digital design also include wicked problems

Dissertation Research Ideas in Interaction Design (including Wicked Problems)

  1. Designing for Digital Empathy in Telehealth Interfaces
  • Aim: Develop interaction patterns that increase perceived empathy in remote clinical consultations.
  • Wicked aspects: conflicting stakeholder needs (clinicians vs. patients), measurement of subjective outcomes, adaptive contexts.
  • Methods: co-design, qualitative studies, prototype evaluation with physiological and self-report measures.
  • Key refs: Bødker (2006), Norman (2013).
  2. Inclusive Voice UI for Multilingual, Low-Literacy Populations
  • Aim: Create voice-interaction systems that support accessibility across languages and literacy levels.
  • Wicked aspects: cultural variability, ambiguous success metrics, privacy vs. personalization trade-offs.
  • Methods: field studies, participatory design, iterative usability testing.
  • Key refs: Benetech (inclusive tech), Seeman et al. (voice UX research).
  3. Designing for Trust and Transparency in Recommender Interfaces
  • Aim: Explore interactions that make algorithmic recommendations understandable and trustworthy.
  • Wicked aspects: opacity of models, business incentives, diverse user mental models.
  • Methods: controlled experiments, explainable UI prototypes, longitudinal studies.
  • Key refs: Eslami et al. (2015), Kocielnik et al. (2019).
  4. Interaction Design for Digital Habit Change Without Coercion
  • Aim: Identify interaction strategies that help users change harmful digital habits while respecting autonomy.
  • Wicked aspects: ethical limits, measuring long-term behavior, platform incentives to maximize engagement.
  • Methods: behavioral trials, diary studies, value-sensitive design.
  • Key refs: Fogg (2009), Lockton et al. (2010).
  5. Designing Collaborative AR Workspaces for Distributed Teams
  • Aim: Create interaction techniques for persistent, shared augmented reality that support coordination and awareness.
  • Wicked aspects: technical constraints, privacy and presence tensions, varied work practices.
  • Methods: prototyping, field deployment, mixed-method evaluation.
  • Key refs: Billinghurst et al. (2015), Dourish (2001).
  6. Interaction Design for Climate Action Apps (Citizen Engagement)
  • Aim: Design interfaces that motivate sustained pro-environmental behavior and civic participation.
  • Wicked aspects: politicization, measuring impact on emissions, conflicting individual vs. collective incentives.
  • Methods: participatory design, A/B testing, impact assessment frameworks.
  • Key refs: Manzini (2015), Stoknes (2015).
  7. Designing for Older Adults’ Autonomy with Smart Home Systems
  • Aim: Interaction models that balance safety, privacy, and autonomy for aging-in-place.
  • Wicked aspects: trade-offs between surveillance and independence, family vs. user priorities, regulatory constraints.
  • Methods: ethnography, co-design with caregivers, simulation studies.
  • Key refs: Czaja & Lee (2007), Rogers et al. (2011).
  8. Multi-Modal Interaction for Mental Health Self-Management
  • Aim: Integrate text, voice, and sensor data to support adaptive self-help interfaces for mental wellbeing.
  • Wicked aspects: data sensitivity, harm minimization, personalization vs. generalizability.
  • Methods: ethics-informed design, clinical collaborations, pilot RCTs.
  • Key refs: Torous et al. (2018), Luxton et al. (2016).
  9. Designing Ethical Persuasive Interfaces for Public Health Campaigns
  • Aim: Create interaction frameworks that persuade beneficial behavior without manipulation.
  • Wicked aspects: differing morals about persuasion, cultural variability, unintended consequences.
  • Methods: normative analysis, user testing, stakeholder workshops.
  • Key refs: Fogg (2003), Susser et al. (2019).
  10. Interaction Design for Data Privacy Negotiation
  • Aim: New UI paradigms that let users negotiate data sharing granularly and meaningfully.
  • Wicked aspects: legal/technical complexity, power imbalances, cognitive load.
  • Methods: prototyping, lab studies, policy analysis.
  • Key refs: Nissenbaum (2010), Cranor (2008).

Notes on choosing a dissertation topic:

  • Pick a wicked problem that aligns with your values and access to stakeholders.
  • Combine qualitative grounding with prototyping and measurable evaluation.
  • Engage ethical review early for sensitive domains.

If you want, I can expand any idea into a 2–3 page proposal (research questions, methods, contributions, timeline, and bibliography).

Designing Ethical Persuasive Interfaces for Public Health Campaigns

Explanation: This research topic examines how interactive digital designs (mobile apps, websites, kiosks, wearable interfaces) can ethically persuade individuals to adopt healthier behaviors in public-health contexts—vaccination, smoking cessation, diet, exercise, or pandemic hygiene. It balances persuasive design techniques (nudges, tailored messaging, gamification, social proof) with ethical constraints: respect for autonomy, informed consent, privacy, fairness, and avoidance of manipulation or coercion. Key research questions include: Which persuasive tactics are effective yet ethically acceptable? How should designers disclose persuasive intent and obtain meaningful consent? How can algorithms personalize messages without exacerbating health disparities or invading privacy? What governance, transparency, and accountability mechanisms are needed?

Methodologically this project can combine literature review (persuasive technology, bioethics, behavioral science), qualitative studies (stakeholder interviews, focus groups with users and public-health officials), design probes and prototypes, and experimental evaluation (A/B tests, randomized controlled trials) that measure both behavior change and ethical outcomes (perceived autonomy, trust, informedness). Consideration of wicked problems is central: public health goals can conflict with individual freedoms, cultural values, and economic incentives; interventions may produce unpredictable downstream effects (stigma, surveillance creep, unequal access). The dissertation should propose practical design guidelines and an ethical framework for deploying persuasive interfaces in public-health campaigns, plus policy recommendations for regulation and oversight.
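To make the evaluation component concrete, here is a minimal sketch (Python, with invented counts) of how the behavioral outcome of such an A/B test might be analyzed with a two-proportion z-test; ethical outcomes such as perceived autonomy would be compared separately, e.g., as scale scores.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for a difference in proportions (e.g., adoption rates)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a - p_b, z, p_value

# Hypothetical data: 340/1000 participants in the persuasive arm adopted
# the target behavior vs. 290/1000 in the control arm.
diff, z, p = two_proportion_z(340, 1000, 290, 1000)
print(f"effect = {diff:.3f}, z = {z:.2f}, p = {p:.4f}")
```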

Relevant references:

  • Fogg, B. J. (2003). Persuasive Technology: Using Computers to Change What We Think and Do.
  • Narayanaswamy, K., et al. (2020). “Designing for Health: A Primer for Digital Health Designers” (selected readings).
  • O’Neill, O. (2002). Autonomy and Trust in Bioethics.
  • Debates on nudging and ethics: Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving Decisions About Health, Wealth, and Happiness.

Designing for Trust and Transparency in Recommender Interfaces

Explanation: Recommender systems increasingly shape what users see, buy, and believe. Designing for trust and transparency focuses on how interface choices (explainable cues, provenance, control mechanisms, and feedback channels) affect users’ understanding, perceived fairness, and willingness to rely on recommendations. Research should examine which explanations (e.g., item-based, collaborative, model-based), presentation formats (textual, visual, progressive disclosure), and interaction affordances (filters, weighting controls, counterfactuals) improve calibrated trust — where users appropriately accept helpful suggestions and resist biased or irrelevant ones. Key questions include: How does transparency impact user satisfaction, decision quality, and perceived privacy risk? When does more information reduce trust by exposing complexity, and when does it increase trust by reducing uncertainty? Methods can combine lab experiments, field deployments, qualitative interviews, and log analysis to measure behavior, comprehension, and long-term engagement.
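As one illustration of the interaction affordances named above, here is a minimal sketch of a user-adjustable relevance-versus-novelty weighting control with a simple item-based ("because you liked ...") explanation; all items, scores, and the weighting scheme are invented for illustration.

```python
# Re-rank candidate recommendations with a user-controlled relevance-vs-
# novelty weight and attach an item-based explanation. All items, scores,
# and the similarity source are hypothetical.

candidates = [
    # (item, relevance, novelty, most similar item the user liked)
    ("Documentary A", 0.92, 0.10, "Documentary B"),
    ("Indie film C", 0.55, 0.80, "Drama D"),
    ("Series E", 0.70, 0.45, "Series F"),
]

def rank(candidates, novelty_weight):
    """novelty_weight in [0, 1]: 0 = pure relevance, 1 = pure novelty."""
    scored = [
        ((1 - novelty_weight) * rel + novelty_weight * nov, item, liked)
        for item, rel, nov, liked in candidates
    ]
    return sorted(scored, reverse=True)

for score, item, liked in rank(candidates, novelty_weight=0.6):
    print(f"{item} (score {score:.2f}): because you liked {liked}")
```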

Wicked problem connection: This topic is a wicked problem because recommendations implicate diverse stakeholders (users, platforms, advertisers, regulators), conflicting values (personalization vs. privacy, effectiveness vs. fairness), dynamic data and models, and unpredictable social consequences (filter bubbles, manipulation). Designing transparent interfaces must therefore navigate trade-offs without definitive solutions, requiring iterative, participatory, and context-sensitive approaches.

References (select):

  • T. Miller, “Explanation in artificial intelligence: Insights from the social sciences,” Artificial Intelligence, 2019.
  • R. F. Kizilcec, “How much information? Effects of transparency on trust in an algorithmic interface,” CHI, 2016.
  • S. K. Sacha et al., “Human-Centered Machine Learning: A Challenge to the Interdisciplinary Field,” IEEE Computer Graphics and Applications, 2020.

Interaction Design for Climate Action Apps (Citizen Engagement)

Explanation: This dissertation topic investigates how interaction design can motivate, sustain, and scale citizen engagement with climate action through mobile and web applications. It focuses on designing interfaces, feedback loops, behavioral nudges, social features, and information architectures that translate awareness into meaningful, sustained pro-environmental behaviors (e.g., reduced energy use, sustainable transport, civic participation in climate policy). Key research questions include: what interaction patterns increase long-term engagement; how to balance persuasive design with respect for autonomy and avoid fatigue or greenwashing effects; how to surface trustworthy data and localize actions; and how to design for inclusivity across socioeconomic and digital-literacy differences.

Why this selection matters:

  • Climate change is a global wicked problem—complex, uncertain, value-laden, and resistant to single-solution approaches—making citizen engagement a crucial lever for societal change.
  • Interaction designers can shape how people perceive agency and collective efficacy; well-designed apps can lower friction for behavior change and civic participation.
  • The topic bridges technology, ethics, behavior science, and policy, offering rich empirical and design-research methods (field trials, A/B testing, ethnography, participatory design).
  • Practical outcomes can inform public-sector, NGO, and commercial interventions while addressing risks like user burnout, privacy trade-offs, and unequal access.

Relevant angles and methods:

  • Comparative studies of existing climate apps and their interaction strategies.
  • Co-design with underserved communities to surface contextual needs.
  • Longitudinal field experiments measuring behavior change and engagement retention.
  • Design probes and prototypes that test feedback modalities (visual, gamified, social proof).
  • Ethical analysis of persuasive techniques and data practices.

References (select):

  • Norman, D. A. (2013). The Design of Everyday Things.
  • Thaler, R., & Sunstein, C. (2008). Nudge: Improving Decisions About Health, Wealth, and Happiness.
  • Rittel, H. W. J., & Webber, M. M. (1973). Dilemmas in a General Theory of Planning (on wicked problems).
  • Froehlich, J. (2016). Interaction design and sustainable behavior. In Handbook of Human-Computer Interaction.

Designing for Digital Empathy in Telehealth Interfaces

Explanation: Designing for digital empathy in telehealth interfaces examines how interactive systems can convey understanding, compassion, and responsiveness to patients and clinicians across virtual channels. This research area addresses both technical and human-centered challenges: crafting voice, visual, and interaction cues that express warmth and attentiveness; adapting to diverse patient needs (age, culture, disability, neurodiversity); and preserving privacy and trust while collecting the contextual data needed for empathetic responses.

Key focal points include:

  • Interaction design patterns that simulate empathetic behaviors (timing of responses, micro-animations, tone adjustments) without becoming uncanny or intrusive; a minimal pacing sketch follows this list.
  • Personalization strategies that balance tailored support with data minimization and transparency.
  • Multimodal affordances (video, audio, text, haptics) for conveying presence and emotional attunement when in-person cues are absent.
  • Workflow integration for clinicians to support empathetic communication without increasing cognitive load or documentation burden.
  • Evaluation metrics that go beyond usability to measure perceived empathy, therapeutic alliance, clinical outcomes, and equity of experience.
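As promised above, here is a minimal sketch of how one such pattern, response pacing, might be prototyped; the heuristic and all parameters are invented placeholders that a study would need to tune and validate.

```python
import time

def empathetic_delay(patient_message, seconds_per_word=0.3,
                     min_delay=0.8, max_delay=3.0):
    """Pace an automated acknowledgement so it does not arrive instantly,
    which users can read as dismissive. All parameters are invented
    placeholders that would need tuning and validation in user studies."""
    words = len(patient_message.split())
    return max(min_delay, min(max_delay, words * seconds_per_word))

msg = "I've been feeling much more anxious since the new medication."
time.sleep(empathetic_delay(msg))  # wait before showing the acknowledgement
print("That sounds really difficult. Thank you for telling me.")
```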

Wicked problems related to this topic:

  • Defining and operationalizing “empathy” in measurable design terms that are culturally and contextually sensitive.
  • Avoiding performative or manipulative empathy—interfaces that appear empathetic but undermine autonomy or exploit emotions.
  • Reconciling personalization (which often requires extensive data) with patient privacy, consent, and regulatory constraints.
  • Ensuring equitable empathetic experiences across socio-economic, linguistic, and accessibility divides, given differing access to devices and broadband.
  • Balancing clinician time and emotional labor: creating systems that support rather than replace genuine human empathy.

Relevant sources and starting points:

  • Bickmore, T., & Picard, R. (2005). Establishing and maintaining long-term human-computer relationships. ACM Transactions on Computer-Human Interaction.
  • Lupton, D. (2014). Digital health: Critical and cross-disciplinary perspectives. (Discusses affect and design in health technologies.)
  • Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021). On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? FAccT. (For caution about AI-generated empathy.)

This topic sits at the intersection of interaction design, health informatics, ethics, and HCI evaluation—rich for dissertation work that combines qualitative studies, prototyping, and rigorous quantitative assessment.

Interaction Design for Digital Habit Change Without Coercion

Explanation: This research direction examines how interaction design can promote sustainable behavior change in digital contexts while respecting user autonomy and avoiding manipulative or coercive practices. It focuses on designing interfaces, feedback systems, and choice architectures that nudge users toward beneficial habits (e.g., reduced screen time, healthier app usage, improved privacy behaviors) through friction, defaults, timely prompts, meaningful incentives, and reflective tools rather than through dark patterns, exploitative persuasive tactics, or addictive mechanics.
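As a sketch of what a noncoercive prompt could look like in code, the following assumes the criteria discussed just below (disclosed intent, easy dismissal, permanent opt-out); the class, fields, and messages are purely illustrative.

```python
from dataclasses import dataclass

@dataclass
class Nudge:
    """A prompt meeting the noncoercive criteria developed in this section:
    disclosed intent, trivially dismissible, permanently disableable."""
    message: str
    disclosed_intent: str          # shown alongside the prompt
    dismissible: bool = True       # single-tap dismiss, no penalty
    user_can_disable: bool = True  # global opt-out in settings

    def render(self):
        assert self.dismissible and self.user_can_disable, \
            "a prompt that cannot be refused is coercion, not persuasion"
        return (f"{self.message}\n(Why you're seeing this: "
                f"{self.disclosed_intent}) [Dismiss] [Turn off]")

print(Nudge(
    message="You've been scrolling for 45 minutes. Take a break?",
    disclosed_intent="you asked us to flag sessions over 30 minutes",
).render())
```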

Key concerns and questions:

  • Defining ethical, noncoercive influence: How do we distinguish supportive nudges from manipulation? What criteria (transparency, reversibility, user control, informed consent) should designs meet?
  • Mechanisms for habit formation: Which interaction patterns (timing of prompts, micro-goals, progress visualization, social support) reliably foster durable habits without creating dependency?
  • Personalization and autonomy: How much tailoring aids habit change while preserving self-determination? What role should user-configurable boundaries and deliberation aids play?
  • Measuring outcomes: How to evaluate success ethically—behavioral change vs. well-being, long-term maintenance vs. short-term compliance?
  • Wicked problem aspects: Conflicting values (user well-being vs. engagement metrics), shifting norms across cultures and age groups, unintended consequences (habits transferring to other contexts), and platform incentives that reward attention capture.

Why this selection matters: Designers and organizations increasingly shape daily routines through digital products. Researching noncoercive habit-change designs addresses an urgent ethical and practical challenge: enabling beneficial user behaviors without exploiting cognitive vulnerabilities. The topic sits at the intersection of HCI, ethics, behavioral science, and policy, offering impact on product practice and public well-being.

References (select):

  • Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving Decisions About Health, Wealth, and Happiness.
  • Eyal, N. (2014). Hooked: How to Build Habit-Forming Products. (Critical for methods; consider ethical critiques.)
  • Verbeek, P.-P. (2011). Moralizing Technology: Understanding and Designing the Morality of Things.
  • Fogg, B. J. (2009). A behavior model for persuasive design. Proceedings of the 4th International Conference on Persuasive Technology.

Potential wicked-problem research angles:

  • Designing ethical metrics that align business incentives with long-term user wellbeing.
  • Cross-cultural variations in autonomy and acceptable influence.
  • Regulation vs. design responsibility: how policy can shape noncoercive defaults.

If you’d like, I can expand this into specific dissertation questions, methods, or a literature review outline.

Inclusive Voice UI for Multilingual, Low‑Literacy Populations

Explanation: Designing voice user interfaces (VUIs) that reliably serve multilingual, low‑literacy users addresses a crucial intersection of accessibility, equity, and usability. Such populations often face barriers with text‑centric interfaces (menus, form fields, written instructions) and with existing VUIs that assume standard accents, high literacy for error recovery, or monolingual interaction. A research dissertation on this topic would investigate how to create VUIs that:

  • Recognize and robustly handle multiple languages, dialects, and non‑standard pronunciations common among low‑literacy speakers.
  • Minimize reliance on textual feedback by using multimodal cues (audio prompts, icons, haptic signals, simple images) and conversation strategies tailored to low literacy (short turns, confirmation strategies that avoid complex textual paraphrase).
  • Support error recovery and clarification without requiring reading or typing—e.g., adaptive scaffolding, context-aware suggestions, and progressive disclosure of options.
  • Respect cultural communication norms and privacy concerns (sensitive topics, public use), ensuring trust and uptake.
  • Are evaluated with representative users, applying inclusive metrics beyond standard task time and error rates: comprehension, perceived dignity, cognitive load, and long-term adoption.

Research methods could combine field ethnography, participatory design with target communities, speech corpus collection from diverse speakers, machine learning adaptation for low‑resource languages, and iterative usability testing. Ethical considerations (consent, data ownership, avoiding stigmatizing language models) and wicked problems—such as balancing personalization with privacy, and designing across enormous linguistic diversity with limited data—should be central to the investigation.
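A minimal sketch of the kind of confirmation and error-recovery strategy described above, which avoids textual paraphrase entirely; the intents, prompts, and numeric-choice fallback are hypothetical.

```python
# Sketch of a low-literacy confirmation strategy: short yes/no questions
# instead of textual paraphrase, and numbered spoken options as fallback.
# Intents, prompts, and the recognizer interface are hypothetical.

def confirm(intent, heard_yes):
    if heard_yes:
        return f"Okay, doing that now: {intent}."
    return f"Did you want to {intent}? Say yes or no."

def recover(options):
    """Offer each option with a spoken number, so users never need to
    read a menu or type a correction."""
    return "Let's try again. " + " ... ".join(
        f"For {opt}, say {i + 1}" for i, opt in enumerate(options))

print(confirm("send money to Amina", heard_yes=False))
print(recover(["check balance", "send money", "talk to a person"]))
```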

Representative references:

  • B. Friedman and H. Nissenbaum, “Bias in computer systems,” ACM Transactions on Information Systems, 1996 (on design ethics and fairness).
  • J. Glass et al., “Speech and Language Processing for Low‑Resource Languages,” Annual Review of Linguistics, relevant surveys on ASR for diverse speakers.
  • P. Dourish, “Where the Action Is” (2001) for embodied, situated interaction and cultural considerations.

Interaction Design for Data Privacy Negotiation

Explanation: This dissertation topic investigates how interactive systems can support meaningful, situated negotiation between users and data-collecting services about what personal data is shared, for what purposes, and under which conditions. It treats privacy not as a binary permission but as a dynamic, contextual dialogue that can be mediated through interfaces, visualizations, and interaction patterns. Key research questions include: how to surface trade-offs and risks in ways users can understand; how to design negotiation workflows that respect power asymmetries and cognitive limits; how to enable ongoing renegotiation as contexts change; and how to incorporate social norms, consent histories, and accountability mechanisms into design.

Methods would combine qualitative studies (contextual inquiry, participatory design), prototyping (interactive mockups, constrained UIs, conversational agents), and evaluation (lab usability tests, field deployments, longitudinal measures of user understanding and consent quality). Technical considerations include privacy-preserving logging, policy languages, and interoperability with platform consent frameworks.
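For instance, here is a minimal sketch of an auditable consent record that supports renegotiation over time; the field names are illustrative and not a real policy language.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """One negotiated data-sharing agreement, kept with an auditable
    history so consent can be revisited as contexts change. Field names
    are illustrative, not a real policy language."""
    data_type: str   # e.g. "location"
    purpose: str     # e.g. "route suggestions"
    granted: bool
    history: list = field(default_factory=list)

    def renegotiate(self, granted):
        # Log the prior decision before changing it, for accountability.
        self.history.append((datetime.now(timezone.utc), self.granted))
        self.granted = granted

record = ConsentRecord("location", "route suggestions", granted=True)
record.renegotiate(granted=False)   # the user later withdraws sharing
print(record.granted, len(record.history))  # False 1
```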

Wicked problems within this topic:

  • Ambiguity of user preferences: preferences are context-dependent, unstable, and hard to articulate.
  • Power asymmetry: platforms wield much more information and control than individual users, making “negotiation” asymmetric in practice.
  • Trade-offs and externalities: choices about data sharing affect third parties and systemic outcomes (surveillance, discrimination), not just consenting users.
  • Regulatory and cultural heterogeneity: legal requirements and norms differ across jurisdictions and user groups, complicating universal designs.
  • Measurement difficulty: evaluating whether negotiated outcomes are fair, informed, or durable is ethically and methodologically challenging.

Relevant literature:

  • Nissenbaum, H. (2010). Privacy in Context: Technology, Policy, and the Integrity of Social Life.
  • Balebako, R., et al. (2014). “Privacy in the App Store: Trends and User Perceptions.”
  • Sadeh, N., et al. (2014). “Understanding and Capturing People’s Privacy Policies in Everyday Mobile Ecosystems.” Proceedings of CHI.
  • Calo, R. (2011). “The Boundaries of Privacy Harm.” Indiana Law Journal.

This topic is timely and impactful: designing negotiable, legible privacy interactions could improve user autonomy, reduce harm from opaque data practices, and inform policy and standards.

Designing Collaborative AR Workspaces for Distributed Teams

Explanation: This research topic examines how augmented reality (AR) can create shared, persistent workspaces that support collaboration among geographically distributed team members. It focuses on interaction design challenges—spatial anchoring, multi-user awareness, gesture and voice input, conflict resolution for shared artifacts, and accessibility—while considering technical constraints such as latency, device heterogeneity, and tracking accuracy. The project treats collaboration as a socio-technical system: it investigates how AR alters communication practices, roles, and workflows, and develops interaction patterns and design guidelines that balance immersion with usability and inclusion.
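To make the synchronization challenge concrete, here is a minimal sketch of a shared, anchor-bound annotation with last-writer-wins merging; real deployments would need richer conflict resolution, and every name here is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SharedAnnotation:
    """A world-anchored annotation replicated across AR devices, merged
    last-writer-wins via a Lamport clock. Real systems need finer-grained
    conflict resolution; all names here are hypothetical."""
    anchor_id: str   # stable spatial anchor the note is attached to
    text: str
    clock: int = 0   # logical timestamp incremented on each local edit
    author: str = ""

    def merge(self, remote):
        # Tie-break on author so all replicas converge deterministically.
        if (remote.clock, remote.author) > (self.clock, self.author):
            self.clock, self.text, self.author = (
                remote.clock, remote.text, remote.author)

local = SharedAnnotation("anchor-7", "check this joint", clock=3, author="ana")
remote = SharedAnnotation("anchor-7", "joint is fine", clock=4, author="ben")
local.merge(remote)
print(local.text)  # "joint is fine"
```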

Why this selection matters:

  • Practical relevance: Remote and hybrid work are persistent; AR promises richer collaboration than video or 2D screens for tasks requiring spatial reasoning, prototyping, or situated information.
  • Wicked problem characteristics: The topic involves ill-defined goals (optimal levels of shared presence), conflicting stakeholder needs (privacy vs. awareness, individual vs. group control), evolving technology, and intertwined social and technical elements—making simple solutions impossible and requiring iterative, interdisciplinary design research.
  • Research contributions: Expected outputs include conceptual frameworks for AR collaboration, design patterns for multi-user interaction, prototype systems, empirical evaluations of productivity and team dynamics, and ethical guidelines for privacy, equity, and psychological effects.

Key research questions to pursue:

  • How should shared objects and annotations be anchored and synchronized across devices with differing capabilities and network conditions?
  • What interaction metaphors and controls best support turn-taking, conflict resolution, and ownership of virtual artifacts?
  • How does AR presence affect awareness, trust, and decision-making in distributed teams?
  • What accessibility strategies ensure inclusive participation in spatial AR environments?

Relevant references:

  • Benford, S., Crabtree, A., Reeves, S., Sheridan, J. G., & Dix, A. (2006). “Between us: HCI and collaborative systems.” (Discusses socio-technical perspectives on collaboration.)
  • Billinghurst, M., Clark, A., & Lee, G. (2015). “A survey of augmented reality.” Foundations and Trends in Human–Computer Interaction. (Overview of AR interaction techniques.)
  • Dourish, P. (2001). “Where the Action Is: The Foundations of Embodied Interaction.” (Foundational ideas about embodied and situated interaction.)

This topic suits design-led PhD projects combining prototyping, user studies, and theoretical analysis to address a complex, evolving problem space.

Designing for Older Adults’ Autonomy with Smart Home Systems

Explanation: This research topic examines how interaction design for smart home technologies can support older adults’ independence, dignity, and decision-making while balancing safety and privacy. It focuses on user-centered methods to identify older adults’ values, daily routines, cognitive and sensory needs, and social contexts; on designing adaptive, transparent interfaces and control mechanisms; and on ethical frameworks and evaluation metrics that measure autonomy rather than only risk reduction. Key questions include how systems can offer unobtrusive assistance that preserves choice (e.g., configurable alerts, graceful degradation, consentful data sharing), how to communicate system behavior and limits to users and caregivers, and how to design for diverse aging trajectories and digital literacies.

Relevant wicked problems:

  • Competing values: autonomy versus safety — stakeholders (older adults, family, care providers, insurers) disagree about acceptable risk and intervention.
  • Privacy versus monitoring: continuous sensing improves assistance but undermines privacy and may change behavior.
  • Heterogeneity of users: wide variability in health, cognition, cultural expectations, and technology comfort resists one-size-fits-all solutions.
  • Responsibility and liability: ambiguous accountability when automated systems fail or when caregivers defer decisions to technology.
  • Long-term adaptation: designing systems that remain useful across progressive health changes and evolving social networks.

Methods and contributions:

  • Mixed methods: ethnographic studies, participatory co-design with older adults and caregivers, and longitudinal field deployments.
  • Interaction design outcomes: adaptive UI patterns, consent-first data flows, explainable autonomy modes, fallback and override affordances.
  • Evaluation: autonomy-centered metrics (perceived control, decision sovereignty), quality-of-life assessments, and safety/privacy tradeoff analyses.

References (select):

  • Friedman, B., Kahn, P. H., & Borning, A. (2006). Value sensitive design and information systems. In Human–Computer Interaction and Management Information Systems: Foundations.
  • Czaja, S. J., & Lee, C. C. (2007). The impact of aging on access to technology. Universal Access in the Information Society.
  • Crabtree, A., et al. (2013). Turning the living room into a laboratory: The role of deployment in design research. In Proceedings of the 6th ACM conference on Designing Interactive Systems.

Why these topics—and who else to read

Short explanation for the selection

  • Relevance: Each topic addresses current, high-impact areas where interaction design shapes health, equity, trust, and sustainability—domains with clear societal stakes and funding interest.
  • Research fit: They pair empirical grounding (ethnography, fieldwork, trials) with design practice (prototyping, co-design) so a dissertation can produce both knowledge and artifacts.
  • Wickedness: I prioritized problems that are “wicked” in the sense of multiple stakeholders, value conflicts, shifting constraints, and ambiguous success criteria—conditions that make research practically important and theoretically rich.
  • Feasibility: Topics span levels of technical difficulty and participant access so you can choose a scope that matches resources (e.g., interface studies vs. clinical trials).
  • Ethics & impact: Many choices foreground ethical trade-offs (privacy, persuasion, autonomy), encouraging responsible research design and early IRB engagement.

Suggested related ideas (quick alternatives or expansions)

  • Digital empathy in education platforms (students/teachers).
  • Low-bandwidth voice UIs for disaster response.
  • Transparent recommender interfaces for news and political content.
  • Habit-change designs for workplace well-being (not just general use).
  • AR design for medical training (rather than only distributed work).
  • Climate action design focused on organizational change (companies, institutions).
  • Smart-home negotiation tools that mediate between older adults and remote caregivers.
  • Mental-health interfaces explicitly designed for crisis detection and safe escalation.
  • Persuasive design frameworks tailored for low-trust communities.
  • Privacy negotiation UIs for IoT devices (home hubs, wearables).

Key authors and sources to read next (by topic)

  • Designing for Digital Empathy: Jonathan Grudin; Susanne Bødker (participatory design & human-computer interaction); Don Norman (emotional design). See Bødker, S. (2006). “When Second Wave HCI Meets Third Wave Challenges.”
  • Inclusive Voice UI: Benetech (inclusive tech reports); Meredith Ringel Morris; Seeman et al. on voice UX. Also explore work by Cathy Marshall on accessibility.
  • Trust & Transparency in Recommenders: Nicholas Diakopoulos; Aza Raskin; Eslami et al. (2015) on reasoning about invisible algorithms in news feeds; Kocielnik et al. (2019).
  • Habit Change Without Coercion: B.J. Fogg (Behavior Model, persuasive tech); Lockton, Harrison & Stanton (Design with Intent toolkit).
  • Collaborative AR Workspaces: Hirokazu Kato, Steve Benford, Mark Billinghurst; Paul Dourish on awareness and coordination (e.g., Dourish & Bellotti, 1992).
  • Climate Action & Citizen Engagement: Ezio Manzini (design for social innovation); Per Espen Stoknes (psychology of climate action); Elizabeth Shove (practice theory) for behavior-context perspectives.
  • Older Adults & Smart Homes: Wendy Rogers; Neil Charness; Constantine Stephanidis; work by Georgia Institute of Technology on aging and technology.
  • Multi-Modal Mental Health: John Torous & Matcheri Keshavan (digital psychiatry); Luxton et al. (2016) on mobile health clinical considerations.
  • Ethical Persuasive Interfaces: B.J. Fogg (2003); Susser, Roessler & Nissenbaum (2019) on manipulation and autonomy.
  • Data Privacy Negotiation: Helen Nissenbaum (privacy as contextual integrity); Lorrie Cranor (usable privacy/security); Daniel J. Solove on privacy taxonomy.

If you’d like, I can:

  • Expand one selected topic into a 2–3 page proposal (research questions, methods, contributions, timeline, bibliography).
  • Provide a short annotated bibliography for any one topic above.

Why These Interaction Design Dissertation Topics Were Selected — Short Explanation with Examples

These topics were chosen because they sit at the intersection of technically feasible interaction design research and socially consequential, complex (often “wicked”) problems. Each topic:

  • Addresses real-world stakes: health, privacy, inclusion, environment, work, aging.
  • Balances theory and practice: combines socio-technical theory (e.g., trust, persuasion, accessibility) with hands-on methods (co-design, prototyping, field studies).
  • Embraces wickedness: acknowledges conflicting stakeholder values, measurement challenges, and changing contexts — which makes the research both hard and societally important.
  • Enables multiple research contributions: design patterns, evaluation methods, ethical frameworks, policy implications.

Examples (concrete scenarios showing why each is useful and wicked):

  1. Telehealth Empathy Interfaces
  • Why chosen: remote care is scaling fast; small design changes can affect patient outcomes.
  • Wickedness example: a clinician wants efficient visits while a patient needs time and emotional support — optimizing for one can harm the other. Measuring “empathy” mixes subjective reports, conversational cues, and physiological signals, complicating evaluation.
  2. Voice UI for Low-Literacy, Multilingual Users
  • Why chosen: voice platforms can expand access where text fails.
  • Wickedness example: translating prompts into multiple languages isn’t enough — cultural norms about turn-taking, formality, and privacy differ, so a single voice interaction model may fail across contexts. Success metrics (task completion, user satisfaction, adoption) may conflict.
  3. Trust and Transparency in Recommenders
  • Why chosen: algorithmic systems shape choices in media, health, and shopping; interfaces can surface explanations to users.
  • Wickedness example: a business wants engagement (which may benefit from opaque optimization) while users need understandable, fair explanations. Technical model opacity and varied user mental models make one-size-fits-all explanations ineffective.
  4. Non-Coercive Digital Habit Change
  • Why chosen: platforms encourage engagement; designers can steer behavior toward wellbeing.
  • Wickedness example: persuasive nudges can help or manipulate. Measuring long-term habit change is hard (short-term compliance ≠ sustained autonomy), and platform incentives often oppose reduction of engagement.
  5. Collaborative AR for Distributed Teams
  • Why chosen: AR promises new coordination affordances for remote work.
  • Wickedness example: technical limits (latency, occlusion) collide with social needs (privacy, presence). Different teams have divergent workflows, producing conflicting design requirements.
  6. Climate Action Citizen Apps
  • Why chosen: interaction design can mobilize individual and collective action on climate.
  • Wickedness example: motivating behavioral changes intersects politics and identity; an app that nudges emissions reductions for individuals may be ineffective without systemic change, making impact assessment difficult.
  7. Smart Homes for Older Adults’ Autonomy
  • Why chosen: aging populations need supportive tech that preserves dignity.
  • Wickedness example: sensors that ensure safety can feel surveillant; family members and older adults may disagree on acceptable monitoring, and regulations vary by jurisdiction.
  8. Multi-Modal Mental Health Self-Management
  • Why chosen: combining modalities can make self-help more responsive and accessible.
  • Wickedness example: sensor-based personalization risks privacy and unintended clinical harms (missed crises). Clinical validity and ethical safeguards must be negotiated.
  9. Ethical Persuasive Interfaces for Public Health
  • Why chosen: public health requires population-level behavior change without manipulation.
  • Wickedness example: what counts as “ethical persuasion” differs across cultures and stakeholders; interventions can backfire or entrench mistrust.
  10. Data Privacy Negotiation UIs
  • Why chosen: current consent models fail to support meaningful, granular choices.
  • Wickedness example: legal complexity, platform power asymmetries, and cognitive load make truly informed consent extremely difficult to achieve in practice.

Practical guidance for choosing among them

  • Pick a problem where you can access relevant stakeholders (clinicians, community groups, platform data, etc.).
  • Prefer topics where you can combine qualitative grounding (to map values and practices) with iterative prototypes and measurable outcomes.
  • Anticipate ethical review and design safeguards early for sensitive domains (health, mental health, vulnerable groups).

If you’d like, I can expand any one example into a 2–3 page dissertation proposal (research questions, methods, expected contributions, timeline, and bibliography).

Why these topics were selected — brief explanation and related scholars

Short explanation for the selection: The list prioritizes research topics that are (1) societally relevant, (2) methodologically tractable within a dissertation, and (3) philosophically and practically “wicked” — meaning they involve value tensions, stakeholders with conflicting goals, and open-ended evaluation criteria. Interaction design sits at the intersection of technology, behavior, and institutional incentives; these topics foreground ethical stakes (privacy, autonomy, manipulation), real-world constraints (technical limits, regulatory environments), and opportunities for novel design interventions (new interaction patterns, explainability, co-design). They are chosen because they allow you to produce original design artifacts and deployable studies while engaging with pressing social problems.

Suggested adjacent ideas

  • Digital literacy scaffolds for algorithmic media: Interfaces that teach users how recommendation systems work through micro-interactions.
  • Adaptive consent UIs for long-term research: Designs that let participants revise data-sharing preferences over time.
  • Emotion-aware interfaces for conflict de-escalation: Interaction patterns that surface and moderate affect in mediated conversations.
  • Civic deliberation platforms that resist polarization: Interaction techniques to encourage perspective-taking and reduce echo chambers.
  • Transparent personalization for education tech: Interfaces that explain how content is adapted to learners and let them adjust goals.

Key people and sources to consult

  • On values, privacy, and design: Helen Nissenbaum (Privacy in Context, 2010); Batya Friedman & David Hendry (Value Sensitive Design).
  • On wicked problems and design: Horst Rittel & Melvin Webber (Dilemmas in a General Theory of Planning, 1973); Nigel Cross (Designerly Ways of Knowing).
  • On persuasive and behavior design: B. J. Fogg (Persuasive Technology, 2003/2009); Richard Thaler & Cass Sunstein (Nudge).
  • On explainability, trust, and recommender systems: Motahhare Eslami et al. (“I always assumed that I wasn’t really that close to [her]”: Reasoning about Invisible Algorithms in News Feeds, CHI 2015); Kocielnik et al. (explainable recommender interfaces).
  • On inclusive interaction and voice UIs: Benetech resources on inclusive tech; Seeman et al. (voice UX research).
  • On interaction and health/mental health tech: John Torous et al. (digital mental health research, 2018); Luxton et al. (mobile health ethics).
  • On aging, autonomy, and smart homes: Wendy Rogers; Sara Czaja & Chin Chin Lee (older adults & technology).
  • On human-computer collaboration and AR: Steve Benford, Mark Billinghurst, Paul Dourish.
  • On normative and ethical frameworks: Helen Nissenbaum (privacy), Luciano Floridi (information ethics), Shannon Vallor (technology and virtue).

If you’d like, I can:

  • Expand any single topic into a 2–3 page dissertation proposal (questions, methods, contributions, timeline, bibliography).
  • Provide a curated reading list (10–15 core papers) for one chosen topic.
  • Map potential supervisors and labs active in a chosen area.

Rationale and Expanded Research Questions for Selected Interaction Design Topics (including Wicked Problems)

Short explanation for the selection: These topics were chosen because they sit at the intersection of real-world impact, technical feasibility, and deep socio-technical uncertainty — the hallmark of “wicked problems.” They each address domains where interaction design decisions materially affect human wellbeing, social equity, or public goods (health, privacy, climate). They also lend themselves to mixed-methods approaches (qualitative grounding + iterative prototyping + measurable evaluation), which is appropriate for dissertations that must produce both theoretical contributions and validated artifacts. Finally, the topics span a range of interaction modalities (voice, AR, recommender UIs, smart homes, sensor-driven interfaces) and populations (older adults, multilingual low-literacy users, clinicians, citizens), giving options that fit different researcher skills, access to stakeholders, and ethical constraints.

Expanded research questions (by topic)

  1. Designing for Digital Empathy in Telehealth Interfaces
  • How do specific interaction cues (tone, avatars, micro-interactions) affect patients’ perceived empathy during video consultations?
  • Can physiological signals (heart rate variability, skin conductance) provide reliable indicators of empathetic connection in remote sessions?
  • How do clinicians’ workflows change when empathy-supportive features are introduced, and what trade-offs arise?
  • What design features mitigate misunderstandings in cross-cultural telehealth encounters?
  2. Inclusive Voice UI for Multilingual, Low-Literacy Populations
  • What conversational strategies (turn-taking, confirmation prompts) improve task success among low-literacy users?
  • How does code-switching and dialectal variation affect ASR accuracy and user satisfaction?
  • What approaches to privacy-preserving personalization work for users who share devices or lack digital identities?
  • Which evaluation metrics (task completion, error recovery, felt autonomy) best capture inclusivity in voice UIs?
  3. Designing for Trust and Transparency in Recommender Interfaces
  • What kinds of explanations (feature-based, example-based, process-based) most effectively increase user understanding without cognitive overload?
  • How do transparency features change users’ acceptance of recommendations across different domains (news, health, e-commerce)?
  • Can interactive controls (slider for novelty vs. relevance) align recommendations with users’ evolving goals?
  • How do business incentives (click-through optimization) constrain transparent design, and what interface patterns mitigate these tensions?
  4. Interaction Design for Digital Habit Change Without Coercion
  • Which micro-interaction patterns support sustained behavior change while preserving perceived autonomy?
  • How does framing (self-improvement vs. prevention) influence long-term adherence to healthier digital habits?
  • What role do social features (support groups, social contracts) play in voluntary habit change, and when do they backfire?
  • How can designers measure “nudging” vs. “coercion” empirically in digital contexts?
  5. Designing Collaborative AR Workspaces for Distributed Teams
  • What spatial metaphors and interaction primitives best support shared attention and awareness in persistent AR?
  • How do ephemeral vs. persistent artifacts affect coordination and knowledge continuity across time zones?
  • What privacy and presence controls do users need to feel comfortable working in AR with colleagues?
  • How do AR collaboration patterns vary across task types (creative design vs. coordination vs. training)?
  6. Interaction Design for Climate Action Apps (Citizen Engagement)
  • What interaction features increase sustained civic participation in climate initiatives (pledges, local action groups)?
  • How do different framings of climate information (local impact, personal carbon footprint, collective targets) affect motivation?
  • Can gamification elements produce measurable reductions in emissions, or do they produce shallow engagement?
  • How do platform-level incentives and political polarization shape uptake and trust in climate apps?
  7. Designing for Older Adults’ Autonomy with Smart Home Systems
  • What interaction metaphors (conversational, dashboard, ambient) best support older adults’ sense of control?
  • How can systems negotiate safety interventions with elderly users and their caregivers in ways that respect autonomy?
  • Which monitoring modalities (vision, motion, wearables) are most acceptable when balanced against privacy concerns?
  • How do regulatory constraints (data retention, consent) shape feasible interaction designs for aging-in-place?
  8. Multi-Modal Interaction for Mental Health Self-Management
  • How can multi-modal cues (text, voice, sensor-derived context) be fused to accurately detect states requiring intervention?
  • What personalization strategies reduce false positives and maintain user trust over time?
  • How to design safe escalation paths (crisis support) that respect privacy while minimizing harm?
  • Which interaction affordances promote sustained engagement with self-management tools without dependency?
  9. Designing Ethical Persuasive Interfaces for Public Health Campaigns
  • How can designers operationalize “ethical persuasion” — what principles and interaction patterns constitute acceptable influence?
  • Which persuasive techniques are effective across cultures and demographic groups without being manipulative?
  • How do transparency and consent mechanisms affect campaign efficacy and public trust?
  • What safeguards prevent backfire effects or unintended behavioral harms?
  10. Interaction Design for Data Privacy Negotiation
  • What UI metaphors make abstract data practices concrete and negotiable for everyday users?
  • How granular should negotiation controls be to balance cognitive load and meaningful choice?
  • Can progressive disclosure and contextual notices improve comprehension and consent quality?
  • How do power asymmetries (platform defaults, needed services) limit users’ practical privacy choices, and what interface patterns can mitigate this?

If you’d like, I can:

  • Expand any one topic into a 2–3 page dissertation proposal (research questions, methods, contributions, timeline, bibliography).
  • Prioritize topics by feasibility given different constraints (e.g., limited access to participants, limited funding).
  • Generate sample ethics protocols or measurement instruments for sensitive domains.

References (selected)

  • Bødker, S. (2006). When second-wave HCI meets third-wave challenges. Proceedings of the 4th Nordic conference on Human-computer interaction.
  • Norman, D. A. (2013). The Design of Everyday Things: Revised and Expanded Edition. Basic Books.
  • Eslami, M., et al. (2015). “I always assumed that I wasn’t really that close to [her]”: Reasoning about Invisible Algorithms in News Feeds. CHI 2015.
  • Fogg, B. J. (2003). Persuasive Technology: Using Computers to Change What We Think and Do. Morgan Kaufmann.
  • Nissenbaum, H. (2010). Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford University Press.
  • Torous, J., et al. (2018). Clinical review of smartphone applications for mental health. World Psychiatry.

Which topic do you want expanded into a full proposal?

Why “Designing for Older Adults’ Autonomy with Smart Home Systems” was selected

This topic was chosen because it sits at the intersection of urgent social need, rich design challenges, and ethically complex “wicked” problems—making it both academically fertile and socially impactful.

Key reasons:

  • High societal relevance: Aging populations and the preference for aging-in-place create strong demand for technologies that support independence while reducing care burdens on families and health systems (Czaja & Lee 2007).
  • Rich interaction-design complexity: Supporting autonomy requires subtle, contextual interaction patterns (configurable assistance, graceful degradation, clear override controls) rather than simple on/off features, which invites novel design contributions.
  • Wicked problem characteristics: Conflicting stakeholder values (safety vs. autonomy), privacy-monitoring trade-offs, heterogeneous user needs, and unclear liability chains make this domain resistant to one-size solutions and call for iterative, participatory approaches.
  • Methodological opportunities: The topic naturally supports mixed methods—ethnography, co-design, longitudinal deployments, and quantitative measures—so a dissertation can combine theory, artifacts, and rigorous evaluation.
  • Ethical and theoretical contribution: Focusing on autonomy (rather than purely safety or efficiency) foregrounds value-sensitive design, explainability, consentful data practices, and metrics that reflect dignity and decision sovereignty.
  • Feasibility and impact: Researchers can access diverse stakeholders (older adults, caregivers, clinicians), prototype within real homes, and produce deliverables with clear translational potential (design patterns, evaluation frameworks, policy recommendations).

Selected references:

  • Czaja, S. J., & Lee, C. C. (2007). The impact of aging on access to technology. Universal Access in the Information Society.
  • Friedman, B., Kahn, P. H., & Borning, A. (2006). Value sensitive design and information systems. In Human–Computer Interaction and Management Information Systems: Foundations.
  • Crabtree, A., et al. (2013). Turning the living room into a laboratory: The role of deployment in design research. DIS Proceedings.

If you want, I can expand this into a 1–2 page problem statement with research questions, methods, and possible evaluation metrics.

Interaction Patterns for Autonomy in Smart Home Systems — Configurable Assistance, Graceful Degradation, and Clear Override Controls

Short explanation for the selection: These interaction patterns—configurable assistance, graceful degradation, and clear override controls—directly target the central tension in designing smart homes for older adults: supporting safety and well‑being without undermining autonomy and dignity. Configurable assistance lets users choose the level, timing, and types of help (ranging from subtle reminders to active interventions), honoring personal preferences and changing needs. Graceful degradation ensures that when sensors fail, connectivity drops, or a user’s ability changes, the system steps down service in predictable, non‑punitive ways (e.g., switching from automated intervention to gentle prompts or caregiver alerts) rather than abruptly removing support. Clear override controls give older adults straightforward, discoverable means to accept, delay, or refuse system actions—preserving decision sovereignty and preventing covert automation from displacing human agency. Together, these patterns create an interaction ethos that foregrounds consent, transparency, and adaptability, and they provide measurable design levers for evaluating perceived control, trust, and quality of life among diverse aging populations.

Why these patterns matter (brief):

  • They operationalize autonomy-centered values into concrete UI/behavioral mechanisms suitable for deployment and evaluation.
  • They address wicked aspects: they mediate competing stakeholder values (users vs. caregivers vs. insurers), cope with heterogeneity by enabling personalization, and reduce harm from technological failure through predictable fallback behavior.
  • They are amenable to mixed-methods research (co-design to identify preferences; prototypes to test interactions; longitudinal measures of autonomy, safety, and satisfaction).

Relevant design considerations (concise):

  • Defaults and configurability: default settings should be conservative and easy to change; progressive disclosure helps avoid overwhelming users.
  • Feedback and legibility: the system must explain why it acted and how to change settings in plain language and multimodal cues.
  • Social mediation: support for delegating temporary control to trusted caregivers with clear expiration and audit trails.
  • Ethical safeguards: logging and consent records, minimal necessary data collection, and mechanisms to contest or reverse automated actions.
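As a minimal sketch of how configurable assistance, graceful degradation, and override controls might fit together in a single controller, the following is illustrative only; the levels and behaviors are invented placeholders, not a validated design.

```python
from enum import IntEnum

class Assistance(IntEnum):
    """Ordered assistance levels: degradation steps down one level at a
    time rather than silently removing all support."""
    OFF = 0
    REMIND = 1        # gentle prompts only
    ALERT_CARER = 2   # notify a trusted caregiver
    INTERVENE = 3     # automated action, e.g. shutting off a stove

class AssistanceController:
    def __init__(self, configured):
        self.configured = configured  # level chosen by the resident
        self.active = configured

    def degrade(self):
        """On sensor or connectivity failure, step down predictably."""
        if self.active > Assistance.OFF:
            self.active = Assistance(self.active - 1)

    def override(self):
        """Resident refuses the current action: fall back to reminders
        only; a real system would also log the refusal consentfully."""
        self.active = min(self.active, Assistance.REMIND)
        return self.active

ctl = AssistanceController(Assistance.INTERVENE)
ctl.degrade()                # e.g. stove sensor goes offline
print(ctl.active.name)       # ALERT_CARER
print(ctl.override().name)   # REMIND
```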

Key references:

  • Friedman, Kahn & Borning, Value Sensitive Design (2006).
  • Czaja & Lee, The impact of aging on access to technology (2007).
  • Crabtree et al., Deployment in design research (2013).

If you want, I can expand this into a 2–3 page proposal with concrete research questions, methods, prototypes, evaluation metrics, and a timeline.

Multi-Modal Interaction for Mental Health Self-Management

Explanation: This research topic investigates how multiple interaction modes (speech, text, touch, gesture, biofeedback, and ambient displays) can be integrated into digital tools to support individuals’ ongoing self-management of mental health conditions (e.g., anxiety, depression, bipolar disorder). The core aim is to design, prototype, and evaluate interaction ecosystems that adapt to users’ changing contexts, cognitive states, and preferences to increase accessibility, engagement, personalization, and therapeutic effectiveness.
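As a concrete (and deliberately simplified) illustration, here is a late-fusion sketch that combines per-modality risk estimates and degrades gracefully when a modality is missing; the modalities, weights, and threshold are invented and would require clinical validation.

```python
def fuse(scores, weights):
    """Weighted average over whichever modalities are currently available,
    so a missing sensor degrades the estimate instead of breaking it."""
    present = {m: s for m, s in scores.items() if s is not None}
    total = sum(weights[m] for m in present)
    return sum(weights[m] * s for m, s in present.items()) / total

# Hypothetical per-modality risk estimates in [0, 1]; the voice channel
# is unavailable in this session.
scores = {"text_sentiment": 0.7, "voice_prosody": None, "sleep_sensor": 0.4}
weights = {"text_sentiment": 0.5, "voice_prosody": 0.3, "sleep_sensor": 0.2}

risk = fuse(scores, weights)
print(f"fused risk = {risk:.2f}")
if risk > 0.6:  # invented threshold; would require clinical validation
    print("Offer an optional check-in, with a clear path to human support.")
```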

Why this selection matters:

  • Complexity and variability: Mental health self-management is a “wicked problem” — needs are highly individual, change over time, and intersect with stigma, privacy, and access to care. Multi-modal systems can flexibly meet diverse needs where single-modality solutions fail.
  • Situated support: Combining passive sensing (physiological signals, activity, speech patterns) with active interaction (conversational agents, expressive touch or gesture controls, journaling interfaces) enables timely, context-aware interventions and recommendations.
  • Inclusivity and accessibility: Offering alternatives (voice for low literacy or visual impairment; text and haptics for privacy-sensitive contexts) lowers barriers to use across populations and contexts.
  • Engagement and adherence: Varied modalities can reduce fatigue and monotony, scaffold behavior change, and maintain long-term engagement—key for chronic self-management.
  • Ethical and design challenges: The topic foregrounds privacy, consent, data security, interpretability of affective sensing, potential over-reliance on automation, and cultural sensitivity—each a research contribution area.

Methods and contributions (brief):

  • Mixed-methods design: ethnographic fieldwork, co-design with clinicians and lived-experience participants, iterative prototyping.
  • Technical development: multimodal fusion algorithms, real-time adaptive interfaces, low-latency biofeedback loops.
  • Evaluation: longitudinal user studies measuring usability, clinical symptom trajectories, engagement, and safety.
  • Theoretical output: frameworks for designing responsible multi-modal mental health interactions, guidelines for managing trade-offs between personalization and privacy.

Relevant references (starting points):

  • Diefenbach, S., & Hassenzahl, M. (2017). “Designing for the good life” (on wellbeing-focused interaction design).
  • Mohr, D. C., et al. (2014). “The Behavioral Intervention Technology Model” (J Med Internet Res).
  • Picard, R. W. (1997). “Affective Computing.”
  • Wright, P., et al. (2014). “Designing for and with people with dementia” (for ethical co-design methods).

This topic yields both practical prototypes and theoretical frameworks addressing a pressing, wicked societal problem through interaction design.

Dissertation Topics in Interaction & Digital Design — Including Wicked Problems

Below are 12 concise dissertation ideas for interaction and digital design, each with a short explanation, examples, and notes on wicked-problem aspects where relevant.

  1. Designing for Digital Well-being in Always-On Workplaces
  • Explanation: Investigate interaction designs that reduce cognitive overload, notification stress, and burnout in remote/hybrid work.
  • Examples: adaptive notification scheduling; ambient displays signaling focus windows; calendar-integrated interruption budgets.
  • Wickedness: Balancing productivity, autonomy, employer expectations, and diverse personal needs creates value conflicts and no single solution.
  2. Inclusive Interaction Patterns for Neurodiverse Users
  • Explanation: Study design patterns, customization, and adaptive interfaces that support autistic, ADHD, and sensory-sensitive users.
  • Examples: configurable sensory filters, task scaffolding modes, predictable navigation templates.
  • Wickedness: Neurodiversity spans many profiles; ethical and practical trade-offs between standardization and personalization.
  3. Designing Trust and Transparency for AI-Driven Interfaces
  • Explanation: Explore interaction techniques that make machine-learning decisions interpretable, actionable, and fair to end-users.
  • Examples: layered explanations, contrastive explanations, uncertainty visualizations, user-controllable model settings.
  • Wickedness: Trade-offs among transparency, system performance, commercial IP, and user comprehension create contested design constraints.
  4. Emotion-Aware Interfaces: Ethics, Detection, and Interaction Futures
  • Explanation: Research methods for detecting affect (voice, face, behavior) and designing ethical responses that respect privacy and dignity.
  • Examples: affective feedback for tutoring systems; mood-adaptive music interfaces; opt-in emotional state logging with clear consent flows.
  • Wickedness: Privacy, surveillance risks, and cultural differences in emotional expression make design ethically fraught and context-dependent.
  5. Designing for Digital Longevity and Repairability in Consumer Devices
  • Explanation: Focus on interaction design that encourages repair, maintenance, and sustainable user behavior.
  • Examples: intuitive repair guides embedded in UIs, diagnostic assistants, modular firmware that supports third-party parts.
  • Wickedness: Conflicts among manufacturers’ business models, user capabilities, regulation, and environmental goals produce multi-stakeholder disputes.
  6. Multimodal Interaction for Accessibility in Public Spaces
  • Explanation: Create interaction systems combining voice, haptics, gestures, and displays to serve diverse accessibility needs in transit stations, kiosks, etc.
  • Examples: multimodal ticket machines that switch modes based on user preference; haptic wayfinding for the visually impaired.
  • Wickedness: Public deployments must reconcile universal design ideals with physical constraints, cost, and differing legal standards.
  7. Designing Persuasive Interfaces Without Manipulation
  • Explanation: Investigate techniques for behavior change (health, energy use) that preserve user autonomy and avoid dark patterns.
  • Examples: commitment devices with transparent trade-offs, nudge designs with clear opt-outs, reflective prompts.
  • Wickedness: Distinguishing persuasion from manipulation involves subjective ethics and varied stakeholder incentives.
  8. Interaction Design for Mixed Reality Collaboration at Scale
  • Explanation: Study interaction metaphors and tools enabling large-group collaboration in AR/VR that preserve social cues and coordination.
  • Examples: spatial audio cueing, scalable avatar representations, focus+context tools for shared virtual whiteboards.
  • Wickedness: Technical limits, privacy, social norms, and accessibility intersect in complex ways as scale increases.
  9. Data Minimalism: Interfaces That Reduce Unnecessary Data Capture
  • Explanation: Explore design strategies that let users accomplish goals while minimizing personal data collection and retention.
  • Examples: ephemeral session modes, on-device processing UIs, data-sparing presets.
  • Wickedness: Business analytics, personalization benefits, and regulatory requirements create competing pressures.
  10. Designing for Algorithmic Fairness in Personalized Content
  • Explanation: Research interaction methods that allow users to perceive and correct personalization biases in newsfeeds, recommender systems.
  • Examples: bias-aware settings, provenance badges for content, user-adjustable weighting sliders for recommendation criteria (a re-ranking sketch appears after this list).
  • Wickedness: Fairness definitions differ across contexts and groups; transparency may enable gaming or reduce commercial utility.
  11. Crisis-Responsive Interaction Design for Urban Information Systems
  • Explanation: Create adaptable interaction systems for delivering timely, reliable information during disasters and public emergencies.
  • Examples: progressive disclosure UIs for overloaded networks, priority channels, multi-channel confirmation mechanisms.
  • Wickedness: High stakes, uncertain information, diverse audiences, and institutional coordination make solutions unpredictable and contested.
  12. Designing for Long-Term Digital Relationships (Products as Companions)
  • Explanation: Study how interaction design shapes extended relationships between users and devices/agents over years—attachment, trust trajectories, obsolescence.
  • Examples: lifecycle onboarding that evolves with skill, retirement modes that transfer data to successors.
  • Wickedness: Emotional bonds, commercial lifecycles, data legacy, and evolving user identities create morally and practically entangled design challenges.
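
Two of the mechanisms above are concrete enough to sketch. First, topic 1's interruption budget: a minimal Python sketch assuming a per-hour cap on non-urgent notifications (the class, cap, and urgency flag are hypothetical; a real system would also consult the calendar and the user's stated focus windows):

```python
from dataclasses import dataclass, field

@dataclass
class InterruptionBudget:
    """Caps non-urgent notifications per hour; urgent ones always pass."""
    per_hour: int = 3
    delivered: dict[int, int] = field(default_factory=dict)  # hour -> count so far

    def should_deliver(self, hour: int, urgent: bool) -> bool:
        if urgent:
            return True  # safety-critical messages bypass the budget
        if self.delivered.get(hour, 0) >= self.per_hour:
            return False  # defer to the next hourly window, don't silently drop
        self.delivered[hour] = self.delivered.get(hour, 0) + 1
        return True

budget = InterruptionBudget(per_hour=2)
print([budget.should_deliver(hour=9, urgent=False) for _ in range(3)])  # [True, True, False]
print(budget.should_deliver(hour=9, urgent=True))                       # True
```

Second, topic 10's weighting sliders reduce, at their simplest, to user-controlled re-ranking. A toy version, assuming each item carries normalized per-criterion scores (the schema is illustrative, not any real recommender's API):

```python
def rerank(items: list[dict], weights: dict[str, float]) -> list[dict]:
    """Re-order recommendations by the user's slider weight for each criterion."""
    def score(item: dict) -> float:
        return sum(weights.get(name, 0.0) * value
                   for name, value in item["criteria"].items())
    return sorted(items, key=score, reverse=True)

articles = [
    {"id": "a", "criteria": {"relevance": 0.9, "recency": 0.2, "diversity": 0.1}},
    {"id": "b", "criteria": {"relevance": 0.6, "recency": 0.9, "diversity": 0.8}},
]
# The user drags the diversity slider up and relevance down:
ranked = rerank(articles, {"relevance": 0.3, "recency": 0.3, "diversity": 0.8})
print([a["id"] for a in ranked])  # ['b', 'a']
```

Exposing the weights also creates a natural probe for the wickedness noted in topic 10: participants can be observed gaming or distrusting the sliders, which is itself useful data.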

Suggested references (starting points)

  • Rittel, H. W. J., & Webber, M. M. (1973). “Dilemmas in a General Theory of Planning.” Policy Sciences — wicked problems.
  • Norman, D. A. (2013). The Design of Everyday Things — interaction design principles.
  • Eubanks, V. (2018). Automating Inequality — on systemic impacts of algorithms.
  • Shneiderman, B. (2022). Human-Centered AI — principles for trustworthy AI interfaces.
  • Bødker, S. (2006). “When second wave HCI meets third wave challenges” — appropriation, long-term use, and evolving relationships.

If you want, I can: 1) expand any idea into a full dissertation outline (research questions, methods, data sources, ethical concerns, timeline), or 2) prioritize topics by feasibility and originality for a specific program or region.

A Short Argument Against "Dissertation Topics in Interaction & Digital Design — Including Wicked Problems"

While the collection of dissertation topics framed around interaction design and “wicked problems” is timely and ambitious, it has several weaknesses that limit its usefulness as a starting point for rigorous academic research.

  1. Overbreadth and Vagueness
  • Many topics are phrased at a high level (e.g., “Trust and Transparency,” “Emotion-Aware Interfaces”) without clear, tractable research boundaries. For a dissertation, vague scopes invite scope creep and make it hard to define specific hypotheses or evaluation criteria. See Robson (2011) on research design and the need for precise questions.
  2. Insufficient Theoretical Grounding
  • Several entries lean heavily on applied concerns and engineering solutions but lack explicit theoretical frameworks (e.g., cognitive load theory, theories of trust, or critical design approaches). A dissertation requires a theoretical scaffold to interpret findings and contribute to knowledge, not just usable prototypes. (Refer to Dourish, 2004; Norman, 2013.)
  3. Mixed Methodological Expectations Without Prioritization
  • The topics often list wide method sets (ethnography, RCTs, field deployments) as if all are equally feasible. Dissertations must balance depth and feasibility: attempting ethnography plus large-scale randomized trials within one project is usually unrealistic given time and resource limits. (Creswell & Creswell, 2018.)
  4. Underdeveloped Ethical and Practical Research Plans
  • Several wicked-problem notes acknowledge ethical complexity (privacy, surveillance, vulnerable populations) but do not propose concrete risk mitigation strategies (recruitment constraints, informed consent, data governance). Ethical review and practical access to stakeholders are central concerns that deserve more than cursory mention. (See Torous et al., 2018 on ethics in digital mental health research.)
  5. Risk of Reproducing Technological Solutionism
  • The emphasis on designing interfaces and prototypes risks prioritizing technical fixes over systemic analysis of the socio-technical systems that produce the “wickedness” (business incentives, regulation, cultural norms). Without analyzing these structures, solutions may be superficial or unsustainable. (Morozov, 2013; Nissenbaum, 2010.)
  6. Ambiguous Contribution Claims
  • Many proposals imply contributions such as “improved wellbeing” or “increased trust” but do not clarify what counts as novel theoretical, methodological, or practical contribution relative to existing HCI literature. Dissertations must articulate measurable, original contributions.

In short, the topic list is a useful ideation resource but not yet a set of dissertation-ready projects. To be suitable for a dissertation, each topic needs narrowing, explicit theoretical framing, a feasible and prioritized methods plan, a clear ethical strategy, and an articulation of original scholarly contribution. References: Dourish (2004); Creswell & Creswell (2018); Norman (2013); Nissenbaum (2010); Torous et al. (2018); Morozov (2013).

Why These Interaction Design Topics — Brief Rationale and Further Reading Suggestions

Short explanation for the selection

These dissertation topics were chosen because they foreground current socio-technical tensions where design decisions produce high-stakes, ambiguous outcomes — i.e., wicked problems. Each topic ties practical interaction concerns (usability, accessibility, trust, engagement) to larger systemic issues (business incentives, regulation, ethics, cultural difference). That combination makes them well suited for doctoral research: they require empirical grounding, iterative prototyping, and conceptual analysis, and offer potential for original contributions across design, HCI, and policy.

Suggested related ideas (extensions and close variants)

  • Digital Empathy in Telehealth → Compare empathy affordances across synchronous (video) and asynchronous (message-based) care; study training interfaces for clinicians to present empathy signals.
  • Inclusive Voice UI → Focus on code-switching and dialect recognition; explore low-power on-device ASR for privacy-preserving deployment.
  • Trust & Transparency in Recommenders → Investigate user-driven model steering (users refine model behavior via interaction) or social explanations (peer-based rationale).
  • Habit Change without Coercion → Design “honest nudges” that expose trade-offs; explore interventions for rebound effects once nudges are removed.
  • AR Collaborative Workspaces → Study micro-interactions that signal interruption intent and consent in shared AR spaces.
  • Climate Action Apps → Design collective goal-setting interfaces that translate individual actions into visible community impact metrics.
  • Smart Homes for Older Adults → Prototype consent dashboards that mediate family/caregiver access and user autonomy over time (a minimal grant-schema sketch follows this list).
  • Multimodal Mental Health Tools → Compare passive sensor-based inference vs. user-reported state as the basis for adaptive responses.
  • Ethical Persuasive Interfaces → Create taxonomies of persuasive patterns classified by autonomy impact; propose design heuristics to avoid manipulation.
  • Data Privacy Negotiation → Explore “privacy affordance” patterns: UI metaphors that make consequences of sharing tangible.
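
The consent-dashboard idea above invites a small data-model sketch: a user-revocable grant recording what a caregiver may see, at what granularity, and until when. All names and levels here are hypothetical assumptions for illustration, not a real smart-home API:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AccessGrant:
    """A user-revocable grant mediating caregiver access to one data stream."""
    grantee: str              # e.g., "daughter", "home_care_agency" (hypothetical)
    data_stream: str          # e.g., "door_sensor", "fall_alerts", "video"
    granularity: str          # "raw" | "daily_summary" | "alerts_only"
    expires: datetime | None  # None means: until explicitly revoked

def visible_streams(grants: list[AccessGrant], grantee: str, now: datetime) -> dict[str, str]:
    """What this grantee may currently see, and at what granularity."""
    return {
        g.data_stream: g.granularity
        for g in grants
        if g.grantee == grantee and (g.expires is None or now < g.expires)
    }

grants = [
    AccessGrant("daughter", "fall_alerts", "alerts_only", None),
    AccessGrant("daughter", "video", "raw", datetime(2024, 1, 31)),  # trial period
]
print(visible_streams(grants, "daughter", now=datetime(2024, 2, 1)))
# {'fall_alerts': 'alerts_only'}  (the video trial has lapsed)
```

Time-boxed grants make the family-versus-user tension inspectable in the interface: access lapses unless the older adult actively renews it, keeping the default on the side of autonomy.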

Key authors and sources to consult (by topic)

  • Wicked problems & planning
    • Rittel, H. W. J., & Webber, M. M. (1973). “Dilemmas in a General Theory of Planning.” Policy Sciences.
  • Interaction design fundamentals
    • Norman, D. A. (2013). The Design of Everyday Things.
    • Bødker, S. (2006). “When second wave HCI meets third wave challenges” (on appropriation and long-term design).
  • Trust, transparency, and algorithmic systems
    • Eslami, M., et al. (2017). “Be careful; things can be worse than they appear”: understanding biased algorithms and users’ behavior around them in rating platforms (ICWSM). (See also Eslami et al., 2015, on invisible algorithms in the Facebook News Feed.)
    • Kocielnik, R., et al. (2019). “Will You Accept an Imperfect AI?” (CHI; on explanations and setting end-user expectations in human-AI interaction).
    • Selbst, A. D., & Barocas, S. (2018). “The Intuitive Appeal of Explainable Machines” (Fordham Law Review).
  • Persuasion, ethics, and behavior change
    • Fogg, B. J. (2003). Persuasive Technology.
    • Susser, D., Roessler, B., & Nissenbaum, H. (2019). “Technology, Autonomy, and Manipulation.”
  • Accessibility, inclusive design, and neurodiversity
    • Shinohara, K., & Wobbrock, J. O. — work on assistive technology use and social accessibility.
    • National/discipline-specific guidelines (e.g., WCAG) and current HCI papers on neurodiversity.
  • Affective computing and mental health
    • Picard, R. W. (1997). Affective Computing.
    • Torous, J., et al. (2018). Mobile mental health — apps and evidence reviews.
  • Privacy, data minimalism, and negotiation
    • Nissenbaum, H. (2010). Privacy in Context.
    • Cranor, L. F. (2008). Research on usable privacy and consent dialogs.
  • Sustainable design and repairability
    • Manzini, E. (2015). Design, When Everybody Designs (design for social innovation and sustainability).
    • Articles on right-to-repair and user-facing repair UIs.

How I can help next

  • Expand any single topic into a 2–3 page dissertation outline (research questions, methods, ethical issues, timeline, bibliography).
  • Prioritize topics by originality, feasibility, or alignment with a specific supervisor/program.
  • Provide a reading list tailored to one chosen topic with summaries of key papers.

In Defense of Dissertation Topics in Interaction & Digital Design — Including Wicked Problems

Interaction and digital design sit at the intersection of technology, human behavior, and social institutions. Arguing for dissertations that explicitly address wicked problems in this field rests on three linked commitments:

  1. Relevance to real-world complexity
  • Wicked problems—ill-defined goals, conflicting stakeholder values, shifting constraints—are precisely the conditions under which interactive systems are deployed. Research that engages these complexities produces work that matters beyond lab settings, informing decisions in healthcare, public infrastructure, workplace systems, and civic technologies. (Rittel & Webber, 1973; Dourish, 2004).
  2. Methodological richness and rigor
  • Tackling wicked problems requires mixed methods: qualitative grounding to reveal stakeholder values and context, iterative prototyping to test interactions in situ, and quantitative evaluation to assess outcomes. This combination produces robust, transferable insights while maintaining critical sensitivity to ethics and power. Such complexity pushes dissertation-level work to contribute both theory and craft. (Bødker, 2006; Sengers et al., 2005).
  3. Ethical and societal responsibility
  • Interaction designers shape the conditions of everyday life—attention, privacy, trust, autonomy. Framing dissertations around wicked problems foregrounds normative questions (What should systems do? For whom? At what cost?) and demands ethical reflection alongside technical innovation. This orientation cultivates researchers who can hold design accountable to public values. (Nissenbaum, 2010; Friedman et al., 2013).

Conclusion

  • Dissertation topics grounded in interaction design and wicked problems combine practical impact, methodological innovation, and ethical responsibility. They prepare researchers to produce actionable design knowledge and to intervene thoughtfully in the sociotechnical systems that structure modern life.

Selected references

  • Rittel, H. W. J., & Webber, M. M. (1973). Dilemmas in a General Theory of Planning.
  • Dourish, P. (2004). Where the Action Is: The Foundations of Embodied Interaction.
  • Bødker, S. (2006). When second wave HCI meets third wave challenges.
  • Nissenbaum, H. (2010). Privacy in Context: Technology, Policy, and the Integrity of Social Life.
  • Friedman, B., Kahn, P. H., & Borning, A. (2013). Value Sensitive Design and information systems.
  • Sengers, P., Boehner, K., David, S., & Kaye, J. (2005). Reflective design.