
This is part two of The Inviqa Guide: Experience Design, in which we explore how Experience Design has evolved, where it’s at risk of losing sight of the human, and why it’s still crucial in helping organisations design experiences that genuinely work for people in increasingly complex digital environments.
In chapter one, published on The Drum, we explored how Experience Design (XD) has repeatedly evolved in response to technological change and why designing for humans remains essential.
In this chapter, we get more practical.
Here, we explore the most common pitfalls organisations fall into when building digital experiences, and how to avoid them.
We look at where good intentions often break down and why data, patterns and AI tools are powerful only when grounded in real human insight.
As we touched upon in chapter one, there’s sometimes the belief that data tells the whole story, that established design patterns mean we’ve got UI cracked, and that AI tools are now advanced enough that we don’t need designers or to conduct user research with real people.
There’s certainly some truth to these claims: data is vital for making informed decisions, established design patterns provide a great foundation to build from, and AI can provide a springboard for further thought and creative exploration.
However, the danger lies in relying on these in isolation. Good Experience Design relies on understanding context, having conversations and a willingness to challenge what we think we know.
With that in mind, here we unpack the most common pitfalls that derail building experiences that work and explain why real human input still matters.
Pitfall 1: An over-reliance on metrics and assumptions
Data can be an incredibly powerful tool, shining a light on where things might be going wrong. It can show you what is happening and where, helping you understand where to focus your attention when trying to fix an issue.
The pitfall comes when you don’t dig a little further to understand the why – and this isn’t something you can get from your data.
For example, you might assume someone abandons their cart at the last minute because they’re simply not ready to purchase, or are waiting for an abandoned-cart discount code. In fact, the reason could be sticker shock at the cost of shipping, or an inability to locate your returns policy – design issues that data alone won’t reveal.
How to avoid this pitfall
Pair analytics with qualitative research. Let the data guide where to look but use user interviews or testing to uncover what’s really going on
Turn guesses into testable questions, ensuring they align with actual user mental models
Bring user researchers and analysts together early to share insight rather than working in isolation
This helps prevent well intentioned changes that fix nothing, or worse, create new problems.
Pitfall 2: Failing to adapt to changing user behaviours and expectations
While there are certainly digital experiences and UI patterns that simply ‘work’, that doesn’t mean there’s never an impetus to change. Yes, we humans like the comfort of the familiar, but we also love novelty and are always striving for better. In other words, people’s expectations and mental models evolve – what felt intuitive a few years ago might feel outdated today. Many of the psychological tactics used on ecommerce websites (such as countdown timers) were based on psychological studies conducted in the 1990s, and people have become wise to such tricks.
Change is constant, meaning expectations and behaviours change too. That user journey or design pattern that ‘worked’ five years ago may no longer work as effectively today. Global events (such as a pandemic) can permanently alter local behaviours, and new apps and consumer tech can fundamentally alter expectations of how a website or app functions. (Just compare an old-style “AI” chatbot with a new LLM-based one.) Failing to adapt could jeopardise growth and continued relevance.
How to avoid this pitfall
Treat design patterns as starting points, not fixed rules. Even well-established usability heuristics should be revisited over time, so use best practices as a guide but stay flexible and ready to adjust
Revisit established journeys regularly to see if they still align with user behaviour. What worked in the past might need tweaking now. Frequent check-ins help catch shifts in user preferences
Conduct small, frequent research cycles to avoid relying on outdated assumptions
Continuous adaptation helps ensure the experience matches how people behave today, not how they behaved five or ten years ago.
Pitfall 3: The temptation of synthetic users and AI-driven shortcuts
AI promises to revolutionise much of the working world, and user research and testing aren’t immune. Relying solely on AI, however, means you may be relying on outdated data based on how people reacted or thought in the past, rather than a snapshot of today’s reality or how someone may feel about the future.
While AI tools can create speculative personas to help empathise with different perspectives, they shouldn’t be mistaken for real people. They’re best used as provisional inputs that prompt thinking, rather than evidence to act on in isolation.
How to avoid this pitfall
Use AI to accelerate early thinking, not to replace human input. Synthetic users can be great at creating early hypotheses, ideating prompts for discussion, or stress testing assumptions that will be validated through real user research
Always validate AI-generated insights through real user research
Understand that AI can’t read body language, interpret hesitation or uncover sentiment. In live user tests, we will often spot a pause, a frown, or confusion that transcripts alone would miss; subtle cues are gold for XD professionals
AI can be useful and can help formulate early ideas or prompt novel thinking, but it shouldn’t be the only source of user insight. Identify the areas where AI can strengthen human-centred design and always remember to apply a human lens to its output.
Pitfall 4: Thinking that it’s once and done
This relates to the pitfall of not adapting to changing behaviours and expectations. Having gone through the effort of doing user research and testing once doesn’t mean you’re exempt from having to do it again.
New features you launch, or changes made, will always benefit from a round or two of user testing. This testing doesn’t always need to be comprehensive but should validate that something works as expected and results in the desired actions.
This aligns with core HCI principles such as iterative design and continuous feedback loops. Even small changes can introduce friction or unintended consequences, so regular validation helps ensure usability and alignment with user goals.
How to avoid this pitfall
Build regular user testing into your delivery cycles
Treat research as part of the product lifecycle, not an optional luxury
Start small. Even short sessions with a handful of users can reveal valuable insight
Making validation routine helps keep the experience aligned with user needs as they evolve. Think of user testing as a quality assurance step for user experience, just as you would for code. It’s about ensuring the product behaves as intended from the user’s perspective.
Pitfall 5: Skipping user research due to economic pressures
There’s a common trope about employee training and the cost associated with it: a finance manager questions what happens if you spend the money on training a team member and they leave, taking their new skills with them, to which the CEO responds: what happens if we don’t, and they stay?
Not investing in user research and testing leaves you in a similar position – you’ll have something that works, but there’s likely untapped potential. You’re a few tweaks and ‘aha’ moments away from a product that excels at its job. Research is not a luxury but a strategic investment in reducing risk and ensuring product-market fit. It is also not as costly as you might think. In the quantitative research mindset, larger numbers of survey respondents mean greater confidence. With qualitative user testing, UX professionals have known for over 25 years that when testing a user journey with a single audience, you begin to see redundancy (the same patterns repeating) after just five people.
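The diminishing returns behind that five-user rule of thumb can be sketched with the problem-discovery model popularised by Nielsen and Landauer, 1 − (1 − p)ⁿ. The 31% per-user discovery rate below is their widely cited average estimate, not a figure from this guide, so treat the exact percentages as illustrative:

```python
# Expected share of usability problems found after n test users, using
# the discovery model 1 - (1 - p)^n (Nielsen & Landauer).
# p = 0.31 is their commonly cited average per-user discovery rate.

def problems_found(n: int, p: float = 0.31) -> float:
    """Expected proportion of usability problems uncovered by n users."""
    return 1 - (1 - p) ** n

for n in (1, 3, 5, 10, 15):
    print(f"{n:2d} users -> {problems_found(n):.0%} of problems found")
```

With these assumptions, five users surface roughly 85% of the problems in a journey, and each additional participant adds progressively less, which is why small, frequent rounds of testing are such good value.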
How to avoid this pitfall
Reframe research as risk reduction rather than “extra cost”
Use lightweight methods when timelines or budgets are constrained
Show stakeholders examples where skipping research led to poor outcomes
Investing in understanding your users is far cheaper than repairing the damage caused by assumptions.
Avoiding these pitfalls means recognising that data, patterns and AI can only take us so far when used in isolation. On their own, they can surface signals, speed up analysis and open new possibilities, but it’s Experience Design that turns those inputs into decisions grounded in context, empathy and real human behaviour.
Experience Design relies on our ability to understand people in their real contexts which includes their motivations, emotions, expectations and all the invisible factors that shape their behaviour.
So, to build experiences that genuinely work, we need more than numbers, best practice templates or synthetic feedback. We need to step outside our assumptions and reconnect with the people we are designing for.
Which brings us to the cornerstone of effective Experience Design:
Talking to real people is irreplaceable
Talking to people helps put the experience in context.
People may have different opinions and perceptions based on brand, industry, location, where a brand or product sits on the price scale, or whether they’re looking at something online on their laptop, their phone or are using an on-location kiosk – all things that are hard to pull from data alone or AI-generated outputs, but crucial for designing experiences that engage people.
The value of nuance and discovery through human insight: a case study
With self-service technology becoming increasingly common, brands have to balance introducing it in a way that improves convenience and accessibility without impeding the human qualities customers value.
This is the experience one of our clients encountered first-hand with the introduction of self-service ordering screens in a number of its UK locations. While the initiative delivered some clear positives, including rising average order values and increased footfall in certain stores, adoption varied by location, and they wanted to validate that it was the right choice before continuing the rollout.
The data showed what was happening, but not why. To better understand how the screens were affecting real-world experiences, the business partnered with Inviqa to run a qualitative user testing programme.
By speaking directly to people in the cafés – both staff and customers – we were able to uncover what was working, what wasn’t, and the key things the organisation would need to keep in mind as it continued the rollout to support choice, accessibility and human connection.
Here’s what we found out:
The location of the screens mattered – both for adoption and the overall customer experience. Make them hard to find and adoption is lower. Conversely, putting them in too prominent a place risks alienating customers who prefer to talk to someone while they order and perceive that’s no longer an option. This highlights the importance of spatial affordances, how the physical placement of an interface communicates its purpose and availability.
Familiarity matters – people who had used the screens previously, and who were more comfortable with written English, were more likely to seek them out if they couldn’t see them straight away. Language confidence also shaped the choice between screen and till: those from countries that use the Latin alphabet found the screens easier to use, while those from countries with a different writing system found them harder and would head to the till instead. This underscores the role of cultural context and language confidence in shaping user preferences. Designing for inclusivity means recognising that different users bring different expectations and comfort levels to the same interface.
It improves accessibility – screens benefit those who are deaf or hard of hearing, as well as those who are neurodiverse. They’re also popular with groups and those with more complicated orders as they don’t feel rushed or judged. This aligns with the HCI principle of flexibility and user control. Providing multiple pathways to complete a task empowers users to choose the method that best suits their needs, reducing cognitive load and social pressure.
Customers don’t want screens to replace humans – whether they’re regulars, have a question or want advice, or simply feel more comfortable ordering from a person, people don’t want the screen to be the only ordering option, and they don’t want its introduction to lead to a reduction in staff numbers.
Based on the insights gathered by taking the time to talk to people, the organisation now has a clearer idea of how to continue the rollout of the self-service screens: carefully considering placement in a way that encourages uptake without suggesting it’s the only option, and striking a balance between an experience that suits those keen to use a screen and those who prefer the more human touch.
By combining observational insight with behavioural data, we helped our client make evidence-based decisions that respect user diversity, support accessibility, and preserve the human elements that matter most.
As this chapter highlights, avoiding pitfalls is a multi-disciplinary task covering design, user research and more - all disciplines that fall under the Experience Design umbrella.
In chapter 3, we break down the core disciplines that make up Experience Design and explain how they work together to deliver experiences that are usable, accessible and meaningful.
Want to get each chapter delivered to your inbox as it's published? Sign up for the series.





