Exploring the future of AI in Experience Design

By Inviqa

At Inviqa, we’re always exploring how emerging technologies can enhance the way we work and the outcomes we deliver for our clients. With AI adoption accelerating across Havas and the wider industry, we’re investigating how these tools might help our teams in their work.
To better understand the potential uses for AI in experience design, the XD team dedicated their annual hackathon to exploring how AI tools could support their research and design work: where can AI speed up processes or take over a tedious task, and where is the team best placed to do the work themselves?

Their goal for the day: create a practical guide for using AI in the XD discipline by testing its capabilities across real-world tasks.

Here’s what they learned, and what it means for our clients.

AI can support user research, but can't replace human intuition

For recruitment briefs, AI (e.g. Microsoft Copilot) did prove useful for expanding thinking, suggesting new user segments and helping fill gaps in sampling logic. However, when it came to writing the briefs themselves, the output was no better than existing in-house templates.

In creating user personas, AI showed promise in transforming demographic and behavioural data into usable personas. It could save time on pitch decks and early-stage ideation, though its capabilities still need full validation. These personas could then serve as a starting point for more in-depth persona research.

In supporting survey creation, AI struggles with nuance and context and lacks the broader thinking humans often bring to this process. It also wasn't able to natively support the building of a survey, meaning overall it's not adding value to the process or saving any time. 

When distilling insights, AI can provide decent summaries of research insights if given the raw data, and Notebook LM was even able to generate a reasonably insightful podcast episode from imported data. However, oversight and tweaks were still required to make sure the output was accurate and relevant.


The key takeaway?

While the XD team found that AI can help accelerate parts of the research process, and can be useful in sparking new ideas, human expertise and input remain essential for quality and relevance. Part of good user research is picking up the subtle differences between what people say they do and what they actually do, and that's just not something AI is capable of.

Design tasks still need human flair

In competitor research, AI can help identify relevant competitors and provide high-level UX and heuristic analysis, but hallucinations mean validation is still incredibly important. In testing, there were several instances where the AI suggested organisations that didn't exist or claimed a tool could complete tasks it couldn't.

For designing presentations, AI-generated PowerPoint decks completely missed the mark on design and layout; they’re not yet a shortcut to slick, client-ready presentations, even when given examples of existing design and content.

When generating images, Midjourney can spark ideas, but it’s not a fast or reliable way to produce brand-appropriate visuals. With the number of images it can produce in a short amount of time, there's also a lot of chaff to wade through to find something that could be of use.

When it comes to app design, the AI tools tested weren't able to match the speed or quality of human sketching. More time was spent watching loading screens or trying to get the AI to fix its errors than generating a usable design, making the entire process not only inefficient but frustrating.


The key takeaway?

As in user research, AI can be a useful ideation partner, but it's not a replacement for skilled design execution. More often than not, outputs are unusable, require significant work to bring up to standard, or are so numerous that sifting through them cancels out any time saved.


AI is a tool to be used by people, in certain situations

The hackathon revealed that while AI can assist with some tasks, it often requires more time and effort than doing the work manually. While these tools are improving and show potential to be helpful, they’re not yet capable of delivering the nuanced, high-quality outcomes our clients expect and that we pride ourselves on delivering.

Ultimately, creating something truly useful, something that genuinely excites people, still requires significant human input.


So, what’s next?

The XD team will continue to explore AI’s potential across its practice, identifying where it can genuinely support their work and enhance client outcomes. AI won’t replace the human thinking, creativity or diligence that goes into what we produce for our clients, but that doesn’t mean it can’t occasionally lend a hand or spark an idea we might otherwise not have had.

If you're curious about how AI could support your own CX and UX initiatives, or want to explore co-innovation opportunities, we’d love to talk.

Want to hear direct from the team about what they learnt during the hackathon? Watch the insightful wrap-up video from Harriet Lam here: