A complete AI for UX glossary: 100 terms all designers should know

Confused by AI terms like generative UI, agentic AI or predictive UX? This complete AI glossary breaks down 100 essential concepts every UX and product designer should know in simple, practical language.


AI glossary illustration for UX designers showing key concepts including agentic AI, predictive UX, human-in-the-loop AI and prompt design around the title “AI Glossary (for UX)”.

Keeping up with AI technology is challenging enough. Then you’ve got the ever-growing number of new terms and buzzwords to contend with.

From generative UI and AI copilots to multimodal interfaces and predictive UX, the language of AI can be overwhelming. That’s why we’ve created our complete AI for UX glossary: a collection of 100 terms you’re likely to come across as you navigate AI in product design. 

Refer to the table of contents for quick navigation, or browse the full list below to explore the concepts worth knowing right now.

1. Adaptive interface

An adaptive interface is a user interface that automatically adjusts its layout, content or functionality based on user behaviour, preferences or context. AI can analyse usage patterns and adapt the interface to make common tasks easier or faster, or prioritise more engaging content.

For UX designers, adaptive interfaces make it possible to create highly personalised products and experiences.

Example: A productivity app might automatically prioritise the tools a user interacts with most, moving them to the top of the interface.
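The example above can be sketched in a few lines. This is a minimal, illustrative version of usage-based reordering, assuming a simple click log per tool (the tool names and logging approach are hypothetical, not from any specific product):

```python
from collections import Counter

def reorder_toolbar(tools, usage_log):
    """Return tools sorted by how often each appears in the usage log.

    Ties keep the original toolbar order, so rarely used tools
    don't jump around unpredictably between sessions.
    """
    counts = Counter(usage_log)
    original_position = {tool: i for i, tool in enumerate(tools)}
    return sorted(tools, key=lambda t: (-counts[t], original_position[t]))

toolbar = ["calendar", "notes", "tasks", "timer"]
clicks = ["tasks", "notes", "tasks", "tasks", "notes"]
print(reorder_toolbar(toolbar, clicks))  # → ['tasks', 'notes', 'calendar', 'timer']
```

Real adaptive interfaces weigh far more signals than raw click counts, but the principle is the same: observe behaviour, then adjust the interface to match it.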

2. Adaptive personalisation

Adaptive personalisation refers to systems that dynamically tailor content, recommendations or UI elements to individual users based on real-time data and behaviour.

For product designers, this can improve relevance and engagement. AI models analyse interactions to deliver experiences that feel customised without requiring manual configuration.

Example: Spotify’s home screen adapts to listening habits, highlighting playlists and artists the user is likely to enjoy.

3. Agent interface

An agent interface is a UI designed for interacting with an AI agent that can complete tasks on behalf of the user.

Designing these interfaces requires UX decisions about how users assign tasks, monitor progress and intervene when necessary.

Example: A travel booking assistant that plans flights and accommodation after a user provides their destination and dates.

Read also: How to design experiences for AI agents: a step-by-step guide.

4. Agentic AI

Agentic AI refers to AI systems that can autonomously plan actions and execute tasks to achieve a goal.

For designers, agentic systems introduce new UX challenges around control, trust and transparency. Interfaces must clearly communicate what the AI is doing and allow users to intervene when needed.

Example: An AI agent that researches travel options, compares prices and automatically builds a travel itinerary.

5. AI assistant

An AI assistant is a digital tool that uses AI to help users complete tasks, answer questions or provide recommendations. In product design, AI assistants are often integrated into applications as copilots or chat interfaces.

Example: An AI assistant inside a design tool might suggest layout improvements or generate placeholder copy.

6. AI co-creation

AI co-creation describes collaborative workflows where humans and AI work together to create content, designs or solutions. Designers increasingly use AI as a creative partner during brainstorming, prototyping and iteration.

Example: A designer might use AI to generate multiple visual directions for a landing page before refining the best option.

7. AI content design

AI content design refers to the use of AI tools to generate or assist with UX writing, microcopy and interface content. For UX and content designers, AI can help produce multiple variations of text quickly, although human review is still essential.

Example: Generating different versions of a call-to-action button to test which wording performs best.

8. AI design critique tools

AI design critique tools analyse interface designs and provide feedback based on usability, accessibility or design principles. These tools can help designers quickly identify potential issues during early design stages.

Example: An AI tool that flags colour contrast problems or suggests improvements to button hierarchy.

9. AI design suggestions

AI design suggestions are recommendations generated by AI tools to improve layouts, components or visual hierarchy. These suggestions often rely on patterns learned from large datasets of interface designs.

Example: A design tool suggesting alternative spacing or alignment options to improve visual balance.

10. AI experience design (AIX)

AI experience design (AIX) is an emerging discipline focused on designing user experiences for AI-powered products and features. While traditional UX design focuses on usability and interaction design, AIX also considers factors unique to AI systems, such as transparency, trust, automation, uncertainty and human-AI collaboration.

Because AI systems can generate unpredictable outputs or make decisions on behalf of users, designers must carefully shape how these systems behave and communicate with people.

Example: Designing how a user interacts with a generative AI writing assistant inside a productivity app, including how prompts are entered, how results are presented and how users can refine or reject the AI’s suggestions.

11. AI feature discovery

AI feature discovery refers to using AI to identify potential product features or improvements based on user data and behaviour patterns. Product teams can use AI insights to uncover unmet user needs or opportunities for innovation.

Example: An AI analytics tool might reveal that many users export data to spreadsheets because the product lacks built-in reporting features, highlighting an opportunity to add a reporting tool.

12. AI-first interface

An AI-first interface is a product experience where AI plays a central role in how users interact with the system. Rather than navigating menus, users often interact through natural language prompts or commands.

Example: ChatGPT’s interface allows users to simply type what they want instead of navigating a traditional UI.

Read also: How to design for AI-first products.

13. AI-generated personas

AI-generated personas are user personas created with the help of AI tools using research data, analytics or prompt-based inputs.

These tools can quickly synthesise user insights and generate draft personas that describe goals, behaviours and pain points.

While they still require human validation, AI-generated personas can speed up the early stages of UX research and documentation.

Example: A researcher might upload interview summaries to an AI tool and generate several draft personas that represent key user segments.

14. AI interaction patterns

AI interaction patterns are recurring ways users interact with AI systems within digital products. Just like traditional UI patterns, these patterns help designers structure interactions so they feel predictable and easy to understand.

Common AI interaction patterns include prompt-based input, conversational workflows and AI copilots that assist users during tasks.

Understanding these patterns helps designers create experiences that feel intuitive rather than experimental.

Example: A design tool might use a copilot pattern where the AI sits alongside the interface and offers suggestions as the user works.

15. AI journey mapping

Journey mapping is a core UX practice used to visualise how users interact with a product across different touchpoints. Traditionally, creating journey maps involves manually analysing research data and mapping out user behaviours, emotions and pain points.

AI journey mapping uses AI tools to analyse user behaviour data and automatically generate or enhance these journey maps. This can help teams identify patterns and friction points more quickly.

Example: An AI tool might analyse product analytics and session data to automatically generate a visual map of the most common paths users take through an app.

16. AI layout generation

AI layout generation refers to AI tools that automatically create interface layouts based on prompts, design rules or existing components. These tools can accelerate early design exploration by quickly generating multiple layout variations that designers can then refine.

Example: You might prompt your AI design tool to generate several dashboard layouts based on a set of components or design tokens, then select and refine the most promising option.

17. AI microcopy generation

AI microcopy generation involves using AI to create short interface texts such as labels, tooltips and error messages. For UX writers, AI can speed up the drafting process, but human editing is essential for ensuring accuracy, clarity and tone.

Example: Generating multiple error message options for a failed login attempt.

18. AI onboarding

AI onboarding refers to how users are introduced to an AI-powered feature or product for the first time.

Because AI systems often behave differently from traditional interfaces, onboarding plays an important role in setting expectations. Good AI onboarding helps users understand what the tool can do, how to interact with it and where its limitations are.

This is especially important in products where users need to write prompts, review AI outputs or learn a new interaction pattern.

Example: A writing tool might use its onboarding flow to show users how to ask the AI for edits, how to regenerate suggestions and how to fact-check the output.

Read also: UX onboarding best practices: how to design first-time user experiences.

19. AI product designer

An AI product designer is a designer who specialises in creating user experiences for AI-powered products. This role combines UX design skills with knowledge of AI capabilities, limitations and ethical considerations.

Example: Designing interfaces for an AI assistant that helps users manage tasks and schedules.

20. AI prototyping tools

AI prototyping tools help designers generate interactive prototypes quickly using AI. These tools can convert prompts, sketches or wireframes into interactive interfaces.

Example: Entering a text description of an app feature to generate a clickable prototype.

Want to learn more about AI for prototyping? Check out the UX Design Institute’s Certificate in AI for Prototyping.

21. AI-powered recommendations

AI-powered recommendations are suggestions generated by AI systems based on a user’s behaviour, preferences or past interactions. These systems analyse patterns in large datasets to predict what content, products or actions a user is most likely to find useful.

Recommendation systems are widely used in digital products to personalise the user experience, helping users discover relevant content more quickly.

For designers, this raises important questions about how recommendations are presented, how transparent they are and how much control users have over them.

Example: Streaming platforms like Netflix use AI-powered recommendation systems to suggest films and TV shows based on what you’ve previously watched and rated.

22. AI-powered search

AI-powered search uses artificial intelligence to better understand user intent and deliver more relevant search results. Unlike traditional keyword-based search, AI-powered systems can interpret natural language queries and provide summarised answers.

This is increasingly visible in products like search engines, where AI-generated summaries appear at the top of search results.

For UX designers, this changes how search experiences are designed, shifting from lists of links to answer-driven interfaces.

Example: When you search Google today, you may see an AI-generated summary at the top of the results page that directly answers your question.

23. AI research assistant

An AI research assistant helps UX researchers analyse interviews, surveys and usability testing data. These tools can summarise transcripts, extract themes and cluster insights.

Example: Automatically summarising key themes from user interview transcripts.

24. AI suggestion systems

AI suggestion systems proactively recommend actions, content or inputs based on user behaviour and context.

For designers, these systems are an opportunity to reduce user effort and guide people toward useful actions, but they must be implemented carefully to avoid overwhelming or distracting users.

Example: Google Docs suggests sentence completions as you type, helping you write faster without interrupting your workflow.

25. AI trust signals

AI trust signals are interface elements that help users understand and feel confident in how an AI system works.

Because AI outputs can sometimes be uncertain or incorrect, designers often use trust signals to communicate transparency and reliability.

These might include explanations, confidence indicators, source citations or options to verify or edit AI-generated results.

Example: A research tool might show a confidence score next to an AI-generated insight and allow the user to view the original source data.

26. AI-assisted ideation

AI-assisted ideation involves using AI tools during brainstorming or concept development. Designers can use AI to generate ideas, variations or inspiration.

Example: Generating alternative design directions for a mobile app onboarding flow.

27. AI-assisted research analysis

AI-assisted research analysis refers to using AI tools to analyse qualitative research data such as interview transcripts, survey responses and usability test recordings.

Traditionally, analysing qualitative data is a highly manual and time-consuming process, requiring researchers to read transcripts, code responses and identify themes. AI tools can now speed up this process by automatically clustering insights, highlighting patterns and summarising key findings.

For UX teams, this can significantly reduce the time required to turn research into actionable insights.

Example: An AI research tool might analyse dozens of user interviews and automatically group feedback into themes such as “navigation confusion,” “pricing concerns,” or “feature requests.”

28. AI insight clustering

AI insight clustering groups similar user feedback or research insights together automatically. This technique can help researchers quickly identify common themes.

Example: Grouping survey responses about usability problems into categories.
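To make the idea concrete, here is a toy sketch of clustering. Real AI tools group feedback by semantic similarity (typically using embeddings); this keyword-matching version, with illustrative theme names, only shows the shape of the output:

```python
def cluster_feedback(responses, themes):
    """Group responses under the first theme whose keyword they mention.

    themes maps a theme name to a list of keywords. Responses that match
    no theme land in an 'uncategorised' bucket for manual review.
    """
    clusters = {theme: [] for theme in themes}
    clusters["uncategorised"] = []
    for response in responses:
        text = response.lower()
        for theme, keywords in themes.items():
            if any(keyword in text for keyword in keywords):
                clusters[theme].append(response)
                break
        else:
            clusters["uncategorised"].append(response)
    return clusters

themes = {
    "navigation": ["menu", "find", "navigate"],
    "pricing": ["price", "cost", "expensive"],
}
feedback = [
    "I couldn't find the settings menu",
    "The pro plan feels too expensive",
    "Love the new dark mode",
]
clusters = cluster_feedback(feedback, themes)
```

The value for researchers is the structure of the result: raw responses become labelled groups that can be reviewed theme by theme.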

29. AI research synthesis

AI research synthesis refers to using AI tools to combine and summarise insights from multiple research sources such as interviews, surveys or usability tests.

In UX work, synthesis is the step where researchers turn raw data into meaningful insights. AI tools can help speed up this process by identifying patterns and themes across large sets of research data.

Example: After conducting several user interviews, you might use an AI research tool to analyse the transcripts and highlight the most common usability issues.

30. AI transcription tools

AI transcription tools automatically convert spoken conversations into written text. They’re commonly used in UX research to turn interview recordings, usability test sessions or workshops into transcripts that can be analysed later.

For UX researchers, this removes one of the most time-consuming parts of research. Instead of manually typing out recordings, you can quickly generate transcripts and move straight to identifying insights.

Example: After conducting user interviews, you might upload the recordings to a transcription tool which produces searchable transcripts you can review and analyse.

31. AI visual generation

AI visual generation refers to AI tools that can create images, graphics or design assets based on a prompt.

This can be especially useful during early ideation or concept exploration. Rather than starting from a blank canvas, you can quickly generate visual directions and refine the most promising ideas.

Example: You might ask an AI tool to generate several visual concepts for a mobile app onboarding screen to explore different visual styles.

32. Algorithmic bias

Algorithmic bias occurs when an AI system produces unfair or skewed outcomes because of bias in the data it was trained on or the way the system was designed.

For designers, this is an important ethical consideration. If bias is present in an AI product, it can affect how recommendations are made, how users are categorised or what information they see.

Example: A hiring tool trained on biased historical data might favour certain candidates over others. Designers may need to introduce transparency or human oversight to reduce these risks.

33. Automated usability testing

Automated usability testing uses AI to analyse user behaviour during product testing and identify potential usability issues.

Instead of manually reviewing every session recording, AI can detect patterns such as hesitation, repeated clicks or navigation loops that suggest friction. This can speed up testing and help uncover problems earlier in the design process.

Example: An AI tool analysing session recordings might flag that many users struggle to find the checkout button on a product page.

34. Autonomous agents

Autonomous agents are AI systems that can plan and carry out tasks independently in order to achieve a goal.

Unlike traditional tools that respond to a single prompt, agents can perform multiple steps and adapt their behaviour along the way. For product designers, this raises new UX questions around control, transparency and how users monitor the agent’s progress.

Example: An AI home assistant might manage household tasks such as adjusting the thermostat, ordering groceries when supplies run low and scheduling appliance maintenance.

35. Behavioural analytics

Behavioural analytics refers to analysing how users interact with a digital product in order to understand behaviour patterns.

AI can enhance behavioural analytics by identifying patterns across large datasets and predicting future behaviour. For UX teams, this helps reveal how people actually use a product rather than how designers expect them to.

Example: An AI analytics tool might show that many users abandon a checkout process at the same step, indicating a usability issue.

36. Bias

Bias in AI refers to systematic errors that lead a system to produce unfair or unbalanced results.

Bias can arise from the training data used to build an AI system, or from assumptions made during development. For designers, it’s important to recognise where bias might appear in a product and to design experiences that mitigate potential harm.

Example: A recommendation system trained on limited data might repeatedly suggest similar types of content and fail to reflect diverse user interests.

37. Command-based interaction

Command-based interaction allows users to control a system by entering commands or prompts rather than navigating traditional menus. This interaction model is becoming more common in AI products where users can type instructions to generate content or perform tasks.

When designing for command-based interactions, designers must consider how commands are structured and how users learn what the system can do.

Example: In a design tool, you might type a command such as “create a landing page layout for a travel website”.

38. Component generation

Component generation refers to AI tools that automatically create UI components such as buttons, cards or navigation elements. These tools often generate components based on prompts, sketches or design system guidelines.

This can accelerate early prototyping while maintaining consistency across a design system.

Example: You might ask an AI tool to generate a set of card components that follow your product’s design system.

39. Context-aware interfaces

Context-aware interfaces adapt their behaviour based on information about the user’s situation or environment. AI systems can analyse context such as location, device type or past behaviour to tailor the interface.

For designers, this creates opportunities to make experiences feel more relevant and responsive.

Example: A travel app might highlight nearby attractions when it detects that the user has arrived in a new city.

40. Contextual AI

Contextual AI refers to AI systems that consider contextual information when generating responses or recommendations. Rather than treating each request in isolation, the system uses surrounding information to provide more relevant outputs.

Designers working with contextual AI must think about how context is captured and how it influences the experience.

Example: A customer support chatbot might use previous conversations to better understand a user’s problem.

41. Conversational AI

Conversational AI refers to systems that allow users to interact with technology through natural language conversations. This includes chatbots, virtual assistants and AI copilots that respond to questions or requests.

For designers, conversational interfaces require careful thinking about tone, feedback and how the conversation flows.

Example: A chatbot that helps customers troubleshoot a problem through a series of questions.

42. Conversational design

Conversational design focuses on designing interactions that take place through conversation rather than traditional interfaces.

Designers shape how questions are asked, how responses are structured and how the conversation guides the user toward a goal. This discipline is increasingly important as chat-based interfaces become more common.

Example: Designing how a chatbot guides users through booking a service.

Read also: UX design for chatbots: how to create human-like conversations.

43. Conversational UI (CUI)

A conversational UI, or CUI, is an interface that allows users to interact with a system through conversation using text or voice.

Unlike traditional graphical interfaces (GUIs), CUIs rely on dialogue as the primary interaction method. Designers must ensure conversations are clear, helpful and easy to navigate.

Example: Messaging interfaces used by AI assistants such as ChatGPT.

44. Conversational workflow

A conversational workflow is a task flow that unfolds through dialogue with an AI system.

Instead of navigating multiple screens, users complete tasks by asking questions and refining their requests through conversation.

For designers, this means thinking carefully about how conversations progress and how users recover from mistakes.

Example: Planning a trip through a chatbot by answering a series of questions about dates, budget and preferences.

45. Copilot interface

A copilot interface integrates an AI assistant directly into a product to support the user while they work. Rather than replacing the user, the AI acts as a collaborator that offers suggestions, generates content or performs tasks.

When designing such interfaces, designers must balance assistance with user control so the AI feels helpful rather than intrusive.

Example: Microsoft Copilot suggesting text edits inside a document.

46. Data-driven design

Data-driven design is an approach to product design where decisions are informed by real user data rather than assumptions or intuition alone.

Instead of relying purely on best practices or internal opinions, teams look at evidence such as analytics, user behaviour and research insights to understand what is actually happening in the product. AI tools can strengthen this approach by analysing large datasets and identifying patterns that might otherwise be missed.

In practice, this helps teams prioritise improvements that genuinely impact the user experience.

Example: Product analytics might reveal that a large percentage of users abandon a sign-up flow at the same step, prompting the design team to simplify that part of the experience.

47. Data labelling

Data labelling is the process of tagging data so that an AI system can learn from it. Labels help the system recognise patterns and understand what different pieces of data represent.

This might involve categorising text, tagging images or identifying emotions in user feedback. The quality of these labels has a direct impact on how an AI system behaves, which means poorly labelled data can lead to inaccurate or biased outputs.

When designing AI-powered products, understanding how training data is labelled helps explain why a system might respond in certain ways.

Example: Customer reviews might be labelled as positive, neutral or negative so an AI model can learn how to perform sentiment analysis.
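The sentiment example above can be sketched as data. Each review carries a label the model learns from, and a quick audit of label balance catches one common source of bias (the reviews and labels here are invented for illustration):

```python
from collections import Counter

# Labelled training examples: each review is tagged so a model can learn
# which language patterns signal positive, neutral or negative sentiment.
labelled_reviews = [
    {"text": "Checkout was quick and painless", "label": "positive"},
    {"text": "The app works, nothing special", "label": "neutral"},
    {"text": "I couldn't reset my password", "label": "negative"},
]

# Skewed label distributions (e.g. mostly positive examples) are one way
# biased or inaccurate outputs creep in, so auditing the balance is cheap
# insurance before training.
label_counts = Counter(example["label"] for example in labelled_reviews)
print(label_counts)
```

In practice, teams also check who did the labelling and how consistently, since disagreement between labellers is another signal of ambiguous or low-quality training data.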

48. Decision intelligence

Decision intelligence refers to using data, analytics and AI to support better decision-making across a product or organisation.

Rather than simply presenting raw data, decision intelligence systems combine analysis and predictive insights to help teams understand what actions to take next. This can influence product strategy, feature prioritisation or design improvements.

For product teams, this kind of insight can help turn complex data into clear signals about what users need.

Example: A product analytics platform might analyse user behaviour and highlight that users who complete onboarding are far more likely to remain active, encouraging the team to focus on improving that experience.

49. Decision-support interface

A decision-support interface presents complex information in a way that helps users evaluate options and make informed decisions.

These interfaces often combine analytics, predictions or AI-generated insights with clear visualisations so users can quickly understand what is happening and what actions they might take.

Design plays an important role here. Information must be structured in a way that feels trustworthy and easy to interpret.

Example: A financial dashboard that analyses spending patterns and highlights areas where a user could save money.

50. Design automation

Design automation refers to using tools or AI to handle repetitive design tasks automatically.

Many parts of the design process involve fiddly manual tasks such as resizing assets, generating layout variations or adapting interfaces for different screen sizes. Automation can handle these tasks quickly, allowing designers to focus more on creative exploration and problem-solving.

Rather than replacing designers, automation often works best as a productivity boost within the design workflow.

Example: Automatically generating responsive versions of a layout for mobile, tablet and desktop screens.

51. Design system automation

Design system automation uses automation or AI to maintain and scale a design system across a product or organisation.

As products grow, keeping components, tokens and patterns consistent can become difficult. Automation tools help by generating components, updating styles or ensuring new designs follow system guidelines.

This helps teams maintain visual consistency while allowing designers and developers to work faster.

Example: When a colour token in a design system is updated, automated tools can instantly apply the change across every component that uses it.
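The colour-token example works because components reference tokens by name rather than hard-coding values, so one update at the source propagates everywhere. A minimal sketch, with hypothetical token and property names:

```python
# Tokens are defined once, in one place.
tokens = {
    "color.primary": "#1A73E8",
    "spacing.md": "16px",
}

def resolve(component_style, tokens):
    """Replace token references in a component style with current values.

    Values that aren't token names pass through unchanged.
    """
    return {prop: tokens.get(value, value) for prop, value in component_style.items()}

# A component references tokens by name instead of raw values.
button = {"background": "color.primary", "padding": "spacing.md"}

tokens["color.primary"] = "#0B57D0"  # one update at the source...
resolved = resolve(button, tokens)   # ...and every component picks it up
print(resolved["background"])        # → #0B57D0
```

Production design systems layer tooling on top of this indirection (token files, build pipelines, code generation), but the propagation mechanism is the same.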

52. Design token generation

Design token generation refers to using automation or AI to create and manage design tokens such as colours, spacing values or typography styles.

Design tokens act as the building blocks of a design system, helping ensure consistency across products and platforms. AI tools can assist by generating token sets, suggesting new tokens based on existing styles or automatically updating tokens when a design system evolves.

For design teams working at scale, this can make it much easier to maintain visual consistency while speeding up design and development workflows.

Example: An AI design tool might analyse an existing interface and generate a set of colour and spacing tokens that can be added to the design system.

53. Dynamic UI

A dynamic UI is an interface that changes in response to user behaviour, context or system data. Instead of presenting the same layout or content to every user, the interface adapts in real-time to make the experience more relevant.

AI systems often power dynamic interfaces by analysing patterns in user activity and adjusting the interface accordingly. This might involve surfacing relevant content, highlighting useful features or adapting navigation.

For product teams, dynamic UI can make digital experiences feel more responsive and personalised.

Example: A shopping app might dynamically highlight products related to items a user has recently viewed, helping them discover relevant options more quickly.

54. Ethical AI design

Ethical AI design focuses on creating AI-powered products that are fair, transparent and responsible.

As AI systems influence more decisions and experiences, designers play a key role in ensuring these systems behave in ways that respect users and minimise harm. This includes considering issues such as bias, transparency and user control.

Design choices often shape how ethical principles appear in the product experience.

Example: A recruitment platform might allow hiring managers to review and override AI-generated candidate recommendations to ensure decisions remain fair.

55. Explainable AI (XAI)

Explainable AI, or XAI, refers to AI systems that provide understandable explanations for how their outputs were generated.

Many AI models operate as complex systems that can be difficult to interpret. Explainability helps bridge this gap by making it clearer why a particular result or recommendation was produced.

In user-facing products, explainability can help build trust and support better decision-making.

Example: A credit scoring tool might show which financial factors influenced its recommendation so users understand how the result was calculated.

56. Explainability UI patterns

Explainability UI patterns are design approaches used to communicate how an AI system reached a particular output.

These patterns help translate complex AI processes into explanations that people can understand. This might involve showing supporting data, highlighting key factors or offering a deeper explanation on demand.

Well-designed explainability patterns help users evaluate whether they want to rely on an AI-generated result.

Example: An AI writing assistant might highlight which parts of a document influenced its summary so users can quickly verify the source information.

57. Feature suggestion systems

Feature suggestion systems use AI to recommend product features or actions based on how people interact with a product. These systems analyse user behaviour and identify patterns that suggest what might help users achieve their goals more easily.

For product teams, this can reveal opportunities to introduce helpful features or guide users toward functionality they may not have discovered yet.

Example: A project management tool might suggest enabling a reminder feature after noticing that users frequently miss task deadlines.

58. Few-shot prompting

Few-shot prompting is a technique used when interacting with AI models where a small number of examples are provided to guide the model’s response.

Instead of relying on just a single prompt, you include a few examples that demonstrate the format or type of output you want. The AI then uses those examples to generate similar responses.

For designers experimenting with AI tools, this can help produce more consistent outputs.

Example: If you want an AI tool to generate onboarding messages, you might include two or three example messages so the system understands the tone and structure you want.
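The onboarding-message example amounts to assembling a prompt with worked examples ahead of the new input. A minimal sketch of that structure (the helper name, message format and example content are all illustrative):

```python
def build_few_shot_prompt(task, examples, new_input):
    """Assemble a few-shot prompt: a task description, a handful of worked
    input/output examples, then the new input the model should complete
    in the same tone and structure."""
    lines = [task, ""]
    for example in examples:
        lines.append(f"Input: {example['input']}")
        lines.append(f"Output: {example['output']}")
        lines.append("")
    lines.append(f"Input: {new_input}")
    lines.append("Output:")
    return "\n".join(lines)

examples = [
    {"input": "User finishes setup", "output": "Nice work — you're all set up!"},
    {"input": "User invites a teammate", "output": "Great — your teammate is on the way!"},
]
prompt = build_few_shot_prompt(
    "Write a short, friendly onboarding message for each event.",
    examples,
    "User creates their first project",
)
print(prompt)
```

Because the examples demonstrate tone and format directly, the model's completion for the final "Output:" line tends to match them far more consistently than a bare instruction would.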

59. Flow generation

Flow generation refers to AI tools that can automatically generate user flows based on a prompt or product description.

User flows help designers visualise how users move through a product to complete a task. AI tools can accelerate this process by quickly generating draft flows that teams can refine. This can be especially useful during early product exploration.

Example: You might prompt an AI tool to generate a user flow for booking a hotel through a travel app.

60. Generative AI

Generative AI refers to AI systems that can create new content such as text, images, audio or code.

Unlike traditional AI systems that focus mainly on analysing data, generative models produce entirely new outputs based on patterns they’ve learned from large datasets.

Generative AI has quickly become part of many design workflows, supporting activities such as brainstorming, content creation and prototyping.

Example: A designer might use a generative AI tool to quickly produce several hero image concepts for a landing page.

61. Generative design

Generative design uses algorithms or AI to automatically generate multiple design variations based on constraints or goals.

Instead of manually designing each variation, designers define parameters such as layout rules or functional requirements. The system then produces many possible solutions that can be evaluated and refined.

This approach can help explore design spaces that might otherwise take much longer to investigate.

Example: A designer might generate several alternative dashboard layouts that prioritise different types of data visualisation.

62. Generative UI

Generative UI refers to interfaces that can be dynamically created or adapted by AI rather than being fully predefined.

Instead of designing every screen in advance, the system can assemble interface elements in response to user needs or prompts.

This idea is gaining attention as AI systems become capable of generating not just content but entire interface structures.

Example: A productivity app might generate a custom dashboard based on the tasks and tools a user interacts with most often.

63. Goal-based interaction

Goal-based interaction focuses on helping users achieve an outcome rather than requiring them to follow a predefined sequence of steps.

AI systems often support this interaction model by interpreting user intent and determining how best to complete a task. Instead of navigating menus, users simply state what they want to accomplish.

Example: Rather than manually creating charts in a data tool, a user might type “show monthly revenue trends for the past year”.

64. Goal-oriented AI systems

Goal-oriented AI systems are designed to achieve specific objectives by planning actions and adapting along the way. These systems are common in conversational assistants and AI agents that guide users through tasks.

From a design perspective, the challenge is ensuring users understand what the system is trying to accomplish and how they can influence the process.

Example: A fitness app might ask about a user’s fitness level, goals and available workout time before generating a personalised training plan.

65. Guardrails

Guardrails are rules or constraints built into AI systems to guide their behaviour and prevent harmful or inappropriate outputs. These safeguards can limit what the AI generates or restrict how it responds in certain situations.

For teams building AI products, guardrails help ensure that systems behave responsibly and remain aligned with user expectations.

Example: An AI chatbot might refuse to generate harmful content or redirect the user to safer information sources.

66. Hallucination

In AI systems, a hallucination occurs when the model generates information that sounds plausible but is actually incorrect or completely made up.

This is a known limitation of many generative AI systems. Because models generate outputs based on patterns rather than factual understanding, they can occasionally produce confident but inaccurate answers.

Designing around hallucinations often involves adding verification mechanisms or transparency.

Example: An AI assistant might confidently cite a research study that doesn’t actually exist.

67. Human-AI collaboration

Human-AI collaboration describes workflows where people and AI systems work together to complete tasks. Rather than replacing human expertise, AI acts as a partner that supports decision-making, creativity or productivity. Many modern digital tools are designed around this collaborative model.

Example: A designer might generate several layout ideas with AI and then refine the most promising concept.

68. Human-in-the-loop AI

Human-in-the-loop AI refers to systems where human oversight is built into the AI process. This means people can review, correct or guide AI outputs before they’re finalised.

Human oversight is particularly important in areas where accuracy, fairness or accountability are critical.

Example: A moderation system that uses AI to flag potentially harmful content, which is then reviewed by a human moderator.

69. Human override controls

Human override controls allow users to intervene and override decisions made by an AI system. These controls help maintain user autonomy and ensure people remain in control when automation is involved.

Providing clear override options is an important part of designing trustworthy AI products.

Example: A navigation app might suggest an AI-optimised route but allow the user to select a different route if they prefer.

70. Intent detection

Intent detection is the process of identifying what a user is trying to accomplish based on their input.

AI systems often analyse language patterns to determine the user’s goal, especially in conversational interfaces. Accurate intent detection helps ensure the system responds in a way that matches the user’s needs.

Example: A customer support chatbot recognising that the user wants to cancel a subscription.
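A toy version of intent detection can be built with keyword matching, which is useful for understanding the idea even though production systems rely on trained NLP models. The intents and phrases below are invented for illustration.

```python
# A toy intent detector: map keyword patterns to named intents.
# Real systems use trained language models; this keyword table is
# a hypothetical example.

INTENT_KEYWORDS = {
    "cancel_subscription": ["cancel", "unsubscribe", "stop my plan"],
    "reset_password": ["reset", "forgot", "can't log in"],
}

def detect_intent(message: str) -> str:
    """Return the first intent whose keywords appear in the message."""
    lowered = message.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in lowered for keyword in keywords):
            return intent
    return "unknown"

print(detect_intent("I'd like to cancel my subscription, please"))
```

Even this crude version shows the core design question: what should the interface do when the result is "unknown" and the system isn't sure what the user wants?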

71. Intent-based interaction

Intent-based interaction allows users to communicate what they want to achieve without specifying exactly how the system should do it.

AI interprets the user’s intent and determines the steps required to complete the task. This interaction style can make digital products feel more intuitive.

Example: Typing “schedule a meeting with the marketing team next week” rather than navigating through several calendar menus.

72. Interactive AI systems

Interactive AI systems continuously interact with users and respond dynamically to their inputs. Unlike static tools that produce one-off outputs, interactive systems adapt as the interaction evolves.

Designing these systems involves carefully shaping the feedback loop between user actions and AI responses.

Example: A design assistant that updates layout suggestions as you refine your prompt.

73. Interface generation

Interface generation refers to AI tools that can automatically create user interfaces based on prompts, sketches or design specifications. Designers can then iterate on these generated interfaces rather than starting from scratch.

This can dramatically speed up early product exploration by turning ideas into visual interfaces quickly.

Example: You might describe a mobile banking app screen and have an AI tool generate a rough UI layout.

74. Journey prediction

Journey prediction uses AI to anticipate how users are likely to move through a product or service.

By analysing historical data and behavioural patterns, AI models can forecast common user paths and potential friction points. This can help product teams identify opportunities to improve the overall experience.

Example: An analytics tool might predict that users who skip onboarding are more likely to abandon the app.

75. Large Language Model (LLM)

A Large Language Model, or LLM, is a type of AI model trained on vast amounts of text data to understand and generate human language. These models power many modern AI tools including chatbots, writing assistants and conversational interfaces.

Because LLMs can interpret natural language prompts, they’ve opened up new interaction patterns in digital products.

Example: Tools such as ChatGPT rely on large language models to generate responses to user questions.

76. LLM interface design

LLM interface design focuses on how users interact with products powered by large language models. Unlike traditional interfaces, LLM-powered products often revolve around conversation or prompting rather than navigation.

Designing these interfaces involves decisions about prompt input, result presentation and how users refine or guide the AI’s output.

Example: Designing the prompt input area, conversation history and response display for an AI writing assistant.

77. Multimodal AI

Multimodal AI refers to AI systems that can process and generate multiple types of data such as text, images, audio or video.

Instead of working with just one input type, these systems combine information from different sources to better understand user requests. This opens up new possibilities for digital products that allow users to interact in more natural ways.

For designers, multimodal AI means thinking beyond text interfaces and considering how different interaction modes work together.

Example: You might upload a screenshot of a website and ask an AI tool to analyse the layout and suggest usability improvements.

78. Natural language interface

A natural language interface allows users to interact with a system using everyday language rather than structured commands or menus. This interaction model is becoming more common in AI-powered products where users can simply type or speak what they want to do.

Designing natural language interfaces involves helping users understand what the system can handle while keeping the interaction simple and intuitive.

Example: Instead of navigating several menus in a travel app, a user might type “find flights from Berlin to Lisbon next weekend”.

79. Natural language processing (NLP)

Natural language processing, or NLP, is the area of AI focused on understanding and generating human language. NLP technologies allow systems to interpret written or spoken language, detect intent and generate responses.

Many familiar digital experiences rely on NLP including chatbots, search engines and voice assistants.

Example: A support chatbot using NLP to understand a question such as “how do I reset my password”.

80. Pattern recognition

Pattern recognition is the ability of AI systems to detect patterns in large datasets.

These patterns might involve user behaviour, language structures or visual elements. Once patterns are identified, AI can use them to make predictions or generate outputs.

In digital products, pattern recognition often powers recommendations, automation and predictive features.

Example: An analytics tool recognising that users tend to abandon a checkout flow after encountering shipping costs.

81. Personalisation algorithms

Personalisation algorithms analyse user behaviour to tailor content, recommendations or features to individual users. Many digital products use these algorithms to create experiences that feel more relevant and engaging.

The design challenge often lies in balancing helpful personalisation with transparency so users understand why certain content appears.

Example: A news app adjusting its homepage to prioritise topics a reader frequently engages with.

82. Predictive UX

Predictive UX uses data and AI to anticipate what a user is likely to need next and surface helpful actions or information. Rather than waiting for users to search or navigate, the interface can proactively guide them toward relevant content or tasks.

When implemented thoughtfully, predictive features can make digital products feel faster and more intuitive.

Example: A calendar app suggesting travel time reminders before an upcoming meeting based on location and past behaviour.

83. Predictive interface

A predictive interface adapts its behaviour based on predictions about what the user will do next.

These predictions often come from analysing patterns in user behaviour. The interface can then prioritise features, highlight relevant information or automate certain actions.

Predictive interfaces are becoming more common in products where efficiency and personalisation are important.

Example: A music app automatically surfacing playlists that match a user’s listening habits at certain times of the day.

84. Progressive AI disclosure

Progressive AI disclosure is a design approach where information about an AI system is revealed gradually rather than all at once.

Instead of overwhelming users with technical details, the interface provides simple explanations first and allows deeper information to be accessed when needed. This approach helps maintain clarity while still supporting transparency.

Example: An AI recommendation might show a short explanation such as “suggested based on your recent activity” with an option to learn more.

85. Prompt design

Prompt design involves crafting the instructions or inputs used to guide an AI system’s response.

The way a prompt is written can significantly influence the quality and usefulness of the output. Designers and product teams often experiment with prompts to produce clearer results or better user experiences.

Well-designed prompts can help users achieve their goals more efficiently when interacting with AI tools.

Example: A user might refine a prompt from “summarise this article” to “summarise this article in three bullet points for a busy reader,” resulting in a clearer and more useful response.

86. Prompt engineering

Prompt engineering refers to the practice of systematically designing prompts to improve how AI systems respond.

While prompt design often focuses on the user experience, prompt engineering may involve deeper experimentation with prompt structure, examples or constraints to achieve reliable outputs.

This practice has become increasingly important as many AI tools rely on prompts as their primary interaction model.

Example: Providing an AI tool with specific formatting instructions and examples so it generates consistent UX copy.

87. Prototype generation

Prototype generation refers to AI tools that can automatically create interactive prototypes based on prompts, sketches or design specifications.

These tools can help teams move from concept to prototype much faster than traditional workflows. Instead of building every interaction manually, designers can generate a starting point and refine it.

Example: You might describe a mobile onboarding experience and have an AI tool generate a clickable prototype.

88. Recommendation system

A recommendation system is an AI system that suggests content, products or actions based on user behaviour and preferences. Recommendation systems power many familiar digital experiences including streaming platforms, online shopping and news feeds.

For designers, an important consideration is how recommendations appear in the interface and how much control users have over them.

Example: An online store recommending products related to items you recently viewed.
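One of the simplest recommendation approaches is "viewed together" counting: items that frequently appear in the same browsing session are suggested alongside each other. The sketch below illustrates the idea with invented session data; real systems use far richer signals and models.

```python
# A minimal "viewed together" recommender: count how often items appear
# in the same session and recommend the most frequent co-occurrences.
# The session data is invented for illustration.

from collections import Counter

SESSIONS = [
    ["laptop", "mouse", "keyboard"],
    ["laptop", "mouse"],
    ["laptop", "monitor"],
]

def recommend(item: str, sessions: list[list[str]], top_n: int = 2) -> list[str]:
    """Recommend items most often viewed in the same session as `item`."""
    co_counts = Counter()
    for session in sessions:
        if item in session:
            co_counts.update(other for other in session if other != item)
    return [other for other, _ in co_counts.most_common(top_n)]

print(recommend("laptop", SESSIONS))
```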

89. Responsible AI

Responsible AI is about developing and using AI systems in ways that are ethical, transparent and accountable.

This concept includes addressing issues such as bias, privacy and fairness while ensuring AI systems behave in a way that benefits users and society.

Design decisions play an important role here. How AI outputs are explained, how users provide feedback and how systems handle uncertainty all contribute to responsible AI experiences.

Example: A healthcare app clearly explaining how an AI-generated recommendation was produced and encouraging users to consult a professional before acting on it.

90. Sentiment analysis

Sentiment analysis is an AI technique used to determine the emotional tone behind a piece of text.

By analysing language patterns, AI systems can identify whether feedback is positive, negative or neutral. This makes it easier to process large volumes of user feedback quickly.

For product teams, sentiment analysis can help highlight areas of the experience that frustrate users or generate strong positive reactions.

Example: A UX team analysing thousands of app store reviews might use sentiment analysis to quickly identify recurring complaints about a confusing navigation flow.
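The underlying idea can be illustrated with a deliberately simple lexicon-based scorer: count positive and negative words and compare. Production tools use trained models rather than word lists; everything below is a hypothetical sketch.

```python
# A toy lexicon-based sentiment scorer. The word lists are invented;
# real sentiment analysis uses trained NLP models.

POSITIVE = {"love", "great", "easy", "helpful"}
NEGATIVE = {"confusing", "slow", "broken", "hate"}

def score_sentiment(review: str) -> str:
    """Classify a review as positive, negative or neutral by word counts."""
    words = review.lower().split()
    pos = sum(word.strip(".,!?") in POSITIVE for word in words)
    neg = sum(word.strip(".,!?") in NEGATIVE for word in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(score_sentiment("The navigation is confusing and the app feels slow."))
```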

91. Style transfer

Style transfer is a technique that allows AI systems to apply the visual or stylistic characteristics of one piece of content to another.

In creative workflows, this can help designers explore visual directions quickly. AI tools can generate variations that reflect different styles without requiring manual redesign.

This approach is often used for experimentation during early concept exploration.

Example: You might upload a product illustration and ask an AI tool to generate versions in different artistic styles to test possible visual directions.

92. System prompt

A system prompt is an instruction that defines how an AI system should behave before a user even enters their own prompt.

It helps shape the tone, boundaries and overall behaviour of the AI. While users only see the visible prompt interface, system prompts operate behind the scenes and play an important role in ensuring the AI responds appropriately.

For product designers, system prompts are part of how an AI experience is shaped. They influence how helpful, safe or consistent the system feels to users.

Example: A customer support chatbot might include a system prompt that instructs the AI to respond politely, avoid giving legal advice and redirect users to human support when necessary.
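In chat-style AI APIs, the system prompt is typically just the first message in the conversation, marked with a "system" role so the model treats it as standing instructions. The sketch below assumes that common role-based message format; the instruction text is a hypothetical example.

```python
# A sketch of how a system prompt sits alongside user messages in a
# chat-style request. The instruction wording is hypothetical; the
# role-based message structure mirrors common chat API formats.

SYSTEM_PROMPT = (
    "You are a polite customer support assistant. Do not give legal advice. "
    "If you cannot help, direct the user to human support."
)

def build_chat_messages(user_message: str) -> list[dict]:
    """Prepend the hidden system prompt to the visible user message."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_message},
    ]

messages = build_chat_messages("Can you review my rental contract?")
print(messages[0]["role"])  # the system prompt always comes first
```

The user only ever types the second message; the first is part of the product design, invisible in the interface but shaping every response.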

93. Task automation

Task automation refers to using software or AI to perform tasks that would otherwise require manual effort.

In digital products, this can range from simple background processes to more complex AI-powered actions. Automating repetitive work can reduce friction for users and make products feel faster and more efficient.

When designing automated systems, it’s important to ensure that users still feel informed and in control.

Example: An email tool that automatically categorises incoming messages into folders based on content.

94. Text-to-image generation

Text-to-image generation is a type of generative AI that creates images based on written prompts.

These systems interpret a text description and generate visuals that match the request. Designers often use them to explore ideas quickly or produce visual assets during early concept stages.

While the results usually require refinement, they can provide a useful starting point for creative exploration.

Example: You might type “modern mobile banking dashboard with minimal design” and receive several visual concepts to explore.

95. Training data

Training data is the dataset used to teach an AI model how to perform a particular task.

The model learns patterns from this data during the training process. The quality and diversity of the training data strongly influence how the AI system behaves and how reliable its outputs are.

Understanding training data helps explain why AI systems sometimes perform well in certain contexts and poorly in others.

Example: A language model trained on large collections of online text learns patterns in grammar and writing style that allow it to generate human-like responses.

96. Trust calibration

Trust calibration refers to designing AI systems so that users trust them at the right level.

If users trust an AI system too much, they may rely on inaccurate results. If they trust it too little, they may ignore useful features. As such, designers must aim to communicate the system’s capabilities and limitations clearly.

Achieving the right balance helps users make informed decisions about when to rely on AI.

Example: An AI research tool might show source references and confidence indicators so users can verify the information it generates.

97. Unstructured data

Unstructured data refers to information that does not follow a fixed format or organised structure.

This type of data includes things like text, images, videos and audio recordings. AI systems are particularly useful for analysing unstructured data because they can identify patterns that would be difficult to detect manually.

For UX teams, unstructured data often appears in research materials such as interview transcripts or user feedback.

Example: AI tools analysing thousands of customer support messages to identify common usability issues.

98. User data modelling

User data modelling involves organising and analysing user data to better understand behaviour patterns and preferences. AI systems can use these models to predict what users are likely to do next or what features they might find useful.

For product teams, these insights can inform design decisions and help create more personalised experiences.

Example: A streaming platform analysing viewing history to model user preferences and recommend relevant content.

99. Wireframe generation

Wireframe generation refers to AI tools that automatically create low-fidelity interface layouts based on prompts or product descriptions.

Wireframes are typically used early in the design process to explore structure and layout before visual design begins. AI tools can accelerate this stage by producing draft layouts that designers can refine. This allows teams to quickly explore multiple interface ideas.

Example: You might describe a mobile shopping app product page and have an AI tool generate a basic wireframe layout.

100. Workflow automation

Workflow automation uses software or AI to streamline multi-step processes within a product or organisation. Instead of completing each step manually, automated workflows connect different actions together so tasks can be completed more efficiently.

In digital products, this often improves the user experience by reducing repetitive work and simplifying complex processes.

Example: A project management tool automatically assigning tasks, sending reminders and updating project status as work progresses.

Wrapping up

AI is changing how digital products are designed and experienced, and how designers work. Understanding the terminology behind these changes can make it much easier to follow industry conversations, evaluate new tools and integrate AI into your design practice.

But terminology is just one layer. If you’d like to move beyond definitions and learn how to apply AI in real design workflows, check out the UX Design Institute’s Certificate in AI Fundamentals for UX. During three live online masterclasses, you’ll learn how AI actually works, master the art of effective prompting, and discover how AI can accelerate your productivity.


Emily Stevens, Writer for the UX Design Institute Blog

Emily is a professional writer and content strategist with an MSc in Psychology. She has 8+ years of experience in the tech industry, with a focus on UX and design thinking. A regular contributor to top design publications, she also authored a chapter in The UX Careers Handbook. Emily also holds a BA in French and German and is passionate about languages and continuous learning.
