
""UX in AI" has become one of the most confusing buzzwords in our industry. Jakob Nielsen has famously talked about how UX is desperately needed for AI, but few can define what this means (or how to do it). Is it about designing chat interfaces and chatbots? Is it about working with algorithms or vibe coding? Is it about using Replit and Bolt instead of Figma?"
""UX in AI" isn't an unimportant topic, but there might be a dozen different interpretations for what that means, such as: Designing interfaces for AI tools (ChatGPT's chat bubbles, Midjourney's prompt boxes) Doing "vibe coding" to turn mockups into fully developed applications Making AI outputs more explainable (translating outputs, adding explanations, and sources) Creating knowledge-based repositories (Second Brains/RAGs) Creating AI-powered personalization (Netflix recommendations, Spotify playlists) Building conversational experiences (chatbots, voice assistants) etc."
UX in AI has become a confusing buzzword despite clear needs for UX expertise in AI contexts. Multiple, disparate interpretations exist, including designing interfaces for AI tools, vibe coding to convert mockups into applications, improving explainability, building knowledge repositories, enabling personalization, and creating conversational experiences. Lumping these activities together produces a field too broad to master and leaves many designers unsure which skills to develop or what employers seek. Digital twin modeling offers a focused, actionable AI use case. The term "digital twin" originated at NASA, where the approach played a role in solving critical monitoring and maintenance challenges such as those faced during the Apollo 13 mission.
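To make the "digital twin" idea concrete, here is a minimal conceptual sketch in Python: a software object that mirrors a physical asset's state from incoming telemetry, in the spirit of the NASA example above. All names here (SpacecraftTwin, ingest, oxygen_level, and so on) are hypothetical illustrations, not anything defined in the article.

```python
from dataclasses import dataclass, field

@dataclass
class SpacecraftTwin:
    """Mirrors a physical spacecraft's live state from telemetry readings."""
    oxygen_level: float = 100.0   # percent of nominal
    power_output: float = 100.0   # percent of nominal
    history: list = field(default_factory=list)

    def ingest(self, reading: dict) -> None:
        """Update the twin's state from one telemetry reading."""
        self.oxygen_level = reading.get("oxygen_level", self.oxygen_level)
        self.power_output = reading.get("power_output", self.power_output)
        self.history.append(reading)

    def alerts(self) -> list:
        """Flag conditions that would need attention on the real asset."""
        issues = []
        if self.oxygen_level < 20.0:
            issues.append("oxygen critically low")
        if self.power_output < 50.0:
            issues.append("power degraded")
        return issues

# Ground control can replay telemetry against the twin and rehearse
# interventions without touching the real (possibly damaged) system.
twin = SpacecraftTwin()
for reading in [{"oxygen_level": 65.0},
                {"oxygen_level": 18.0, "power_output": 40.0}]:
    twin.ingest(reading)
print(twin.alerts())  # ['oxygen critically low', 'power degraded']
```

The point of the sketch is the pattern, not the specifics: the twin is a model kept in sync with reality, which is what makes it a focused, well-scoped AI use case compared with the catch-all interpretations listed earlier.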