UX design (from Medium, 4 days ago)
How reading patterns have changed
Primary button placement should align with left-to-right scanning patterns and evolving device-driven consumption habits rather than rigid left-or-right rules.
Your junior designer spins up a prototype in Lovable before lunch. Your PM shows you a "working" MVP built entirely with Cursor within a day. Your CEO forwards you a LinkedIn post about how AI will replace 80% of UI work by 2026. It seems like anyone can now make an app to solve a specific problem. Has the graphical interface really died, as Jakob Nielsen provocatively suggests?
UX is entering a new era. At the centre of every design conversation in 2026 lies a singular force: Artificial Intelligence. It is now so pervasive that even five-year-olds can explain its utility, while the AI natives of the new Beta generation are coming of age in a world where a conversational digital collaborator is not a feature but a baseline reality.
The majority of AI products remain tethered to a single, monolithic UI pattern: the chat box. While conversational interfaces are effective for exploration and managing ambiguity, they frequently become suboptimal when applied to structured professional workflows. To move beyond "bolted-on" chat, product teams must shift from asking where AI can be added to identifying the specific user intent and the interface best suited to deliver it.
Sam's issue: "After I signed up it made a git repo with no explanation and the only next step it suggested was to connect my domain, after that is done... what do i do?" This is a classic: no context, no guidance, no next steps. The industry data shows what's at stake:
- 77% of users abandon apps within 3 days (Source: Andrew Chen, a16z)
- Top-quartile onboarding achieves 2.5x higher customer lifetime value (Source: McKinsey)
- Getting users to their "aha moment" quickly is critical for retention
Visitor = your user / audience. They come with expectations, emotions, and limited time.
Access = your entry points. Landing pages, SEO/social entry, app open, onboarding, login, paywall moments, notifications - everything that determines whether they can enter smoothly.
Zones/Platforms = contexts. Home, feed, article page, video page, search, profile, commerce, chat - each is a "zone" with a promise.
Rides/Attractions = features. Recommendation modules, player, comment, follow, save, share, checkout, personalization, bundles - anything interactive that creates value.
I am a UX designer, which means I can no longer use the internet without noticing everything that is wrong with it. This article is about UX patterns that are frustrating, widely adopted, and somehow still treated as acceptable at massive scale. If you have never noticed them, consider yourself lucky. Once you do, they become impossible to unsee. Your tolerance for digital nonsense may permanently decrease after reading this article. That is your warning!
One predictable pain point with contrast-color() is that it only returns the named colors black and white. From a design systems perspective, that's not ideal because you want your colors. You want your harmonious brand and the colors you and your team spent thousands of hours in meetings deciding on. Those colors. In fact, an earlier version of Safari had color-contrast() (confusing, I know - naming is hard), which allowed you to pass in a list of candidate colors to choose from. I believe that proposal got mired in standards discussions, color-contrast algorithms, and competing proposals; contrast-color() is what survived, simplified down to a binary black-or-white result.
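To make the binary behavior concrete, here is a minimal sketch of contrast-color() as specified in CSS Color 5. The custom property and class name are illustrative, and browser support is still limited, so treat this as a demonstration of the syntax rather than production-ready code:

```css
/* Hypothetical brand token; contrast-color() picks whichever of
   black or white contrasts better with the given color - you cannot
   supply your own candidate palette, unlike the earlier
   color-contrast() proposal. */
.badge {
  --brand: oklch(55% 0.2 260);
  background: var(--brand);
  /* For a dark brand color like this, the result will typically be white */
  color: contrast-color(var(--brand));
}
```

If you need the result drawn from your own palette, you currently have to compute contrast yourself (in a preprocessor or at build time) rather than rely on this function.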
Did you know you can tag Figma in a ChatGPT chat and prompt it to do design work? In this article, I want to share my top 4 favorite use cases for Figma right in the ChatGPT chat window. 1. Instant design critique for real screens. Want to know what other people think about your design, but don't have access to real users? No problem - you can use ChatGPT for that.
I've lost count of the number of times I've tried to explain, in practice, the success criteria of the Web Content Accessibility Guidelines, better known by the acronym WCAG. Just as often, I've seen the WCAG guidelines presented in many contexts (articles, lectures, webinars, etc.) as a reference to be shared with teams so they can begin implementing and correcting the accessibility of their digital products and services.
"You know, having those conversations early on, reaching out to people in different departments... that was really hard when I didn't have much confidence." A VP of Design brought this up recently, reflecting what many designers are facing. There's been a crisis of confidence in design, and it's happening all across the career ladder. Due to shrinking budgets and layoffs, more designers are being forced to work solo.
But now, Bing might be testing a more retro-looking interface for the local pack in the Bing search results. This was spotted by Frank Sandtmann, who posted a screenshot on LinkedIn. Frank wrote, "Bing appears to be testing a new (?!) design for its Places results. With the small map, it looks a bit retro." I cannot replicate this; I've been trying with several browsers over the past few days but was unsuccessful.
Software used to feel separate from us. It sat behind the glass, efficient and obedient. Then it fell into our hands. It became a thing we pinched, swiped, and tapped, each gesture rewiring how we think, feel, and connect. For an entire generation, the connection to software has turned the user experience into human experience. Now, another shift is coming. Software is becoming intelligent. Instead of fixed interactions, we'll build systems that learn, adapt, and respond.
We're witnessing the birth of a new kind of designer: The AI Designer. Designers who work in evals, prompts, and tool calls. Designers who have as much of a taste for models as they do for fonts. Designers who think in mental models, agents, and intelligence.
AI design tools are everywhere right now. But here's the question every designer is asking: Do they actually solve real UI problems - or just generate pretty mockups? To find out, I ran a simple experiment with one rule: no cherry-picking, no reruns - just raw, first-attempt results. I fed 10 common UI design prompts - from accessibility and error handling to minimalist layouts - into 5 different AI tools. The goal? To see which AI came closest to solving real design challenges, unfiltered.
Information allows us to act more skillfully. Imagine you come to a fork in the road. Without a sign, you'd need a compass or a great sense of direction to choose correctly. But with a clear sign, you'd quickly know which road to take. The sign reduces ambiguity. The Moylan arrow, too, disambiguates a choice. Pulling in on the wrong side of the pump is an annoying inconvenience.
What's coming into sharper focus isn't fidelity, it's foresight. Part of the work of Product Design today is conceptual: sensing trends, building future-proof systems, and thinking years ahead. But beyond the current momentum, we still have to focus on real problems that deliver real value now. This balance is sometimes challenging, but it also creates opportunities to reform our thinking and approaches.
In 2016, I presented at @Roblox Indie Game Developer Meetup about design strategy as an indie developer. Back then, I had no idea children as young as 5 were interacting with random adults on their platform. Today, the same company (NYSE: $RBLX) is filled with poorly moderated "games" like Bathroom Simulator and worse - all while letting adults animate their avatars for sexual role play.
The federated model suggests that design system work can be distributed across multiple teams without a central authority. It sounds democratic. It sounds efficient. It sounds empowering. In practice, it creates an ownership vacuum. Who's responsible for defining the architecture of the design system? Who establishes and evolves the processes needed to scale? Who ensures quality and consistency? Who maintains the infrastructure on which the system depends? Who deals with the unknown challenges that will inevitably arise?
In this context, trust is not just an emotional response. It is about system reliability, the confidence that an AI assistant will behave predictably, communicate clearly, and acknowledge uncertainty responsibly. In healthcare, that reliability is not optional. Even when AI performs well, people still hesitate. They ask: Can I rely on this? Does it really understand me? What happens if it's wrong?
Greetings. Can someone please tell me how to get the image on the back of the flip card to fill the card like the image on the front of the card? See attachment with identical image on front and back. I have read the help documentation but did not find the information needed. Basically, I need the images on the front and back to be the same dimensions. Thank you.
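Without seeing the attachment, one common cause is that the front and back faces are not sized identically. A minimal sketch of a standard CSS 3D flip card, assuming illustrative class names (.card, .inner, .front, .back) rather than whatever the original markup uses:

```css
/* Flip-card sketch: the fix is giving .front and .back identical
   absolute sizing, then letting each image fill its face. */
.card {
  position: relative;
  width: 300px;
  height: 200px;
  perspective: 800px;
}
.card .inner {
  position: absolute;
  inset: 0;
  transform-style: preserve-3d;
  transition: transform 0.6s;
}
.card:hover .inner {
  transform: rotateY(180deg);
}
.front,
.back {
  position: absolute;
  inset: 0;               /* both faces get the exact same box */
  backface-visibility: hidden;
}
.back {
  transform: rotateY(180deg);
}
.front img,
.back img {
  width: 100%;
  height: 100%;
  object-fit: cover;      /* both images fill the card identically */
}
```

The key pieces are `inset: 0` (so both faces share the card's dimensions) and `object-fit: cover` on both images, so neither image dictates its own size.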
At 90 years old, the "Godfather of UX" isn't slowing down. Don Norman shares why our industry must shift from Human-Centered Design (HCD) to a more systemic, Humanity-Centered (HCD+) approach (image source: Yeo) The Transit That Changed Everything It's a true story: three years ago, a fortunate coincidence brought Don Norman to Singapore for a transit while travelling from San Diego to Shanghai. At that exact moment, Singapore Polytechnic was hosting its annual Design Thinking and User Experience (DT | UX) Summit.
It's also important to know that some of the vibes come with intentional signalling. Plenty of people whose views you can find online have a financial interest in one product over another, for instance because they are investors in it or they are paid influencers. They might have become investors because they liked the product, but it's also possible that their views are affected and shaped by that relationship.
AI tools are now embedded across almost every stage of product design. We use AI to generate ideas, summarize research findings, explore visual directions, write UX copy, and even ship working prototypes. Yet despite widespread adoption, many teams still struggle with a key question: How do you integrate AI into the design process without weakening design quality?