This looks vibe coded
2026-04-27
You walk into a demo. The UI is clean. The flows are logical. The animations are smooth. And the prospect tilts their head and says: "This looks… vibe coded."
What they're really telling you is: I don't trust this. This feels like a template. I can't tell if a human being who understands my problem actually built this, or if someone typed a paragraph into Cursor and shipped whatever came out.
Over 25% of Y Combinator's W25 batch shipped codebases that were ~95% AI-generated. Vibe coding tools like Lovable, Bolt, and Replit Agent made it possible for a solo founder to go from idea to deployed app in a weekend. The barriers to building collapsed.
And now customers, especially enterprise buyers, power users, and anyone who's been burned by demo-ware, can smell it. The generic card layouts. The suspiciously perfect spacing that somehow lacks rhythm. The features that work in the happy path and disintegrate the moment you go off-script.
"Vibe coded" has become shorthand for: fast, cheap, and not deeply thought through.
If your customers are saying it, or worse, thinking it, you have a positioning problem, a craft problem, and a trust problem.
Why the Perception Exists
Vibe coding excels at producing what looks like a product. Polished front-ends that mask hollow back-ends. In 2025, startups began shipping "investor demos" directly to customers, and customers noticed. The buttons were beautiful. The edge cases were catastrophic.
Worse, when everyone uses the same AI models with similar prompts, products converge toward the same aesthetic and interaction patterns. There's a sameness to AI-generated UIs, the same shadowed cards, the same sans-serif hierarchies, the same carnival of colors on badges and status displays. Customers toggle between three competitors and can't tell them apart.
Compare this to Linear, which built its project management tool with obsessive attention to keyboard shortcuts, transitions, and information density. Every pixel carries an opinion about how engineering teams should work. That's a team with deep domain conviction making thousands of deliberate micro-decisions.
The most damaging thing about vibe-coded products isn't the code quality, it's what the code reveals about how well the team understands the problem. When you vibe code a solution, you're outsourcing decision-making to a model that has no context about your user's workflow, organizational politics, or the specific friction that makes them lose 45 minutes every Thursday afternoon.
Stripe didn't win because it had better payment forms. It won because the team understood that the real problem was developer experience friction, and they built APIs, documentation, and dashboard tooling that encoded how engineers actually think about money flows.
That understanding has to be earned, through thousands of support tickets read, dozens of user interviews conducted, and watching real people struggle with real workflows until you internalize their frustrations as your own.
The Gap Is Product Craft
Product craft is the discipline of making intentional decisions at every layer of the product experience, the difference between a product that works and a product that feels like someone gives a damn.
Here's what vibe-coded products systematically lack:
Opinion
Great products are opinionated. They make choices that exclude certain users in order to deeply serve others. Basecamp is famously opinionated about simplicity, no Gantt charts, no resource leveling, no enterprise SSO for years. That opinion is the product. AI-generated products, by default, are agreeable. They include everything the prompt mentions (and more!) and have a point of view about nothing.
Design Language
There's a difference between using a component library and having a design language. A design language reflects a coherent philosophy, how information is prioritized, how actions are weighted, how the product communicates state and consequence.
Arc Browser built a design language around the idea that the browser should disappear and let the web breathe. Every decision, the collapsible sidebar, the spaces, the boost feature, reinforced a single thesis. AI generates components. It doesn't generate coherence.
Interactions
Every micro-interaction, hover state, loading sequence, error message, and transition tells the user something true about the system's state and the designer's care.
When Stripe rebuilds a dashboard element, they obsess over the animation curve of a data table loading. The accumulation of these decisions creates a feeling of solidity.
Look at a product's error states. If they guide the user, explain what happened, and suggest a next step, someone who understands the user built this.
Take Granola, the AI notepad that blew up in early 2026. When a meeting transcript fails to capture properly, the app doesn't just throw an error. It shows you what it did capture, explains why the gap exists (microphone permissions, a dropped connection, a speaker too far from the laptop), and gives you a one-click path to patch the notes manually.
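The pattern behind that behavior is worth naming: surface what partially succeeded, a plain-language cause, and a one-click recovery path, instead of a bare "Error". A minimal TypeScript sketch of the idea (the failure codes, type names, and copy here are illustrative, not Granola's actual API):

```typescript
// A structured error state: what succeeded, why the gap exists, what to do next.
type CaptureErrorState = {
  captured: string[]; // segments the system *did* get
  reason: string;     // plain-language cause, not an error code
  nextStep: string;   // a concrete recovery path
};

// Map a low-level failure code to a user-facing explanation and recovery.
function explainCaptureFailure(
  capturedSegments: string[],
  failureCode: "mic_permission" | "dropped_connection" | "speaker_distance",
): CaptureErrorState {
  const reasons: Record<string, string> = {
    mic_permission: "Microphone access was denied partway through the meeting.",
    dropped_connection: "The connection dropped, so part of the audio was lost.",
    speaker_distance: "A speaker was too far from the laptop to transcribe clearly.",
  };
  return {
    captured: capturedSegments,
    reason: reasons[failureCode],
    nextStep: "Review what was captured and patch the notes manually.",
  };
}
```

The craft is entirely in the copy and the recovery path; the code is trivial, which is exactly why a model won't produce the right version on its own.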
Information Architecture
The hardest product work isn't building features, it's deciding where things go and what to call them. Notion's sidebar, Airtable's bases-tables-views hierarchy, Figma's layers panel, these aren't just UI containers. They're maps of how users think about their work.
This kind of architecture requires research, iteration, and the willingness to throw away structures that test well on paper but fail in practice. AI defaults to whatever structure its training data suggests. Users feel the mismatch instantly.
Build Products That Can't Be Dismissed
Principle 1: Start with the Workflow, Not the Wireframe
Before you generate a single screen, map the user's actual workflow. Not the idealized version. The messy, interrupt-driven, context-switching reality.
Loom didn't start with a video recording UI. They started with the observation that explaining something over text takes 10x longer than showing it, and that people were already doing clumsy screen recordings with QuickTime and uploading to Google Drive. The product was shaped by the workflow gap, not by a feature list.
Principle 2: Design the Seams, Not Just the Surface
The "vibe coded" accusation sticks when the surface is polished but the seams are raw. Seams are where features meet: the handoff between onboarding and first use, the transition from free to paid, the moment a user's data model outgrows the default setup.
Calendly understood this. The product's magic isn't the scheduling page, it's how effortlessly a meeting type connects to a calendar, respects buffer times, handles timezone logic, and pushes to the right integration. The seams are the product.
Principle 3: Build an Opinion Stack
Every product decision should trace back to a core opinion about how the world works or should work. Vercel is the clearest recent example of an opinion stack made legible through product decisions.
- Worldview: "Most UI code in the next decade will be generated, not written."
- Belief: "Whoever owns the primitives the AI generates into, owns the aesthetic of the web."
- Principle: "Control the full path from prompt to production: model layer, component layer, deploy layer."
- Decision: "Ship `npm i ai` as the open model-agnostic SDK. Build v0 as the generation surface. Hire shadcn and make shadcn/ui the default registry v0 generates into. Host it all on Vercel."
Together, these decisions form a coherent opinion about the future of UI development.
Principle 4: Use AI to Explore, Humans to Decide
The highest-leverage use of AI in product development is divergent exploration, generating 50 variations of an interaction pattern, stress-testing copy, simulating edge cases, producing design options at a speed that lets the team evaluate more possibilities than they ever could manually.
But the convergent decisions, which variation ships, which copy resonates, which edge case matters most, must be made by people who understand the user, the market, and the long-term product vision.
Figma's 2025 framework for design systems in the AI era gets this exactly right: systems should be "carriers of craft" that guide AI outputs without losing the nuance that makes a product feel human. Encode your taste into the system. Let AI execute within those guardrails.
Principle 5: Invest in the Moments That AI Can't Generate
There are product moments that no model can produce because they require judgment under ambiguity:
- What do you show a user who has imported 10,000 records and none of them are formatted correctly?
- How do you communicate that a feature is intentionally limited, and why that limitation is a strength?
- What's the right level of friction in an upgrade flow to signal value without creating resentment?
Allocate your best design and product thinking to these inflection points.
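The first of those moments, the 10,000 badly formatted records, makes the point concrete: the craft decision is to report partial success with specific examples and a recovery path rather than an all-or-nothing failure. A hypothetical TypeScript sketch (all names and copy are illustrative):

```typescript
// Summarize a bulk import so the user sees the aggregate outcome,
// a few concrete failures, and a recovery path, never a bare "import failed".
type ImportSummary = {
  total: number;
  valid: number;
  invalid: number;
  sampleErrors: string[]; // a handful of human-readable failures
  nextStep: string;
};

function summarizeImport(
  rows: { line: number; error?: string }[],
  sampleSize = 3,
): ImportSummary {
  const failures = rows.filter((r) => r.error);
  return {
    total: rows.length,
    valid: rows.length - failures.length,
    invalid: failures.length,
    sampleErrors: failures
      .slice(0, sampleSize)
      .map((r) => `Row ${r.line}: ${r.error}`),
    nextStep:
      failures.length === rows.length
        ? "No rows were imported. Download the annotated file, fix the format, and retry."
        : "Imported the valid rows. Review the skipped rows below.",
  };
}
```

Which branch of `nextStep` to show, and what the annotated file should look like, are exactly the judgment calls under ambiguity that no model will make well without you.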
When You Can Go a Million Miles an Hour, Your Direction Matters Most
In April 2026, it is hard to miss the hype around AI coding. You can't go a day on X without seeing claudemaxxing techbros flaunting the day/night shifts they run on their fleets of 20 AI agents, and the hours of autonomous coding they rack up.
For me, there is a reason they call it product-market fit. There has to be a market, a problem, for what you are building. Discovering that problem ("building the right thing") and designing the best solution ("building the thing right") are still going to be the limiters.
No AI agent is going to tell you that.