AI
7 min read
January 23, 2026

Why "Just Use AI" Is the New "Just Use a Framework" — And Equally Dangerous


Segev Sinay


Frontend Architect


Ten years ago, the answer to every frontend question was "just use a framework." Don't know how to manage state? Use Redux. Need routing? Use React Router. Want animations? Use a library. Building a form? There's a package for that.

Today, the refrain has shifted: "Just use AI."

Don't want to write that component? Ask AI. Struggling with TypeScript types? Let AI figure it out. Need tests? AI can generate them. Debugging a tricky issue? Paste it into Claude.

And just like "just use a framework" led to an entire generation of developers who couldn't build anything without npm install, "just use AI" is creating a generation that can't build anything without a prompt.

This is a problem. A serious one.

The Framework Trap: A History Lesson

Let me take you back to 2016-2018. The JavaScript ecosystem was exploding. New frameworks every week. The community's response to almost any challenge was to add another dependency.

The result? Applications with 1,200 npm packages for a simple CRUD app. Developers who couldn't write a fetch request without Axios. Teams that couldn't implement a dropdown without a 50KB library. Build times measured in minutes because of dependency bloat.

Worse, when things broke — and they always broke — developers couldn't debug because they didn't understand what the libraries were actually doing under the hood. They'd installed abstractions on top of abstractions, and when the bottom layer had a bug, they were helpless.

We eventually learned. The pendulum swung back. The industry started valuing developers who understood fundamentals. "Use the platform" became a rallying cry. Vanilla JS gained respect. Understanding how things work under the hood became a differentiator.

The AI Trap: Same Pattern, Different Drug

Now watch the exact same pattern play out with AI:

Phase 1: Discovery — "Wow, AI can write code! This is incredible!"

Phase 2: Over-reliance — "Why would I write this myself when AI can do it?"

Phase 3: Atrophy — Developers lose the ability to solve problems independently.

Phase 4: Crisis — Something breaks that AI can't fix, and nobody on the team can either.

Phase 5: Correction — The industry re-values fundamental skills.

We're currently somewhere between Phase 2 and Phase 3. I see it constantly in my consulting work.

Real Stories from the Trenches

Story 1: The TypeScript Cargo Cult

A team I worked with had been using AI to generate all their TypeScript types. The types were technically correct — they compiled without errors. But they were a mess. Redundant interfaces, unnecessary generics, inconsistent naming, types that didn't reflect the actual domain model.

When I asked the team to explain their type hierarchy, nobody could. They'd been accepting whatever AI generated without understanding the design decisions behind it. When they needed to refactor their data model, they couldn't, because nobody understood the existing types well enough to evolve them intentionally.
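The drift is easy to recognize once you name it. Here is a hypothetical sketch of what it looks like (the interface and field names are illustrative, not from that team's codebase): AI-generated call sites redeclare the same shape under different names, while a deliberate model keeps one source of truth and derives narrower views from it.

```typescript
// AI-style accumulation: the same shape redeclared at each call site,
// so renaming a field means hunting down every near-duplicate.
interface UserResponse { id: string; name: string; email: string }
interface UserData     { id: string; name: string; email: string }

// Deliberate modeling: one domain type, with narrower views derived
// from it, so a refactor has a single point of change.
interface User {
  id: string;
  name: string;
  email: string;
}

type UserSummary = Pick<User, "id" | "name">;

function toSummary(user: User): UserSummary {
  return { id: user.id, name: user.name };
}
```

The second version is what "understanding the design decisions" buys you: when the data model changes, you edit `User` once and the compiler walks you through the rest.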

Story 2: The Debug Black Hole

A developer spent three days trying to fix a bug by repeatedly pasting error messages into AI and trying whatever it suggested. The AI kept suggesting increasingly complex solutions — adding error boundaries, restructuring state, rewriting components. None of it worked, because the actual bug was a missing key prop on a rendered list, which caused a subtle re-mounting issue.

A developer who understood React's reconciliation algorithm would have spotted this in 30 minutes. But this developer had never needed to understand reconciliation because AI always wrote their components for them.
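To make that failure mode concrete, here is a toy model of the keyed matching step in reconciliation — a deliberate simplification for illustration, not React's actual implementation. Children are paired by key; a matched key reuses the existing child (and its local state), while an unmatched key mounts fresh. With index-based keys (React's fallback when keys are missing), prepending one item shifts every index and mispairs the whole list.

```typescript
type Child = { key: string | number; data: string };

// Simplified keyed diff: match next children to previous ones by key.
// A matched child is reused (keeping its local state); an unmatched
// key mounts fresh, losing any state it held.
function reconcile(prev: Child[], next: Child[]): string[] {
  const prevByKey = new Map(prev.map((c) => [c.key, c]));
  return next.map((c) => {
    const match = prevByKey.get(c.key);
    if (!match) return `${c.data}: fresh mount`;
    if (match.data !== c.data) return `${c.data}: reused with stale state`;
    return `${c.data}: reused correctly`;
  });
}

// Prepend "a" to a list of ["b", "c"].
// With index keys, every index shifts: "a" and "b" inherit the wrong
// children's state, and the tail remounts.
const indexKeyed = reconcile(
  [{ key: 0, data: "b" }, { key: 1, data: "c" }],
  [{ key: 0, data: "a" }, { key: 1, data: "b" }, { key: 2, data: "c" }],
);

// With stable ids as keys, only the genuinely new row mounts.
const idKeyed = reconcile(
  [{ key: "b", data: "b" }, { key: "c", data: "c" }],
  [{ key: "a", data: "a" }, { key: "b", data: "b" }, { key: "c", data: "c" }],
);
```

That mispairing is exactly the kind of "subtle re-mounting issue" that no amount of error-message pasting will surface, because nothing throws — the UI is just quietly wrong.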

Story 3: The Performance Mystery

A startup's app was slow. They asked AI to optimize it. AI suggested memoization, lazy loading, and code splitting. They implemented all of it. The app was still slow.

The actual problem was that their GraphQL queries were over-fetching by 10x, pulling entire user objects when they only needed names. No amount of frontend optimization could fix a backend data fetching problem. But the team had been so focused on asking AI to fix the frontend that they never investigated the actual root cause.
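The fix lived in the query, not the components. A hypothetical sketch of the before and after (all field names are illustrative): in GraphQL the client controls the payload shape, so over-fetching is a selection-set problem that no memoization can compensate for.

```typescript
// Hypothetical over-fetching query: the view only renders member names,
// but this drags whole user objects across the wire for every member.
const overFetching = /* GraphQL */ `
  query TeamList {
    team {
      members {
        id
        name
        email
        avatarUrl
        settings { theme locale notifications }
        recentActivity { id type createdAt }
      }
    }
  }
`;

// Select only what the component actually renders.
const lean = /* GraphQL */ `
  query TeamList {
    team {
      members { name }
    }
  }
`;
```

Thirty seconds in the network tab — comparing response size to what the screen displays — would have pointed at the root cause faster than any frontend-side optimization.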

Why "Just Use AI" Is Dangerous

1. It Erodes Problem-Solving Skills

Every time you ask AI to solve a problem instead of solving it yourself, you lose an opportunity to build mental models. Mental models are what separate senior developers from juniors. They're the internal understanding of how systems work that allows you to diagnose problems, predict behavior, and design solutions.

You can't build mental models by reading AI output. You build them by struggling with problems, making mistakes, and understanding why things work the way they do.

2. It Creates a False Sense of Competence

Shipping features quickly feels like competence. But if you can't explain how your code works, debug it when it breaks, or modify it when requirements change — you're not competent. You're a prompt engineer with a GitHub account.

3. It Hides Complexity

AI makes hard things look easy. That's dangerous, because the difficulty of something is an important signal. When a feature is hard to implement, that difficulty often reflects genuine complexity in the problem domain. Hiding that complexity behind AI-generated code doesn't eliminate it — it just delays the reckoning.

4. It Undermines Code Review

How do you review code that you couldn't have written yourself? If the reviewer doesn't understand the code deeply enough to have written it, they can't effectively evaluate it. AI-generated code that passes review by developers who don't fully understand it is a ticking time bomb.

The Right Relationship with AI

I'm not saying don't use AI. I use it extensively. But I use it the way a chef uses a food processor — to speed up preparation tasks that I already know how to do by hand.

Use AI to accelerate, not to replace understanding.

Before you ask AI to generate something, ask yourself: "Could I write this myself if I had to?" If the answer is no, you have a learning opportunity — don't skip it by outsourcing to AI.

Use AI to handle the boring parts, not the hard parts.

The boring parts are boilerplate, repetitive code, standard patterns. Let AI handle those. The hard parts — architecture, debugging complex issues, performance optimization — are where you need to engage your brain. Those are also where you build the expertise that makes you valuable.

Use AI to learn, not just to produce.

When AI generates code, read it. Understand it. Ask AI to explain why it made certain choices. Use it as a teaching tool, not just a production tool. The best developers I work with use AI as a very fast pair programming partner, not as a code generation service.

The Fundamental Truth

Here's what hasn't changed despite all the AI hype: the most valuable skill in software development is the ability to understand and solve complex problems. Tools change. Languages change. Frameworks change. AI capabilities change. But the ability to reason about systems, make good decisions under uncertainty, and debug the unexpected — that's timeless.

"Just use AI" without understanding what AI is doing is exactly as dangerous as "just use a framework" without understanding what the framework is doing. Both lead to developers who are productive in ideal conditions and helpless when things go wrong.

Things always go wrong.

Build your skills. Use AI as a force multiplier for those skills. Don't let it become a crutch that replaces them.

