January 23, 2026

We're Building on Rented Ground (And the Lease Is Expiring)

AI tools feel like magic, and we're building our workflows and products around them. But what happens when you can't code without them and they cost $3,000 a month?

AI · Software Engineering · Economics · Open Source · Independence

LLMs feel like magic. Claude Code writes a feature in minutes that would have taken hours. Cursor refactors your codebase while you watch. You ship faster, your team is more productive, and the cost is trivial—$20, maybe $200 a month.

So you build around it. You structure your workflow around AI-assisted development. Your team stops doing certain things manually because the AI handles them. New hires learn to code with Claude suggesting every function. The velocity is real.

But here's what we're not asking: What are we building on?

Node.js is open source. You can fork it, audit it, run it forever. C# is open. Python is open. Git is open. The tools we've built software on for decades are tools anyone can use, anyone can run, anyone can maintain indefinitely.

Claude Code is not. GPT-5.2 is not. These are proprietary systems, priced 90% below their actual cost, controlled by two or three companies burning venture capital to make you dependent on them.

And there's a second danger: applications with AI ingrained in them. Not just development tools, but products that need Claude API calls to function. When your application's core features depend on model responses, you're not just using a tool. You're building on infrastructure you don't control, that costs 10x what you're paying, run by companies with no profitable business model yet.

What We're Building On

The economics are simple. Anthropic revealed that some Claude Code users consume "tens of thousands in model usage on a $200 plan." Analysis of infrastructure costs shows these services are priced 87-95% below true cost. Sustainable pricing for professional developers would be $1,000-$3,000 per month.

This is Phase 1: land grab. Companies with no path to profitability yet are racing to make you dependent on their tools before competitors do. The models themselves are ephemeral—they cost billions to train but stay frontier for months, not years. The only durable moat is your dependency.

Traditional open source gave you independence. Fork the code, run it yourself, maintain it indefinitely. "Open source" AI models give you weights, not source. You can't audit what they learned, can't fix what's broken, can't guarantee they'll work the same way next month. Even the "open" models aren't open in any meaningful sense.

Danger 1: When You Can't Code Without Them

The first danger is skill atrophy. Not that you forget syntax—that's trivial. You lose the ability to debug without AI suggesting fixes. You stop building mental models of how systems work because the AI fills gaps. You can't architect from first principles because you've been accepting AI's architecture for six months.

An engineer who coded for ten years doesn't become incompetent overnight. But an engineer who learned to code with Claude Code never built that foundation. And an experienced engineer who leans on AI for every function, every refactor, every debug session—their edge degrades.

When pricing normalizes to $1,500/month and your startup can't afford it, can you still ship? When rate limits kick in mid-sprint, does your team slow to 1x or stop entirely?

The tool that makes you 10x faster is only valuable if you can still work at 1x without it.

Danger 2: When Your Product Can't Run Without Them

The second danger is runtime dependency. You're building products that call GPT-5.2 or Claude Sonnet in production. AI-powered search, document summarization, chat interfaces, code generation features—your product's value proposition includes "powered by AI."

Right now, API costs look manageable. You're paying $2.00 per million tokens for GPT-5-mini output. The real infrastructure cost is $6.37 per million tokens generated. When subsidies end and prices normalize, your unit economics break.

If your product generates 50 million output tokens a month (say, 100,000 API calls at roughly 500 tokens each), you're paying $100/month today. At true cost, that's about $318/month. If OpenAI needs a 50% margin to stay viable, you're looking at roughly $636/month, a 6x increase.
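
The scaling is easy to sanity-check yourself. Here is a minimal sketch of that arithmetic in plain Python; the per-million-token rates and the 50% margin multiplier are the illustrative figures from the paragraph above, not any provider's published pricing, so swap in your own volume and rates.

```python
# Rough cost model for embedded-AI unit economics. The rates below are
# the article's illustrative figures, not any provider's price sheet.

SUBSIDIZED_PRICE = 2.00   # $ per million output tokens, today's subsidized rate
TRUE_COST = 6.37          # $ per million output tokens, estimated infrastructure cost
MARGIN_MULTIPLIER = 2.0   # ~50% gross margin means charging roughly double the cost

def monthly_bill(tokens_per_month: int, price_per_million: float) -> float:
    """Dollar cost of a month's output tokens at a given per-million rate."""
    return tokens_per_month / 1_000_000 * price_per_million

tokens = 50_000_000  # e.g. 100,000 calls a month at ~500 output tokens each

today = monthly_bill(tokens, SUBSIDIZED_PRICE)
at_cost = monthly_bill(tokens, TRUE_COST)
sustainable = at_cost * MARGIN_MULTIPLIER

print(f"today:        ${today:,.0f}/month")
print(f"at true cost: ${at_cost:,.0f}/month")
print(f"sustainable:  ${sustainable:,.0f}/month ({sustainable / today:.1f}x today)")
```

The exact numbers matter less than the shape: whatever your real volume is, the multiplier applies to all of it.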

And you have no alternative. You can't fork GPT-5.2. You can't self-host Claude. Your product depends on infrastructure owned by companies that will charge whatever the market can bear once you're locked in.

The Gatekeeping Future

"AI democratizes engineering" is true right now. A solo developer with $200/month can build at the velocity of a small team. Someone learning to code can ship products that would have required years of experience.

But this is the subsidized phase. When prices normalize:

  • Big tech companies with enterprise volume deals continue at $150-300/seat
  • Indie developers face $1,000-$3,000/month to maintain development velocity
  • Startups with embedded AI need to 3-10x their infrastructure budget or rebuild without it
  • Engineers who can't afford the tools lose competitive edge against those who can

Software engineering becomes gated again. Not by universities or credentials, but by ability to pay three companies for access to proprietary systems. The great democratization becomes the opposite: only those who can afford the gatekeepers can build competitive software.

What We Should Do

Use AI but maintain core skills. Set aside time to code without assistance. Debug manually. Build mental models. Optimize for independence, not just velocity.

Focus on open models you can self-host. Mistral, OpenCoder, Qwen, DeepSeek—models you can run on your own infrastructure. They're not as capable as GPT-5.2 or Claude Opus today, but the gap is closing, and you control the deployment.
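
Self-hosting sounds heavier than it is. Here is a minimal sketch, assuming Ollama is serving on its default port and the model tag below (a placeholder) has already been pulled; any locally served open model works the same way.

```python
import requests

# Minimal call to a locally hosted open model via Ollama's HTTP API.
# Assumes `ollama serve` is running on the default port and the model
# tag below has been pulled; substitute whatever model you actually run.
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "qwen2.5-coder"  # placeholder model tag

def local_complete(prompt: str) -> str:
    """Send a single prompt to the local model and return its text output."""
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(local_complete("Write a Python function that reverses a string."))
```

Nothing in that snippet belongs to a vendor: the endpoint is on your machine, the weights are on your disk, and it keeps working at whatever price the hosted providers eventually settle on.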

Use provider-agnostic tools. Opencode, Aider with Ollama, anything that makes switching models trivial. Avoid workflow lock-in to proprietary ecosystems like Claude Code. The moment it's difficult to change providers, you've handed them pricing power.
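
One concrete way to keep that leverage is to hide the model behind a thin interface so the rest of your codebase never names a vendor. The sketch below is an illustrative pattern, not any specific tool's API: the Completer protocol, completer_from_env, and the environment variable names are all hypothetical, and the HTTP path assumes an OpenAI-compatible /chat/completions endpoint, which many hosted providers and local servers expose.

```python
import os
from typing import Protocol

import requests


class Completer(Protocol):
    """The only LLM surface the rest of the codebase is allowed to see."""
    def complete(self, prompt: str) -> str: ...


class OpenAICompatibleCompleter:
    """Talks to any OpenAI-compatible /chat/completions endpoint,
    hosted or self-hosted."""

    def __init__(self, base_url: str, model: str, api_key: str = "not-needed"):
        self.base_url = base_url.rstrip("/")
        self.model = model
        self.api_key = api_key

    def complete(self, prompt: str) -> str:
        resp = requests.post(
            f"{self.base_url}/chat/completions",
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={
                "model": self.model,
                "messages": [{"role": "user", "content": prompt}],
            },
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]


def completer_from_env() -> Completer:
    """Switching providers becomes a config change, not a rewrite."""
    return OpenAICompatibleCompleter(
        base_url=os.environ["LLM_BASE_URL"],  # hosted API, or e.g. http://localhost:11434/v1
        model=os.environ["LLM_MODEL"],
        api_key=os.environ.get("LLM_API_KEY", "not-needed"),
    )
```

With that seam in place, moving from a hosted frontier model to a self-hosted open one is a change to three environment variables, not a rewrite.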

Budget honestly. If your development process or product depends on AI, price it at sustainable rates: $1,000-$3,000/month per developer, 3-10x current API costs for embedded features. If the math doesn't work at real prices, you're building on subsidy, not business model.

Build with exit strategy. For embedded AI, architect so you can swap providers or fall back to simpler approaches. For development tools, ensure your team can ship—slower, but functional—without them.
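
For embedded features, the exit strategy can be literal code. The sketch below is hypothetical: summarize_document prefers the model when a completer is available and degrades to a crude extractive summary when the call fails, gets rate-limited, or is switched off for cost reasons.

```python
# Hypothetical feature with a non-AI fallback: the product still works
# (worse, but working) if the model call fails or becomes unaffordable.

def extractive_summary(text: str, sentences: int = 3) -> str:
    """Crude fallback: return the first few sentences verbatim."""
    parts = [s.strip() for s in text.split(".") if s.strip()]
    return ". ".join(parts[:sentences]) + "."

def summarize_document(text: str, completer=None) -> str:
    """Prefer the model; fall back to the simple path on any failure."""
    if completer is not None:
        try:
            return completer.complete(f"Summarize in three sentences:\n\n{text}")
        except Exception:
            # Rate limit, outage, or a budget cap enforced upstream:
            # degrade instead of breaking the feature.
            pass
    return extractive_summary(text)
```

The fallback output is worse than the model's, but the feature survives a pricing change or an outage, which is the point.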

The Bottom Line

AI is useful. The models work well. The productivity gains are real.

But we're building on rented ground priced 90% below cost, controlled by three companies with no sustainable business model yet, racing to make us dependent before competitors do.

The tools we've relied on for decades—languages, frameworks, databases—are ours to use indefinitely. These aren't.

Recognize the pattern. Maintain independence. Choose tools that preserve your ability to leave.

