---
title: What I Learned Building Software with AI
date: 2026-03-27
author: Konrad Lother
excerpt: I built DearDiary with an AI coding assistant. Here's what actually happened.
---

# What I Learned Building Software with AI

I built a full-stack application using an AI coding assistant. Let me tell you what it's actually like.

No, I'm not going to tell you it's magical. No, I'm not going to tell you it replaced my job. It's more complicated than that.

## The Setup

DearDiary: Bun + Hono backend, React frontend, SQLite, Docker. Nothing exotic. I gave the AI context about the project, set it loose, and watched what happened.

## The Problems

Here's what actually went wrong:

### 1. The Invisible Gaps

AI is excellent at systematic changes. Change `DATABASE_URL` to `BACKEND_DATABASE_URL` everywhere? Done. Except it missed:

- The Prisma schema
- Test helpers
- A healthcheck config

The app crashed. The error message was cryptic. It took time to find the missing env var.

The problem: AI makes systematic changes, but it doesn't know what it doesn't know. The gap is invisible to it.
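One mitigation, sketched here with made-up variable names rather than DearDiary's actual config: validate required variables at startup, so a missed rename fails immediately with the variable's name instead of crashing cryptically somewhere downstream.

```typescript
// Fail fast: name exactly which variables are missing at startup.
// The variable names used by callers are illustrative, not DearDiary's.
function assertEnv(
  required: string[],
  env: Record<string, string | undefined>,
): void {
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
}

// At startup, before anything opens the database:
//   assertEnv(["BACKEND_DATABASE_URL", "JWT_SECRET"], process.env);
```

A check like this turns "the app crashed with a cryptic error" into a one-line message naming the gap.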

### 2. Routes That Should Exist But Don't

"Fixed the routes by moving them to a separate file." Except the routes were mounted at `/api/v1/events` while the frontend called `/events`. The AI didn't catch the mounting-path mismatch.
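One way to keep the two sides from drifting, sketched with illustrative names rather than DearDiary's real code: define the mount prefix once and derive both the backend mount and the frontend request URLs from it.

```typescript
// Shared constant: both sides build paths from one prefix.
// Names are illustrative, not DearDiary's actual code.
const API_PREFIX = "/api/v1";

// Backend: mount the events router under the prefix, e.g. with Hono:
//   app.route(`${API_PREFIX}/events`, eventsRouter);

// Frontend: build request URLs from the same constant.
function apiUrl(path: string): string {
  return `${API_PREFIX}${path}`;
}
```

With a shared constant, a call like `apiUrl("/events")` always targets `/api/v1/events`; the mismatch class disappears.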

### 3. "I Fixed That"

More than once, the AI said "Fixed!" and showed the corrected code, but the file on disk was unchanged. Other times it described a fix it never implemented.

This is the most dangerous failure mode. Confidence without execution.

### 4. Docker Permissions

Entrypoint scripts kept failing with "permission denied." The AI knew about `chmod +x`. The order was wrong: the file was copied after the chmod ran, or Docker's layer cache served a stale version.

AI knows facts. Execution order matters.
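The ordering fix, sketched with illustrative paths: set the executable bit as part of the copy itself, so no later layer can clobber it.

```dockerfile
# BuildKit's --chmod sets permissions during COPY, so there is no
# separate chmod step to get out of order. Paths are illustrative.
COPY --chmod=755 docker/entrypoint.sh /usr/local/bin/entrypoint.sh
ENTRYPOINT ["/usr/local/bin/entrypoint.sh"]
```

And when a stale cached layer is suspected, `docker compose build --no-cache` forces a clean rebuild.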

### 5. The Real Problem Wasn't Code

Events returned 404. Routes: correct. Mounting: fixed. Auth: fixed.

The actual problem: port 3000 on the host mapped directly to the backend, not through nginx. The frontend (served by nginx) couldn't reach the API.

The AI focused on code layers. The problem was infrastructure configuration.
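The shape of the fix, with illustrative service names: publish only nginx to the host, keep the backend on the internal network, and let nginx proxy API requests to it (e.g. an nginx `location /api/ { proxy_pass http://backend:3000; }` block).

```yaml
# docker-compose sketch: nginx is the only service with a host port.
services:
  nginx:
    ports:
      - "3000:80"   # all host traffic enters through nginx
  backend:
    expose:
      - "3000"      # reachable only from other services, not the host
```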

## What Actually Worked

Core features worked on the first try. The AI understood the patterns, and once we established how things worked, it stayed consistent with them.

The README updates, documentation, refactoring: all smooth after the initial chaos.

## The Communication Patterns That Matter

### Be Specific About Failures

Don't say "it doesn't work."

Say: "The events endpoint returns 404, and `docker logs` shows the route is registered."

More context = better fix.

### Ask for Verification

"Show me the exact changes before committing."

This catches the "I said I fixed it" problem.

### Break It Down

Instead of "consolidate all env vars," we did it in stages:

  1. List current env vars
  2. Decide naming convention
  3. Update backend
  4. Update frontend
  5. Update docker-compose
  6. Verify
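Step 6 can be mechanical. A sketch, with illustrative names, that operates on in-memory contents; in a real repo you would read the files from disk or simply run `grep -rn DATABASE_URL .`:

```typescript
// Report which files still reference the old variable name.
// The word-boundary guard keeps the new name (BACKEND_DATABASE_URL)
// from counting as a leftover occurrence of the old one.
function findLeftovers(
  files: Record<string, string>,
  oldName: string,
): string[] {
  const pattern = new RegExp(`(?<![A-Za-z0-9_])${oldName}(?![A-Za-z0-9_])`);
  return Object.entries(files)
    .filter(([, contents]) => pattern.test(contents))
    .map(([path]) => path);
}
```

A check like this would have flagged the Prisma schema from problem 1 before the app ever crashed.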

### State What Works

"The previous approach worked with `docker compose build && docker compose up -d`."

Context about what has succeeded helps the AI avoid untested solutions.

## The Real Insight

Building with AI is like working with a very knowledgeable junior developer who:

- Has read every Stack Overflow post
- Writes code faster than you can type
- Sometimes confidently does the wrong thing
- Needs supervision, especially for cross-file changes
- Gets better with clearer instructions

Your job becomes managing the AI, not just writing code.

## What I'd Tell Someone Else

AI is a tool. Like any tool, it has strengths and weaknesses.

Use it for:

- Pattern matching
- Speed on routine tasks
- Generating boilerplate
- Explaining unfamiliar code

Don't use it for:

- Complex multi-file changes without verification
- Infrastructure configuration without checking
- Anything you don't understand yourself

The future isn't "AI replaces developers." It's "developers who use AI replace developers who don't."

Just keep an eye on those environment variables.