My Methodology

Direction first. Specs when earned.

Most AI development fails because people spec the wrong thing. I start with direction — North Stars that guide decisions — then let specifications emerge as tools, not prerequisites.

The Problem

Vibe Coding vs. Direction-Driven Development

What I don't do

Vibe Coding

  • Improvised, intuition-driven prompts
  • Hidden assumptions in every request
  • Unpredictable, untestable results
  • "Hope-based" debugging strategies
  • Context lost between sessions
  • No systematic approach to complexity
  • Delivery dates are guesswork

What I do

Direction-Driven Development

  • North Stars define the "why" before the "what"
  • Goals with obstacle anticipation (WOOP methodology)
  • Staged progression: Discovery → Design → Implementation
  • Specs emerge from working code, not upfront
  • Dynamic task generation from codebase reality
  • Transition Kit that survives stack changes
  • 90 days, guaranteed
The Framework

North Stars & Goals

Direction-Driven Development starts with why, not what. Before any specification, I establish the direction that guides every decision.

North Stars

The guiding direction — what you're ultimately trying to achieve. Not a task, not a feature. The reason the product exists.

  • "Users trust their data is safe"
  • "Founders validate ideas before building"
  • "Teams ship without decision fatigue"

Goals (WOOP)

Concrete objectives under each North Star, built with obstacle anticipation:

  • Wish — What you want to achieve
  • Outcome — How you'll know it's done
  • Obstacles — What might block you
  • Plan — "If [obstacle], then [response]"
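As a sketch, a WOOP goal under a North Star could be captured in a small YAML file. The file layout and field names here are illustrative assumptions, not a prescribed format:

```yaml
# goals/trust.yml — hypothetical layout; all field names are illustrative
north_star: "Users trust their data is safe"
goals:
  - wish: "Encrypt all user data at rest"
    outcome: "Every table passes an at-rest encryption audit"
    obstacles:
      - "Legacy tables use an unencrypted storage engine"
    plan:
      - if: "A legacy table can't be migrated in place"
        then: "Shadow-write to an encrypted copy, then cut over"
```

The If-Then pairs under `plan` are the obstacle-anticipation piece: each identified blocker gets a pre-committed response.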

Stage Progression

Discovery

Jobs-to-be-Done

Design

User Stories → Features

Implementation

Read → Plan → Code → Validate

SPECs apply to implementation. Discovery and design come first — or you're specifying the wrong thing.

The Nuance

When Specs Work — and When They Don't

Specs aren't religion. They're tools. The goal isn't maximum specification — it's specifications that earn their existence by preventing real problems.

Specs Accelerate When...

  • Requirements are clear and stable
  • Multiple people need alignment
  • Compliance or contracts require documentation
  • You're building on existing architecture
  • The domain is well-understood

Specs Slow You Down When...

  • You're still discovering what to build
  • It's just you and AI (no handoffs)
  • Requirements change faster than docs
  • You're exploring a new domain
  • The spec becomes documentation overhead for its own sake

Implementation Tool

Implementation Checklist

Once direction is set and design is complete, these four elements ensure consistent, reproducible implementation:

Scope

Explicit boundaries — what's included, what's not. Prevents AI hallucination into adjacent features.

Rules

Constraints and requirements. What must always be true, what must never happen.

Edge Cases

Anticipated exceptions and how to handle them. The AI knows what to do when things go wrong.

Locked Assumptions

Explicit statements the AI must treat as absolute truth — no questioning, no reinterpretation.
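A minimal SPEC covering these four elements might look like the following. The structure and field names are assumptions for illustration, not a fixed schema:

```yaml
# specs/password-reset.yml — illustrative example, not a prescribed schema
scope:
  included: ["Reset flow via email token"]
  excluded: ["SMS reset", "Admin-initiated reset"]  # keeps the AI out of adjacent features
rules:
  - "Tokens expire after 15 minutes"
  - "Never log the raw token"
edge_cases:
  - case: "Token reused after a successful reset"
    handling: "Reject with a generic error; don't reveal token state"
locked_assumptions:
  - "User emails are already verified"  # treated as absolute truth, never reinterpreted
```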

The Process

Read → Plan → Code → Validate

Read

Gather context, understand existing code, identify dependencies

Plan

Structure the approach, create SPEC, define success criteria

Code

Execute with AI assistance, following the SPEC exactly

Validate

Test against criteria, review with human oversight
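One way to make each cycle reproducible is a lightweight per-task record. This shape is purely hypothetical, a sketch of how the four phases could be logged:

```yaml
# tasks/example-task.yml — hypothetical record of one Read → Plan → Code → Validate cycle
read:
  context_gathered: ["existing module entry points", "related tests"]
  dependencies_identified: ["database client", "auth middleware"]
plan:
  spec: "specs/example-task.yml"
  success_criteria: ["all acceptance tests pass", "no changes outside scope"]
code:
  executed_by: "AI, following the SPEC exactly"
validate:
  tested_against: "success_criteria above"
  human_review: true
```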

Why This Order Matters

Most AI coding failures happen because developers jump straight to "Code" without reading existing context or planning the approach. The AI makes assumptions, those assumptions conflict with reality, and debugging spirals begin.

My workflow ensures the AI always has complete context before generating a single line. Planning happens explicitly, not implicitly. And validation isn't an afterthought — it's built into every cycle.

The Lifecycle

Direction-Driven SDLC

01. North Star Definition

Before any code, we establish the guiding direction. What problem are you solving? Who has it most acutely? What does success look like? Human judgment defines the destination.

02. Goal Setting (WOOP)

Break the North Star into concrete goals with obstacle anticipation. Each goal has clear outcomes, identified blockers, and If-Then contingency plans. No wishful thinking.

03. Discovery & Design

Jobs-to-be-Done analysis. User story mapping. Feature files. This is where we figure out what to build — before committing to how. SPECs emerge here, not before.

04. Implementation

Now SPECs matter. Read → Plan → Code → Validate for each task. AI generates, humans review. Every output tested against acceptance criteria. This is where direction becomes code.

05. Transition Kit

As we build, we capture portable artifacts: schemas, contracts, flows, logic. The working code generates its own specification — a Transition Kit that survives stack changes.

The Science

Debugging Decay

Effectiveness by attempt: 100% → 75% → 45% → 25% → 12%

AI debugging effectiveness drops dramatically after 2-4 attempts in a single session.

I've observed — and the research confirms — that AI debugging follows a decay curve. The first attempt at fixing an issue has the highest success rate. Each subsequent attempt in the same context has diminishing returns as the AI accumulates conflicting assumptions.

The key insight: decay happens when you keep trying the same approach. The fix isn't just "reset after N attempts" — it's trying genuinely different approaches first:

1. Try three different approaches — not three variations of the same idea

2. Document what was attempted — prevents repeating failures

3. Only then reset — with fresh context and explicit learnings
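The documentation of failed approaches can be as simple as a running log. A hypothetical shape, with every value invented for illustration:

```yaml
# debug-log.yml — illustrative; the point is recording genuinely different approaches
issue: "Upload endpoint intermittently returns 500"
attempts:
  - approach: "Increase request body size limit"
    result: "failed; errors persist"
  - approach: "Add client-side retry"
    result: "failed; masks the error, doesn't fix it"
  - approach: "Trace the multipart parser"
    result: "failed; parser is not the bottleneck"
reset:
  fresh_context: true
  carry_forward: "all three attempts above, with results"
```

The `carry_forward` entry is what turns a reset into institutional memory rather than a blank slate.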

This isn't giving up — it's engineering discipline. The documentation of failed approaches becomes institutional memory that survives the reset.

The Stack

Anthropic-Only Architecture

One stack. Infinite depth.

I don't chase the latest AI model announcements. I go deep on one stack — Anthropic's Claude — because depth beats breadth when you're building production systems.

If Claude can't do something, I build the tooling myself. If I can't build it, then — and only then — I look elsewhere. This discipline ensures I understand my tools completely, not superficially.

The result: predictable behavior, accumulated expertise, and no surprises when it matters most.

The Deliverable

The Transition Kit

What happens when you need to rebuild in a different stack? My specs aren't throwaway docs — they're a portable Transition Kit that survives technology changes.

schemas/*.yml

Your data model — entities, relationships, constraints. Generates ORM models in any language.

contracts/*.yml

Your API surface — endpoints, request/response formats. Maps to routes in any framework.

flows/*.yml

Your user journeys — step-by-step behavior, not wireframes. Rebuilds in any UI framework.

logic/*.yml

Your business rules — validation, calculations, edge cases. The "why" that survives rewrites.
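As an illustration, a single entity in `schemas/*.yml` might look like this. The field names are assumptions about the kit's shape, not its actual format:

```yaml
# schemas/user.yml — hypothetical entry; shows the stack-agnostic idea
entity: User
fields:
  id:    { type: uuid, primary: true }
  email: { type: string, unique: true, required: true }
relationships:
  - has_many: Project
constraints:
  - "email must be verified before first login"
```

Nothing here names a database, an ORM, or a language, which is what lets the same file generate models in any stack.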

Rebuild in Next.js, React Native, Go, Phoenix — the kit travels with you. This is the difference between specs that die with the codebase and specs that outlive it.