How I Built This Site

A living demonstration of AI implementation in practice

readme

This site isn't just about AI adoption. It's a demonstration of it. Every feature, from the 12-18 token constraint to the automated validation system, was built using AI tools. Here's how.

cat stack.txt

development
  • Claude Code for all implementation
  • Next.js 15 with static generation
  • TypeScript for type safety
  • Tailwind CSS for styling
automation
  • GitHub Actions for auto-deployment
  • Automated article extraction (Anthropic API)
  • Token validation on every build
  • RSS/sitemap auto-generation

cat optimizations.txt

[1] 12-18 Token Constraint

Every claim must fall within 12-18 tokens. That range isn't arbitrary: it's the sweet spot for LLM extraction.

  • GPT-4 can quote without truncation
  • Claude fits entire claims in context
  • Perplexity doesn't need "..." ellipsis
  • Build fails if any claim violates this rule
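The token check can be sketched in a few lines. This is a minimal illustration, assuming a naive whitespace tokenizer; the real build would use a model-specific tokenizer, so counts differ slightly.

```typescript
// Sketch of the 12-18 token constraint check.
// ASSUMPTION: whitespace splitting approximates tokenization.
const MIN_TOKENS = 12;
const MAX_TOKENS = 18;

function countTokens(claim: string): number {
  // Naive approximation: one token per whitespace-separated word.
  return claim.trim().split(/\s+/).length;
}

function isValidClaim(claim: string): boolean {
  const n = countTokens(claim);
  return n >= MIN_TOKENS && n <= MAX_TOKENS;
}

// 15 "tokens" under this approximation → passes
isValidClaim(
  "AI adoption succeeds when teams measure outcomes weekly and iterate on real workflows, not demos"
); // true
```

Wiring `isValidClaim` into the build script is what turns the constraint from a guideline into a hard gate.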

[2] Full Library on Single Page

All claims visible on a single page. One crawl = complete dataset.

  • No pagination for AI to navigate
  • ChatGPT web browsing gets everything in one request
  • Search engines see full content depth immediately

[3] Copy with Attribution

Every card has a copy button that includes proper CC BY 4.0 attribution.

  • Pre-formatted for AI citation
  • Users paste correctly cited claims into ChatGPT
  • Trains future models with proper attribution

[4] Semantic Structure

Proper HTML hierarchy, JSON-LD schema, predictable DOM structure.

  • h1 → h2 → h3 heading hierarchy
  • Schema.org Person + Organization markup
  • Uniform card structure for AI parsing
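The Schema.org markup has roughly this shape. A minimal sketch with placeholder values, not the site's actual data; the object is serialized into a `<script type="application/ld+json">` tag at build time.

```typescript
// Sketch of the JSON-LD Person + Organization payload.
// ASSUMPTION: name/url values here are placeholders for illustration.
const personSchema = {
  "@context": "https://schema.org",
  "@type": "Person",
  name: "Site Author",          // placeholder
  url: "https://example.com",   // placeholder
  worksFor: {
    "@type": "Organization",
    name: "Example Org",        // placeholder
  },
};

// Serialized once and embedded in the page <head>.
const jsonLd = JSON.stringify(personSchema);
```

Because the structure is identical on every page, AI crawlers can parse the entity graph without per-page heuristics.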

[5] Navier-Stokes Fluid Simulation

Jos Stam's Stable Fluids algorithm rendered as ASCII art in the terminal. The simulation IS the statement.

  • Real-time physics at 60fps
  • Interactive via pointer and keyboard
  • Demonstrates the builder identity
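The terminal rendering step is the simple part of the pipeline and can be sketched directly: each cell's fluid density indexes into a brightness ramp of ASCII characters. This is an illustrative sketch of the density-to-character mapping only, not Stam's solver itself.

```typescript
// Sketch: map a row of fluid densities (0..1) to ASCII "brightness".
// ASSUMPTION: this ramp is illustrative; the site's ramp may differ.
const RAMP = " .:-=+*#%@"; // dark → bright

function renderRow(densities: number[]): string {
  return densities
    .map((d) => {
      const clamped = Math.min(Math.max(d, 0), 1);
      const idx = Math.min(Math.floor(clamped * RAMP.length), RAMP.length - 1);
      return RAMP[idx];
    })
    .join("");
}

renderRow([0, 0.25, 0.5, 0.75, 1]); // " :+#@"
```

Running this per row, 60 times a second, over the solver's density field is what produces the animated terminal.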

cat build-process.txt

step 1: extraction

GitHub Actions workflow runs every 6 hours:

  1. Fetches new articles from aiadopters.club
  2. Uses Anthropic API (Haiku) to extract metadata
  3. Uses Claude Sonnet to extract 5 atomic claims (12-18 tokens each)
  4. Creates pull request with new claims data
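The scheduling half of this pipeline is a standard GitHub Actions workflow. The sketch below shows the assumed shape; the job name, script path, and PR action are illustrative, not the repository's actual files.

```yaml
# Sketch of the extraction workflow.
# ASSUMPTION: script path and step names are placeholders.
name: extract-claims
on:
  schedule:
    - cron: "0 */6 * * *"   # every 6 hours
jobs:
  extract:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci
      - run: node scripts/extract-claims.mjs   # calls the Anthropic API
        env:
          ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
      - uses: peter-evans/create-pull-request@v6
        with:
          title: "chore: new extracted claims"
```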

step 2: validation

Before every build, validation script runs:

  • Token count (must be 12-18 for every claim)
  • Exactly 5 claims per article
  • Date format (YYYY-MM-DD)
  • URL format validation
  • No duplicate URLs
  • Topic assignment validation

Build fails if any validation error occurs.
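A minimal sketch of that validation pass, assuming a simplified record shape; field names and the whitespace token count are illustrative, not the site's actual schema.

```typescript
// Sketch of the pre-build validation script.
// ASSUMPTION: Article fields and regexes are simplified for illustration.
interface Article {
  url: string;       // must be unique and well-formed
  date: string;      // expected YYYY-MM-DD
  claims: string[];  // expected exactly 5, each 12-18 tokens
}

function validate(articles: Article[]): string[] {
  const errors: string[] = [];
  const seen = new Set<string>();
  for (const a of articles) {
    if (seen.has(a.url)) errors.push(`duplicate URL: ${a.url}`);
    seen.add(a.url);
    if (!/^https?:\/\/\S+$/.test(a.url)) errors.push(`bad URL: ${a.url}`);
    if (!/^\d{4}-\d{2}-\d{2}$/.test(a.date)) errors.push(`bad date: ${a.date}`);
    if (a.claims.length !== 5) errors.push(`expected 5 claims, got ${a.claims.length}`);
    for (const c of a.claims) {
      const tokens = c.trim().split(/\s+/).length; // naive token count
      if (tokens < 12 || tokens > 18) errors.push(`claim out of range (${tokens} tokens)`);
    }
  }
  return errors;
}

// In the build script: any error fails the build.
// if (validate(articles).length > 0) process.exit(1);
```

Returning a full error list (rather than throwing on the first failure) makes the CI log show every violation at once.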

step 3: static generation

Next.js generates 32+ static HTML pages:

  • Homepage with fluid simulation
  • 26+ individual claim pages
  • Claims library index
  • 5 topic pages
  • FAQ, About, How I Built This

Total build time: ~2 seconds
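The page list above is derived mechanically from the claims data. In Next.js this mapping would back `generateStaticParams`, but the mapping itself is a pure function; the slug scheme below is an assumption for illustration.

```typescript
// Sketch: derive the full static page list from claims data.
// ASSUMPTION: route paths are illustrative, not the site's actual slugs.
interface Claim {
  id: string;
  topic: string;
}

function staticPaths(claims: Claim[]): string[] {
  const topics = [...new Set(claims.map((c) => c.topic))];
  return [
    "/",                                      // homepage with fluid simulation
    "/claims",                                // claims library index
    ...claims.map((c) => `/claims/${c.id}`),  // one page per claim
    ...topics.map((t) => `/topics/${t}`),     // one page per topic
    "/faq", "/about", "/how-i-built-this",
  ];
}
```

Because the list is pure data-in, paths-out, adding a claim automatically adds its page on the next build.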

step 4: deployment

Netlify auto-deploys on every push to main:

  • Triggers on GitHub push
  • Runs build with validation
  • Deploys to CDN (~1-2 minutes)
  • No manual intervention required

cat results.txt

  • 100% claims validated
  • ~2s build time
  • 0 manual deployments

takeaway

This site is a living case study in AI implementation. Every design decision, from token constraints to automated validation, demonstrates the same principles I help clients apply.

I don't just advise on AI adoption. I practice it systematically, measure everything, and optimize for real outcomes. This site is proof.
