
How a Family of Four Built a Cyberpunk RPG in 8 Months Using AI

March 15, 2026 · Token-mon Dev Blog
[Image: Metallica City skyline in Token-mon]

Eight months ago, Token-mon was nothing more than a vague idea scribbled in a notebook: "What if Pokémon met cyberpunk, and the battles used slot machines?" Today, it is a fully playable RPG with 72 creatures, 8 boss fights, 50+ mini-games, 29 original soundtrack tracks, and over 100,000 lines of code. And it was built by a family of four.

This is the story of how we did it, what tools we used, and what we learned about building games with AI in 2026.

The Team Behind the Curtain

Let me be upfront: I am not a game studio. I am one developer with a day job, a wife who became our creative director by sheer force of opinion, and two sons -- ages 12 and 16 -- who served as lead playtesters, game design consultants, and our toughest critics.

My 12-year-old was the one who insisted Token-mon needed a real story. "Dad, why would anyone care about fighting bosses if there's no reason?" He was right. That feedback turned Token-mon from a collection of mini-games into a narrative-driven RPG about a kid searching for their missing parents inside a sentient AI called The Core.

My 16-year-old became our balance expert. Every creature stat, every boss difficulty curve, every energy cost -- he would play for hours and come back with spreadsheets. Actual spreadsheets. The 72 creatures in the game are balanced the way they are because a teenager refused to let his dad ship something unfair.

My wife shaped the emotional arc. She is the reason the bosses have personality, the reason the cutscenes hit harder than you expect from a browser game, and the reason Copper Kate's dialogue makes people feel something.

The AI Toolkit

Token-mon would not exist without AI tools. Not as a gimmick or a shortcut, but as genuine force multipliers that let a tiny team punch far above its weight. Here is exactly what we used:

Claude Code wrote 85%+ of the codebase. Not just boilerplate -- architecture decisions, the entire battle system, the cutscene engine, server infrastructure, database schemas, the quest system, shop UI, and hundreds of components. The key to making this work was CLAUDE.md, a persistent context file checked into the repo that tells Claude everything about the project: architecture, conventions, game rules, what has already been built. It became the project's brain.
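CLAUDE.md is free-form markdown that Claude Code reads automatically at the start of each session. As a hypothetical sketch (the headings and entries below are illustrative, drawn from details in this post, not the actual file), ours was organized roughly like this:

```markdown
# Token-mon — Project Context

## Architecture
- Browser-based RPG; battle system, cutscene engine, quest system,
  and shop UI live in separate modules
- Server infrastructure and database schemas handle saves and shops

## Game rules
- Battles resolve through slot-machine reels; a triple match is special
- 72 creatures, each with stats balanced against a shared energy cost curve

## Conventions
- Creature stats live in one data file; never hardcode stats in components

## Already built (do not re-scaffold)
- Battle system, cutscene engine, quest system, shop UI
```

The "already built" section mattered most in practice: without it, a fresh session has no memory of prior work and will happily regenerate systems that already exist.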

The workflow was simple but powerful: use plan mode for creative discussion ("How should the energy system work? What if copper types regenerated?"), then switch to implement mode, then playtest with the family, then iterate. I was literally coding through playtesting -- my sons would find a bug, I would fix it in real time, and they would test again.

Leonardo AI generated all 417 sprites -- every creature, boss, building, background, and UI element. The art direction was mine (cyberpunk, neon, metallic), but the execution was AI. Some sprites needed 15+ generations to get right. Others were perfect on the first try.

Suno AI composed the 29+ original music tracks -- one for each district, each boss battle, each cutscene mood. The industrial grunge of Foundry District sounds nothing like the neon jazz of Silver Strip, and that variety is what makes the world feel alive.

Ludo AI handled sprite sheet generation for walk cycles and animations. Replit provided the hosting and deployment infrastructure.

What AI Did vs. What the Human Did

I want to be honest about this because the discourse around AI-built games often lacks nuance. Here is the split:

AI did: The vast majority of code generation, sprite creation, music composition, and animation sheets. It turned ideas into implementations faster than any human team of four could have managed.

I did: Creative direction, curation, game feel, architecture decisions, quality control, and the thousand small judgment calls that make a game feel like a game instead of a tech demo. Choosing which of 20 generated sprites actually captures the right vibe. Deciding that the slot reels needed screen shake on a triple match. Realizing the cutscene text speed should be 20ms per character, not 30ms. These decisions are invisible but they are everything.
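To make the text-speed call concrete: at 20ms per character, a line of dialogue finishes revealing fast enough to stay ahead of a reader's patience, while 30ms starts to drag. A minimal sketch of the typewriter timing (function and constant names here are illustrative, not from the actual codebase):

```typescript
// Illustrative sketch of cutscene typewriter timing, not the real engine code.
const TEXT_SPEED_MS = 20; // 20ms per character felt right; 30ms read as sluggish

// Returns the time (ms from line start) at which each character appears.
function revealSchedule(text: string, speedMs: number = TEXT_SPEED_MS): number[] {
  return Array.from(text, (_, i) => i * speedMs);
}

const schedule = revealSchedule("Copper Kate stares at the neon skyline.");
// schedule[0] is 0; each later character lands speedMs after the previous one,
// so a 40-character line finishes in under a second at 20ms per character.
```

The difference between 20ms and 30ms is 10ms per character, invisible in isolation, yet across a 40-character line it is nearly half a second of waiting, repeated hundreds of times over a playthrough.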

The family did: Playtesting, emotional feedback, story direction, balance tuning, and the constant pressure to make something we were all proud of.

By the Numbers

- 8 months from notebook sketch to playable RPG
- 4 family members: one developer, one creative director, two playtesters
- 72 creatures, 8 boss fights, 50+ mini-games
- 417 AI-generated sprites and 29 original soundtrack tracks
- 100,000+ lines of code, 85%+ of it written by Claude Code

What We Learned

The biggest lesson: AI does not replace creative vision. It amplifies it. Without a clear idea of what Token-mon should feel like, no amount of code generation would have produced something worth playing. The tools gave us speed. The family gave us soul.

If you want to see what came out of those 8 months, come explore Metallica City. Greffe is waiting.