I’ve always loved music—not just listening to it, but trying to create it. And yet, as someone without formal training, I’ve often found myself stuck when it comes to composing chords. It’s a foundational part of music theory, but the tools out there felt either too complex or too limited. So I built one myself.

When fast ideas become real prototypes

Learning to code started as a side hobby for me—slow, sometimes frustrating, but deeply satisfying. What changed everything recently was realizing how accessible prototyping has become thanks to AI. Suddenly, fast ideas weren’t just napkin sketches; they could become working apps in days.

That realization pushed me to look for ideas worth prototyping. Music composition—especially chords—was the obvious place to start. It’s a problem I’ve wrestled with for years, and I had a very personal motivation to solve it.

Why existing tools didn’t work for me

I tried so many tools over the years:

  • Tools like Scaler give me access to rich chords but feel stiff and hard to play expressively.
  • Tools like Orchid are simpler and easier to use, but they lack the complexity I want for building real progressions.

So I started imagining something different. A tool focused not just on the theory, but on direct manipulation. Something versatile enough to play complex, emotional progressions live, using a regular keyboard. And above all, something intuitive, like a musical instrument built from a gamer’s mindset.

Designing for expressive control

The core idea?

  • Use the right hand to select and trigger chords.
  • Let the left hand control chord type and voicing.
  • Use the thumb to switch moods or tonal centers.

It’s a weird little setup—but it works. I took inspiration from how video games map deep interactions to simple, familiar controls. And that made me think: what if this wasn’t just a fun experiment, but a way to explore designing for creativity?
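To make that mapping concrete, here’s a hypothetical sketch in TypeScript. None of these key bindings or function names come from the actual app (playChord is a stand-in for whatever synth layer sounds the notes); they just illustrate how a plain computer keyboard can carry degree, quality, and mood at once:

```ts
// Hypothetical two-hand layout: right-hand keys pick a scale degree,
// left-hand keys set the chord quality, and the thumb (space bar)
// flips the tonal center. Bindings here are illustrative, not the app's.

type Quality = "triad" | "seventh" | "ninth";

const DEGREE_KEYS: Record<string, number> = { j: 1, k: 2, l: 3, u: 4, i: 5, o: 6, p: 7 };
const QUALITY_KEYS: Record<string, Quality> = { a: "triad", s: "seventh", d: "ninth" };

let quality: Quality = "triad";
let minorMode = false; // toggled by the thumb

window.addEventListener("keydown", (e) => {
  if (e.code === "Space") {
    minorMode = !minorMode; // thumb: switch mood / tonal center
  } else if (e.key in QUALITY_KEYS) {
    quality = QUALITY_KEYS[e.key]; // left hand: chord type and voicing
  } else if (e.key in DEGREE_KEYS) {
    playChord(DEGREE_KEYS[e.key], quality, minorMode); // right hand: trigger
  }
});

// Stand-in for the synth layer; not part of this sketch.
declare function playChord(degree: number, quality: Quality, minor: boolean): void;
```

The nice property, borrowed straight from game controls, is that each hand owns one axis of the decision, so progressions become muscle memory instead of theory recall.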

As I built the prototype, I started layering in features:

  • Visual feedback for every note played.
  • Background beats to help with rhythm and flow.
  • Swappable instruments and sounds.
  • And eventually, the ability to export MIDI so people can use it with their DAWs.
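On that last point, here’s a rough idea of what MIDI export involves under the hood. This is a minimal, self-contained sketch that writes a format-0 Standard MIDI File from a chord list; the chord type and voicings are illustrative, not the app’s actual exporter:

```ts
// Minimal Standard MIDI File (format 0, one track) from a chord progression.

type Chord = { notes: number[]; beats: number }; // MIDI note numbers + duration

const TICKS_PER_BEAT = 480;

// Encode a number as a MIDI variable-length quantity.
function varLen(value: number): number[] {
  const bytes = [value & 0x7f];
  while ((value >>= 7) > 0) bytes.unshift((value & 0x7f) | 0x80);
  return bytes;
}

function exportMidi(progression: Chord[]): Uint8Array {
  const events: number[] = [];
  for (const chord of progression) {
    // Note-on for every chord tone at the same tick.
    for (const note of chord.notes) events.push(...varLen(0), 0x90, note, 100);
    // Note-off for every tone after the chord's duration has elapsed.
    const ticks = chord.beats * TICKS_PER_BEAT;
    chord.notes.forEach((note, i) =>
      events.push(...varLen(i === 0 ? ticks : 0), 0x80, note, 0)
    );
  }
  events.push(...varLen(0), 0xff, 0x2f, 0x00); // end-of-track meta event

  const header = [
    0x4d, 0x54, 0x68, 0x64, // "MThd"
    0, 0, 0, 6,             // header length
    0, 0,                   // format 0
    0, 1,                   // one track
    (TICKS_PER_BEAT >> 8) & 0xff, TICKS_PER_BEAT & 0xff,
  ];
  const track = [
    0x4d, 0x54, 0x72, 0x6b, // "MTrk"
    (events.length >>> 24) & 0xff, (events.length >>> 16) & 0xff,
    (events.length >>> 8) & 0xff, events.length & 0xff,
    ...events,
  ];
  return new Uint8Array([...header, ...track]);
}

// Example: a ii–V–I in C major, one bar each.
const bytes = exportMidi([
  { notes: [62, 65, 69, 72], beats: 4 }, // Dm7
  { notes: [55, 59, 62, 65], beats: 4 }, // G7
  { notes: [60, 64, 67, 71], beats: 4 }, // Cmaj7
]);
```

In the browser, those bytes can be wrapped in a Blob and offered as a download link, ready to drag into a DAW.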

Building it (with and despite AI)

I used Cursor to speed up the build, but let’s be honest: it’s still a bit painful. The hardest part wasn’t writing the code—it was architecting the musical logic. That part needed human input. AI could help suggest snippets, but making it all robust and playable required a lot of manual reasoning.
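To give a flavor of what I mean by “musical logic”: even something as basic as turning a scale degree into a voiced chord is full of decisions. This simplified function stacks diatonic thirds over a major scale; it’s an illustration of the problem space, not the app’s real voicing algorithm:

```ts
// Build a chord by stacking diatonic thirds on a major scale.
// Simplified assumption: one key, one scale, root-position voicings only.

const MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]; // semitone offsets from the tonic

function chordNotes(degree: number, size: 3 | 4 | 5, tonic = 60): number[] {
  const notes: number[] = [];
  for (let i = 0; i < size; i++) {
    const step = (degree - 1) + 2 * i; // every other scale degree = a third
    const octave = Math.floor(step / 7);
    notes.push(tonic + MAJOR_SCALE[step % 7] + 12 * octave);
  }
  return notes;
}

chordNotes(2, 4); // ii7 in C major -> [62, 65, 69, 72] (D, F, A, C)
```

Layer on inversions, voice-leading between successive chords, and borrowed notes, and it’s easy to see why this part needed manual reasoning rather than pasted snippets.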

And then there’s the front-end. For this project, I needed pixel-perfect precision, inspired by the industrial beauty of Dieter Rams and the playful minimalism of Teenage Engineering. That part? Still handcrafted.

What’s next

I’ve been testing and tweaking based on feedback—and that’s another thing I love about this new dev era. Iteration is faster than ever.

My plan is to keep the web app free as a way to gather users and refine the concept. The next step?

  • Build a mobile version (iOS first), with MIDI input via iPhone/iPad.
  • Use that to generate some revenue.
  • Reinvest into creating a VST plugin so musicians can use it directly inside their DAW setups.

It’s a long game, but I’m in.

Thanks for reading. If you want to try the prototype, you can test CC1 here.