
Building BeatSage: From 8-Bar Loops to an AI Music Production Mentor

The story of why BeatSage exists, how the first version came together in ten days of post-surgery recovery, and what I learned building an AI that teaches music production from inside your DAW.

This is the first post in the BeatSage dev blog. I'm Brandon Nelson, the founder. This is the story of why BeatSage exists and how the first version came together in ten days.

The Problem

I've been stuck in the 8-bar loop for twenty years.

I'm a software engineer with fifteen years of experience, currently an engineering manager at a large tech company. I build things for a living. But music production has defeated me, repeatedly, for two decades. Countless grooves, partial songs, weekend jam sessions. Sometimes cathartic, more often frustrating. "I'm never going to get this right." "There's no path forward for me to understand that thing that I just seem to be missing."

This started back in my hardcore junglist raver years. I was always seeking friendship and guidance amongst creators and producers. I have fond memories of afterparties and weekend sessions, hanging out and listening to records. But something never stuck for me. I couldn't gather the focus and confidence to break through.

Fast forward another decade and a half, and I'm still frequently returning to production as an outlet and a passion. The drive and motivation are there, but baggage and the impenetrable nature of the software keep getting in my way. I would create a few grooves, enjoy myself, but constantly get blocked by that thing: the missing piece that would help me finish the song, or just feel confident that I was progressing in my learning.

Raise your hand if you've spent an embarrassing amount of money on online courses or tutorials chasing that same feeling. I've spent hundreds of dollars trying to unlock my brain. Many of those courses were outright scams, requiring refund requests or, in a few cases, chargeback action with my credit card provider. There is a lot of predatory and gatekeeping behavior in the music production space. The barrier to entry creates incentives for people to guard their knowledge closely, or at least paywall it.

What I was missing was a kind, patient mentor who could meet me where I was. Someone who could look at my actual project, hear what I was working on, and show me the next step. Not a YouTube video. Not a course that assumed I already knew what a sidechain was.

That's what I set out to build.

Ten Days

In late February 2026, I was recovering from a triple sinus surgery. Couldn't go anywhere, couldn't do much, but I could sit at my desk and write code. The idea for BeatSage had been rattling around in my head for a while. I finally had the time (and the painkillers) to act on it.

What happened next was a blur. Ten days of building between naps and ice packs, with my partner checking in to make sure I was actually resting sometimes. I was not.

I'd been studying how AI could interact with DAWs, and I saw something nobody else seemed to be building: not another AI assistant, but an AI mentor. One that could teach music production by actually doing things in your DAW while you watch and listen. Show, don't tell.

The first few days were architecture and core infrastructure. I designed the system so that any DAW could plug in through the same interface, whether the communication underneath was TCP sockets, MIDI, or HTTP. That abstraction turned out to be the single best decision of the project.
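BeatSage's internal interfaces aren't published, so here's a hypothetical sketch in Go (the language the backend later moved to) of what that kind of abstraction can look like. The names are illustrative, not the real API: DAW-level code talks only to a `Transport` interface, and whether the bytes travel over TCP, MIDI, or HTTP is an implementation detail behind it.

```go
package main

import "fmt"

// Transport abstracts how commands reach a DAW. Real implementations
// would wrap a TCP socket, a MIDI port, or an HTTP client; the names
// here are hypothetical.
type Transport interface {
	Send(cmd string) error
	Name() string
}

// loopbackTransport records what it would send, standing in for a
// real wire protocol so the sketch is self-contained.
type loopbackTransport struct{ sent []string }

func (t *loopbackTransport) Send(cmd string) error {
	t.sent = append(t.sent, cmd)
	return nil
}

func (t *loopbackTransport) Name() string { return "loopback" }

// DAWClient holds all DAW-level logic. It never sees the transport's
// internals, so swapping TCP for MIDI doesn't touch this layer.
type DAWClient struct{ t Transport }

func (c *DAWClient) CreateTrack(name string) error {
	return c.t.Send("create_track:" + name)
}

func main() {
	lb := &loopbackTransport{}
	client := &DAWClient{t: lb}
	client.CreateTrack("Drums")
	fmt.Println(lb.sent[0]) // the command that would hit the wire
}
```

The payoff of this shape is exactly what the rewrite section later describes: when every DAW-specific detail lives behind one small interface, replacing the implementation underneath is mostly mechanical.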

By day three I had a working prototype you could talk to. Not pretty, but functional.

Days three and four were the big push. I built a desktop app (Tauri, Rust backend, Svelte frontend) with a real chat interface, an installer, and a dashboard. BeatSage went from a prototype to something you could actually download and use.

Days four and five: the engine tools. Over a hundred high-level operations for controlling your DAW, each one designed to do the right thing end-to-end rather than requiring five separate commands to accomplish one task.
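To make "one task, one tool" concrete, here's an assumed sketch in Go. The tool and command names are invented for illustration; the point is the shape: a single high-level operation that composes the primitive DAW commands internally, so the caller never has to sequence five of them.

```go
package main

import "fmt"

// session records the primitive commands a tool would issue to the
// DAW. In the real system these would go over a transport; here we
// just log them so the example is self-contained.
type session struct{ log []string }

func (s *session) exec(cmd string) { s.log = append(s.log, cmd) }

// CreateDrumLoop is a hypothetical high-level engine tool: one call
// that creates the track, loads an instrument, writes a pattern, and
// arms the track, end-to-end.
func (s *session) CreateDrumLoop(name string, bars int) {
	s.exec("create_midi_track:" + name)
	s.exec("load_instrument:drum_kit")
	s.exec(fmt.Sprintf("write_pattern:%s:%d_bars", name, bars))
	s.exec("arm_track:" + name)
}

func main() {
	s := &session{}
	s.CreateDrumLoop("Drums", 8)
	for _, cmd := range s.log {
		fmt.Println(cmd)
	}
}
```

An AI planning a lesson step then only needs to pick one tool call, rather than orchestrate a fragile multi-command sequence.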

Days five and six were the bet. The education engine. Structured lessons, skill tracking, the ability to analyze a producer's session and recommend what to learn next. This is the piece that turns a chatbot into a tutor, and no competitor had anything like it.

Days six and seven: FL Studio and REAPER support. FL Studio was a saga (there's a full blog post coming on that). Short version: FL Studio's scripting environment is sandboxed with no network access, so we had to get creative with MIDI to communicate. I hit six dead ends before finding a solution. REAPER was straightforward by comparison.
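The full FL Studio write-up is still to come, so what follows is an assumption about how a MIDI-based channel could work, not BeatSage's actual protocol. One common trick is to wrap arbitrary payloads in MIDI System Exclusive (SysEx) frames, which any MIDI port can carry even when sockets are unavailable:

```go
package main

import "fmt"

// encodeSysEx wraps an ASCII payload in a MIDI SysEx frame.
// 0xF0 opens SysEx, 0x7D is the "non-commercial" manufacturer ID,
// and 0xF7 terminates. SysEx data bytes must keep the high bit
// clear, so this sketch only handles 7-bit-safe payloads; a real
// bridge would need an escaping scheme for arbitrary bytes.
func encodeSysEx(payload string) []byte {
	msg := []byte{0xF0, 0x7D}
	for _, b := range []byte(payload) {
		msg = append(msg, b&0x7F)
	}
	return append(msg, 0xF7)
}

func main() {
	frame := encodeSysEx("ping")
	fmt.Printf("% X\n", frame) // F0 7D 70 69 6E 67 F7
}
```

The sandboxed script only needs to listen on a virtual MIDI input and decode these frames, which is why this kind of channel survives restrictions that block networking entirely.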

Days seven through nine: authentication, subscriptions, and the education UI. A Learn tab where you browse courses and track progress. An onboarding flow that figures out where you are and meets you there.

Day ten: v1.1.0. Feature freeze, launch prep, first beta users.

I'm glossing over a lot. There were dozens of specification documents written along the way. An accessibility audit. A landing page. A brand guide. Competitor research.

Ten days from first commit to shipping beta, most of them spent recovering from surgery on my couch. I don't say that to brag. I say it because that's what happens when a problem has been eating at you for twenty years and you finally have both the skills and the downtime to solve it.

The Rewrite

A few weeks after launch, I rewrote the entire backend from Python to Go. Not because Python was slow. Because packaging was a nightmare.

BeatSage ships as a desktop app, and the Python backend needed a virtual environment. The installer had to locate Python on the user's machine, create a venv, pip-install a wheel, handle edge cases. It was the most common support issue in beta.

The Go rewrite produced a single static binary. No interpreter, no venv, no pip. The installer went from a multi-step wizard to "copy binary, run binary." Startup dropped from 2-3 seconds to under 200ms. Memory usage dropped dramatically.

The architecture survived the rewrite unchanged. That validated the early design work. When your abstraction boundaries are clean, swapping the implementation language is mostly mechanical.

What v1.1.0 Shipped With

When v1.1.0 went out as an early access beta, here's what was in the box:

For learners:

  • Multiple courses with guided, step-by-step lessons
  • Skill tracking that adapts to your level
  • Session analysis that looks at your DAW project and recommends what to learn next
  • "Show, don't tell" teaching. The AI demonstrates techniques in your actual project, not in a YouTube video.

For producers:

  • Cloud AI on the free tier (no GPU required, no API keys to configure)
  • Deep DAW integration for Ableton Live and FL Studio
  • A desktop app that detects your DAW, handles setup, and gets out of the way

For power users:

  • Higher-tier cloud AI with bigger context
  • Full tool access for direct DAW control
  • Multi-DAW support

Why Accessibility Matters

Music production has always had barriers. Expensive software. Dense interfaces designed for people who already know what they're doing. Tutorials that assume you can see the screen, use a mouse, or process information in one specific way.

I believe that if you can hear music in your head, you should be able to get it out. Your neurotype, your disability, your budget. None of that should decide whether you get to create.

BeatSage is built from the ground up with this in mind. Screen reader support with proper ARIA annotations. Keyboard navigation for every interactive element. WCAG AA contrast ratios throughout.

We're also designing a hardware controller specifically for blind and visually impaired producers. A USB device with tactile buttons, a rotary encoder, a haptic motor, and an OLED display. Still in the breadboard prototype stage. The idea is straightforward: if a standard interface doesn't work for you, we'll build one that does.

We're learning from communities like Sound Without Sight, Blind Producers, and the MIDI Association's MASSIG initiative as we design accessible tools.

What's Next

v1.2.0 is shipping now. Active development includes spaced repetition for skill reinforcement, before/after audio snapshots so you can hear what each lesson step changed, musical "why" annotations on every action, and bringing the full experience to REAPER as a third supported DAW.

The roadmap is long. The team is one person. But every feature ships because someone (usually me) needed it while trying to make music.

Building in Public

This dev blog is where I'll share the technical and product decisions behind BeatSage. Some posts will be deeply technical. Some will be about education design or accessibility. I'll be honest about what works and what doesn't.

Up next: How We Got AI Into FL Studio - Six dead ends, one Microsoft release, one SDK flag, and a chord progression that no other FL Studio AI can write.

A Note on AI

I use AI tools extensively in my development workflow. Claude Code is my primary coding partner. The direction is mine. The judgment calls are mine. The bugs are mine too.

BeatSage exists because I had a specific problem and a specific vision for solving it. The AI helped me build faster. It didn't tell me what to build.

This Is for People Like Us

If you've ever felt stuck in the loop, literally or creatively. If you've bought the course and felt cheated. If you've wanted a mentor but couldn't afford one.

BeatSage is built for people like us.


For Ella, who sat with me through the recovery, tested every broken build, and never once told me to stop working. BeatSage's first alpha tester and my soon-to-be wife. Thank you ♥

For Mom, who through a life with a visual disability (retinitis pigmentosa) has taught me a great many things. Thank you for inspiring me to help people and to do my best to be a force for good in this world.


BeatSage is an AI-powered music production mentor for Ableton Live, FL Studio, REAPER, and Strudel. Learn more at beatsage.ai.