Dev Blog #1 -- A melodramatic origin story.
Welcome to my development blog -- where any grammatical mistakes are solely mine, because I'm not running Grammarly or your favorite LLM. I'm just putting my fingers on the keys here.
So, today is August 8, 2025, and we've been "soft launching" for the past month. After a year of work, we're finally shipping headsets, and our app doesn't break constantly or take 20+ minutes to download a new language. So, how did we get here?
I grew up loving movies about startups like The Pirates of Silicon Valley and The Social Network. But when I play back how we started in my head, it seems a lot more boring. Mostly because it feels like I've been doing nothing but staring at one screen or another for the past year.
So...
Alec, my good friend and cofounder, and I had been brainstorming (no pun intended) neurotech ideas for a couple of years, and even created an EEG headcap to see if we could read what words Alec was thinking, back in 2024. We soldered a crap ton of wires onto electrodes and put together a bunch of audio for Alec to listen to, figuring if we could decipher the words as he was listening to audio, maybe we could correlate that to the words that naturally appear in his mind when he's thinking them.
I made sure to stuff some David Goggins motivational speeches and Alex Jones into the mix, as Alec had to sit still in a dark room for 2 hours listening to this hot garbage while the EEG recorded the electrical signals that made it to his scalp. The idea never ended up working for us, which I'm not beat up about at all -- because what the heck would you do with a device that could read your thoughts (likely at very low resolution)? I mean, we already have devices that you can just speak to and they'll control your computer. Which is kinda the same thing -- yes, yes, super reductionist, I know. But funny enough, as I found out over the past year while creating the NeuroLingo, the market is already super saturated with EEG devices that don't really do anything. Most of them (I won't name names) seem to just tell you when you're focused or tired. What's worse is they're not even that accurate.
Anyway, last summer Alec was goofing around in China (probably subjecting the locals he met to his poor Mandarin, haha) and sent me a Signal message about 'tDCS for language learning.' I'd never heard of this, but as I came to realize, the literature had been around for a long time. That same day, we were accepted into the Launchpad program at Yale's Tsai City Centre for startups. Thus began a startup sprint that ended up taking every second of my free time and iota of sanity, continuing through today.
Well, that's a rather short origin story. Maybe tomorrow I'll make a proper development blog post about my favorite frameworks for mobile development and what went into making the brain stimulation device.
Yours.
Luke Knoble, CTO.
1 comment
Heeey my Mandarin wasn’t that bad, I could order a coffee in a good enough accent that the barista would ask a follow-up question that I couldn’t understand. Turns out she asked if I wanted milk or sugar, just like a person rather than a textbook 🥲