How I Turned a Sunday, a Quarter-Sized Device, and Some Questionable Voice Recognition into the Future of Work
Teaching Silicon to Understand My Mumbling: An Executive's Descent into Voice-First Madness
⚠️ HEALTH WARNING: This post contains dangerous levels of tech enthusiasm, questionable humor, and a grown man talking to jewelry. Side effects may include: uncontrollable eye-rolling, sudden urges to buy wearable AI, and the inexplicable need to say "Hey Omi" to random objects. I'm sorry. I'm sorry again. Actually, I'm sorry a third time just to be safe. Proceed at your own risk.
Sunday, 8:47 AM. I'm lying in bed, scrolling through my phone, when I mumble to the air: "Hey Omi, what's the weather looking like?"
Three seconds later, my phone buzzes. "Today will be partly cloudy with a high of 72°F."
My wife looks at me. "Did you just talk to your necklace?"
"It's not a necklace," I protest. "It's a wearable AI assistant."
"That you wear... around your neck?"
"...Yes."
Welcome to my life as a builder and an early adopter.
The Great Wearable Safari of 2024
Let me back up. Two weeks ago, I went on what I now call my "Goldilocks Journey Through Wearable AI." Picture me at my desk, surrounded by browser tabs like a digital hoarder, researching every voice-recording wearable known to humanity.
The Limitless Pendant? Too closed. The Friend necklace? Too... friendly? (I already have friends, thanks.) Plaud Note? Too rectangular. That random spy pen from Amazon? Too "guy who gets kicked out of board meetings."
Then I found OMI. Open-source. Hackable. A builder's playground disguised as executive jewelry.
The moment I read "it can receive voice, and then use that to do ANYTHING afterwards," I heard angels singing. Or maybe that was just my credit card crying as I clicked "Order Now."
Sunday Morning: The Unboxing Heard 'Round My House
10:00 AM: It arrives. I open it. My daughter walks by.
"Dad's got a new toy."
"This isn't a toy," I say, holding up the tiny device. "This is the future of human-AI interaction."
Her eye roll is so intense I can actually hear it. Like a garage door closing on my credibility.
The First App: Teaching Omi to Remember My Ramblings
11:00 AM: First challenge: Make this thing actually useful. My initial vision? Simple. Voice goes in, organized thoughts come out.
11:30 AM: Code deployed to a server. Nothing fancy: just catch voice files, transcribe them, summarize them (there's a rough sketch of the pipeline a few lines down). Think of it as Marie Kondo for my mental chaos.
12:00 PM: First successful test. I ramble for three minutes about AI models and agentic coding, and out comes a neat bullet-point summary. My scattered thoughts, suddenly coherent. If only it could do the same for my dance moves.
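For anyone who wants to peek under the hood: the whole thing is barely a screen of Python. Here's a minimal sketch of that pipeline, assuming the device (or anything else) POSTs a voice file to a webhook. The endpoint name, payload shape, and model choices are my stand-ins for illustration, not OMI's official API.

```python
# Minimal sketch: voice file in, transcript + bullet-point summary out.
# Assumes OPENAI_API_KEY is set; endpoint and field names are my own.
import os
import tempfile

from flask import Flask, jsonify, request
from openai import OpenAI

app = Flask(__name__)
client = OpenAI()  # reads OPENAI_API_KEY from the environment

@app.route("/memo", methods=["POST"])
def memo():
    # 1. Catch the voice file (multipart form field named "audio").
    upload = request.files["audio"]
    with tempfile.NamedTemporaryFile(suffix=".wav", delete=False) as tmp:
        upload.save(tmp.name)
        path = tmp.name

    # 2. Transcribe the rambling.
    with open(path, "rb") as f:
        transcript = client.audio.transcriptions.create(
            model="whisper-1", file=f
        ).text
    os.remove(path)

    # 3. Summarize it into something my future self can actually use.
    summary = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Summarize this voice memo as 3-5 crisp bullet points."},
            {"role": "user", "content": transcript},
        ],
    ).choices[0].message.content

    return jsonify({"transcript": transcript, "summary": summary})

if __name__ == "__main__":
    app.run(port=8080)
```

That's it. No framework gymnastics, no queueing system. A webhook, two model calls, and suddenly my three-minute rambles come back as bullet points.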
The "Hey Omi" Saga: A Comedy of Errors
Here's where things get interesting. I wanted to create my own "Hey Siri" moment. You know, casual voice activation, like I'm Tony Stark calling JARVIS.
2:00 PM: Begin testing wake words.
2:15 PM: "Hey Omi" becomes "Het Homie." My AI thinks I'm greeting my friend from the '90s.
2:30 PM: "Hey Omi" becomes "Hi Amy." Somewhere, an Amy wonders why I keep calling her.
2:45 PM: "Hey Omi" becomes "Hey, oh me." Now it's getting philosophical.
3:00 PM: My daughter suggests I'm pronouncing it wrong. I suggest she's grounded.
3:30 PM: Breakthrough! Fuzzy matching algorithm implemented. If it's 70% close to "Hey Omi," we're in business. (Sketch after the timeline, for the curious.)
3:45 PM: "Hay Omelet" triggers a response. Close enough.
4:00 PM: Fine-tuning complete. 94% accuracy. The other 6%? Let's just say if you mumble "Hey Salami" at my device, you might still get the weather.
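For the builders wondering what "fuzzy matching" means here: a minimal sketch using only Python's standard library. The 70% threshold is the real number from that afternoon; the function names and the two-word sliding window are just my illustration of one way to do it, not Omi's internals.

```python
# Fuzzy wake-word matching: accept anything "close enough" to "Hey Omi".
from difflib import SequenceMatcher

WAKE_WORD = "hey omi"
THRESHOLD = 0.70  # 70% similar is close enough

def similarity(a: str, b: str) -> float:
    """Ratio between 0.0 and 1.0 for how alike two strings are."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def heard_wake_word(transcript: str) -> bool:
    """Slide a two-word window across the transcript and fuzzy-match each pair."""
    words = transcript.lower().split()
    pairs = [" ".join(words[i:i + 2]) for i in range(len(words) - 1)]
    return any(similarity(p, WAKE_WORD) >= THRESHOLD for p in pairs)

# A few of the afternoon's greatest hits:
for phrase in ["het homie what's up", "hey salami give me the weather", "pass the ketchup"]:
    print(phrase, "->", heard_wake_word(phrase))
```

The sliding window is the important part: the wake word rarely arrives alone, so you match every adjacent word pair against "hey omi" instead of the whole sentence.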
The Integration Marathon: Making Magic Happen
6:00 PM: "Hey Omi, what's the capital of Mongolia?" (Testing random knowledge)
Buzz. "Ulaanbaatar. Fun fact: It's the coldest capital city in the world."
My device is now smarter than my last three Tuesday trivia performances combined.
7:00 PM: Slack integration complete. Now my random thoughts get broadcast to a channel I optimistically named "Justin's Genius Ideas."
8:00 PM: Pushover notifications set up. My phone now buzzes with answers faster than my kids respond to "chicken nuggets are ready."
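Both of those integrations are about ten lines each. Here's a rough sketch, assuming you've created a Slack incoming webhook for the channel and a Pushover app token and user key; the helper names and environment variables are mine, not anything official from Omi, Slack, or Pushover.

```python
# Two tiny "get it in front of me" integrations: Slack for idea capture,
# Pushover for the phone buzz. Env var names are my own convention.
import os
import requests

SLACK_WEBHOOK_URL = os.environ["SLACK_WEBHOOK_URL"]   # points at the genius-ideas channel
PUSHOVER_TOKEN = os.environ["PUSHOVER_TOKEN"]
PUSHOVER_USER = os.environ["PUSHOVER_USER"]

def post_to_slack(summary: str) -> None:
    """Drop a summarized thought into the Slack channel via incoming webhook."""
    requests.post(SLACK_WEBHOOK_URL, json={"text": summary}, timeout=10)

def buzz_phone(answer: str) -> None:
    """Push an answer to my phone via the Pushover messages API."""
    requests.post(
        "https://api.pushover.net/1/messages.json",
        data={"token": PUSHOVER_TOKEN, "user": PUSHOVER_USER, "message": answer},
        timeout=10,
    )

# e.g. after the wake word fires and the model answers:
# buzz_phone("Today will be partly cloudy with a high of 72°F.")
```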
Let's Talk Executive Math™ (Or: How to Justify This to Your Spouse/CFO)
Here's where it gets interesting for the spreadsheet lovers among us: yes, the minutes add up fast. But the real kicker isn't the time saved. It's the thoughts captured, the ideas preserved, the "what was that thing Johnson said about Q4?" questions answered instantly.
Last week alone:
Captured 47 meeting notes while walking between conference rooms
Answered 134 random questions (32% were "What time is it?" because I'm too lazy to check my phone)
Saved approximately 7 brilliant ideas that would've been forgotten by the time I found a pen
The Vision: This Is Just Sunday, Folks
Here's what gets me excited enough to talk to my chest in public:
Today: "Hey Omi, what's the weather?" Simple questions, simple answers.
Next Week: "Hey Omi, summarize my last three meetings with Sarah." Context awareness kicks in.
Next Month: "Hey Omi, what are the key themes from all my customer calls this quarter?" Personal knowledge base activated.
Next Quarter: Every team member with their own Omi. Collective intelligence emerges. We become the Borg, but with better work-life balance.
The Dream: I walk into a meeting, and Omi whispers (through my earbuds), "Based on Johnson's email sentiment analysis, he's worried about timeline. Lead with the accelerated delivery plan."
The Meta Ray-Bans Moment (Or: Why Open Source Wins)
Quick sidebar: I own Meta Ray-Bans. They're great for looking like I'm in a spy movie. But they're a walled garden. Beautiful walls, but walls nonetheless.
Omi? It's more like an open field. With a playground. And a maker space. And a "build whatever crazy thing you can imagine" sign at the entrance.
When Omi releases glasses (coming soon™), I'll be able to build:
Visual recognition + voice queries ("What wine pairs with what I'm looking at?")
Real-time translation overlays
"Where did I leave my keys?" with actual answers
Meeting notes that include whiteboard captures
Try doing that with closed-source hardware. I'll wait.
The Part Where I Get Philosophical (Bear With Me)
We're at an inflection point. Not just in technology, but in how we interact with information. We've gone from:
Stone tablets → Paper → Keyboards → Touchscreens → ...Voice?
But here's the thing—voice isn't just another interface. It's the removal of interface. It's thought-to-action at the speed of speech.
Last Tuesday, I watched a colleague frantically typing notes during an important meeting. Thirty minutes later, they asked, "What was that thing... about the thing?"
Crickets.
Meanwhile, I asked Omi and had the answer in three seconds. Not because I'm smarter. Because I'm lazier, and lazy people find efficient solutions.
Your Move, Future Builder
Look, I could tell you to rush out and buy an Omi. But that's not the point. The point is this:
The future of work isn't about typing faster. It's about not typing at all.
It's about capturing every brilliant idea that strikes during your shower. It's about never losing that perfect turn of phrase from a client call. It's about having an AI assistant that actually assists, not just responds.
So here's my challenge:
Find your Sunday. Block 12 hours. Less time than a Netflix binge.
Pick your experiment. Omi, or something else. Just start.
Build something ridiculous. My wake word could've been "Oh Great Computer." (My wife vetoed this.)
Share your story. Seriously, I want to hear what you build.
The Call to Adventure (Yes, I'm Talking to You)
Got a wild idea for what voice AI could do in your workflow? Think my "Executive Math™" is too conservative? Want to debate whether "Hey Omi" is better than "Yo, Robot"?
Let's talk. Find me on LinkedIn, carrier pigeon, or just shout "Hey Omi, connect me with that guy who talks to his necklace."
Because here's the secret: This isn't about one device, one app, or one Sunday of coding. It's about reimagining how we work, think, and capture the constant stream of human brilliance that flows through our days.
The revolution won't be typed. It'll be spoken.
And it starts with three words: "Hey Omi..."
P.S. - Current wake word accuracy: 94%. Number of times my family has said "Hey Omi" just to mess with me: 2,847. Number of times it's actually helped me remember something important: Priceless.
P.P.S. - To anyone who asks why I spent my weekend talking to jewelry: It heard you. It's planning its response. Be afraid.
About the Author: Justin is a technical leader who believes the best innovations happen when you're too lazy to do things the hard way. When he's not having philosophical conversations with his AI necklace, he's probably explaining to his family why talking to inanimate objects is totally normal in 2025. His Omi has been programmed to agree with him.