There's a particular kind of professional frustration that doesn't have a clean name.
It's not failure. The work is good, the product is working, users are responding. It's the feeling of watching something real get abandoned right at the moment it starts to matter — not because it didn't work, but because the person holding the map decided to look somewhere else.
I spent thirteen months at Huzi AI feeling that feeling. I also did some of the most technically interesting work of my career there. Both things are true.
How do you turn a proof-of-concept into a production product?
Eric, Huzi's founder and CEO, found me through TikTok. I was one of the earlier builders posting seriously about AI product development, and he'd been watching. In late 2023, while I was thinking through the end of GPTBoss, he reached out. He'd just closed a capital raise and needed someone to turn his proof-of-concept into a real product.
The proof-of-concept was a Bubble app — a no-code prototype commissioned from a Bubble agency, built to demonstrate the idea to investors alongside decks and Eric's personal track record. It had done its job. Now it needed to become something that could actually scale.
I said yes. The vision was compelling: an AI worker for real estate agents, handling the compliance-heavy, documentation-heavy administrative work that most agents outsource to virtual assistants. Tedious, time-consuming, high-stakes work that was exactly the kind of thing early agentic AI was starting to be able to handle.
The first thing I did was rebuild from the ground up.
Auth and payments went first — reconstructed properly in Next.js, replacing the no-code equivalents that couldn't support what we were trying to build. Then a monorepo to colocate all the agentic experiments and interface iterations inside a single platform. Proper PostgreSQL for structured data, Cloudflare R2 for document and object storage. The unglamorous work that nobody sees and everybody depends on.
It's the kind of rebuild that looks like nothing from the outside and changes everything from the inside.
What did it take to build AI document workflows before the tooling existed?
The flagship product idea was a listing enhancer. Feed it an address; it enriches the data, pulls comparables, runs the relevant compliance checks, and produces a report the agent can bring to a client meeting — a professional, structured document that walks them through everything they need to know about a transaction.
The problem was that the tooling to build this didn't really exist yet.
Vercel's AI SDK — now standard infrastructure for this kind of work — wasn't commercially available when I needed it. So I built the equivalent internally. Harnesses for chaining model calls, managing context across long-running tasks, handling OCR pipelines, structuring outputs into usable schemas. Multi-agent orchestration from scratch, in TypeScript and Bun, before the industry had agreed on what that was supposed to look like.
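The shape of those harnesses can be sketched in a few lines. This is an illustrative reconstruction, not Huzi's actual code: the step-composition pattern is the real idea, while `callModel`, `parseListingFacts`, and the field names are stand-ins I've invented for the sketch (a real harness would hit an LLM API and a richer schema validator).

```typescript
// A "step" is any async transformation; chaining steps is how long
// multi-call tasks carry context forward.
type Step<In, Out> = (input: In) => Promise<Out>;

// Compose two steps so the first one's output feeds the second.
function chain<A, B, C>(first: Step<A, B>, second: Step<B, C>): Step<A, C> {
  return async (input: A) => second(await first(input));
}

// The "structured outputs" piece: validate free-form model output
// against a schema before letting it flow downstream.
interface ListingFacts {
  address: string;
  beds: number;
}

function parseListingFacts(raw: string): ListingFacts {
  const parsed = JSON.parse(raw);
  if (typeof parsed.address !== "string" || typeof parsed.beds !== "number") {
    throw new Error("model output failed schema check");
  }
  return parsed;
}

// Stand-in model call — deterministic here so the sketch runs as-is.
const extractFacts: Step<string, string> = async (address) =>
  JSON.stringify({ address, beds: 3 });

const validate: Step<string, ListingFacts> = async (raw) =>
  parseListingFacts(raw);

// A two-step pipeline: call the "model", then enforce the schema.
const pipeline = chain(extractFacts, validate);
```

Real chains were longer (enrichment, comparables, compliance checks, report assembly), but every link followed this pattern: a typed step whose output is validated before the next step sees it.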
It worked. The listing enhancer produced real outputs. The document workflows ran. The platform scaled.
What I was building, in retrospect, was a small piece of the infrastructure the industry later standardized around. I just did it quietly, under NDA, inside a startup in Portland.
How do you grow junior engineers into autonomous project leads?
About five months in, the first of two junior engineers joined. I took the mentorship seriously — not because I was asked to, but because I'd always seen "guy who turns juniors into seniors" as a core part of the job, not a side responsibility.
The approach wasn't about technical mentorship in the traditional sense — pair programming, code review, architectural guidance. Those things happened, but they weren't the core of it.
The real bottleneck in early-stage engineering isn't technical complexity. It's context loss.
You work all day on a feature, make real progress, understand exactly what "done" looks like. You sleep. You wake up the next morning with a slightly fuzzier picture. A week passes. The original intent has drifted. You're solving a slightly different problem than the one you started with, and nobody has noticed because the code still compiles.
I was militant about working from tickets. Not as a process formality — as a memory prosthetic. A ticket isn't bureaucracy; it's a written record of what done looks like, approved by the person who asked for the thing, before a single line of code gets written. It externalizes the context so the overnight brain-drain can't touch it.
I trained both engineers on the full translation pipeline: how non-technical stakeholders communicate (vague, outcome-oriented, often emotionally coded), how to convert that into a tech spec, how to get the spec approved before starting work, and how to coordinate async from a checklist of conditions for success. Within six months, both of them were running their own workstreams without hand-holding.
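The ticket discipline above can be expressed as data. This is a hedged sketch, not Huzi's actual tooling — the type and field names are mine — but it captures the rule we worked by: no code until "done" is written down, approved, and checkable.

```typescript
// A ticket as a memory prosthetic: the spec and the definition of
// "done" are externalized before any code is written.
interface Ticket {
  title: string;
  requestedBy: string;            // the (often non-technical) stakeholder
  spec: string;                   // the vague ask translated into a tech spec
  approved: boolean;              // stakeholder signed off on the spec
  conditionsForSuccess: string[]; // the checklist async work runs from
}

// The gate: work starts only when the spec exists, is approved, and
// "done" is spelled out as concrete, checkable conditions.
function readyToStart(t: Ticket): boolean {
  return (
    t.approved &&
    t.spec.trim().length > 0 &&
    t.conditionsForSuccess.length > 0
  );
}
```

The `readyToStart` gate is the whole methodology in one predicate: if it returns `false`, the problem isn't the engineer's skill, it's that nobody has yet written down what they're supposed to build.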
The methodology works because it solves the actual problem, which is never "this engineer doesn't know enough." It's almost always "this engineer doesn't have a clear enough picture of what they're supposed to build."
What happens when a product finds traction and the founder pivots anyway?
This is the part of the Huzi story that's harder to tell cleanly.
Eric was genuinely brilliant — well-capitalized, relentlessly experimental, deeply connected in the real estate industry. He'd built his leadership team from people with serious pedigree: former real estate agency owners, brokers, the kind of operators who specialize in developing raw sales talent into high performers. People who understand human performance at a granular level, who can spot the difference between a fixable failure mode and a structural limitation.
We ran a lot of experiments. The VA-replacement idea stalled on OCR development timelines. We pivoted to a chat assistant for agents. Then the listing enhancer. Then an enterprise SaaS for large brokerages that would collate internal runbooks and documentation and make institutional knowledge accessible to staff.
Each of these ideas found something. Users responded. Metrics moved. The signals that indicate product-market fit were showing up, tentatively but genuinely, across multiple directions.
And then we'd pivot.
I came from a background of having found PMF myself — GPTBoss had done it scrappily, with a TikTok and a broken free trial. I knew what the early signals looked like. I knew the difference between an idea that hadn't worked yet and an idea that was starting to work. What I couldn't understand was why you'd abandon the latter.
Eric was in an exploration loop. Every discovery led to another question rather than an attempt to exploit what we'd found. That's a valid approach to early-stage product development — but it has a cost, and the cost compounds over time.
I left in October 2024. The honest version is that I'd run out of patience for the loop, and I was becoming a worse colleague because of it. The structural version is that I couldn't build what I actually wanted to build — a genuine "do the work" button, an AI employee that closes tasks rather than assists with them — while the product direction was changing every few months.
Both versions are true.
What does working with expert talent developers actually teach you?
The unexpected gift of Huzi was the people.
Eric's senior leadership team were, in the real estate industry's specific vocabulary, talent developers — people who take raw sales potential and coax it into consistent, high performance. Not just managers. Not just trainers. People who have spent careers understanding why some people bloom under pressure and others don't, why the same coaching lands differently depending on the person receiving it, why performance breaks down in specific and predictable ways.
Working closely with people like that — trying to translate their instincts about human behavior into product requirements I could actually build — was like having three-quarters of a Rosetta Stone.
They knew things. Deep, practitioner-level things about motivation, about failure modes, about what makes someone pick up the phone on the fifteenth call versus the fifth. But the knowledge lived in their bodies and their pattern recognition, not in language I could immediately use. I had to decode it — pull out the underlying model through careful observation and pointed questions, then figure out what that model implied about the software we were building.
That process sent me deep into neurophysiology and cognitive psychology in a way nothing else had. I started reading seriously: behavioral economics, cognitive load theory, the neuroscience of decision-making. Not as academic interest — as applied research. I needed to understand the "why" behind what these people were telling me, because understanding the why was the only way to translate their expertise into product decisions.
That intellectual thread is still running. The current work I'm doing — understanding how cognitive architecture shapes human-AI interaction — grows directly out of those thirteen months of trying to decode what Eric's team actually knew.
What's the real value of a job that didn't go the way you planned?
Huzi didn't end the way I'd have written it.
The product didn't find its definitive form while I was there. I didn't get to build the agent I wanted to build. The exploration-without-exploitation dynamic wore me down in ways I'm not entirely proud of.
But the technical output was real. I built infrastructure that worked. I mentored two engineers who grew substantially. I pioneered orchestration patterns that the industry later standardized. I shipped production-grade AI systems for a real customer base in a demanding vertical.
And I came away with something I didn't expect: a serious intellectual grounding in why people behave the way they do, gifted to me indirectly by a team of people who'd spent their careers figuring out how to make other people great.
That's not nothing. That's, in some ways, the most durable thing I've taken from any job.
The work I'm planning next is pointed directly at what I couldn't build at Huzi: a genuine AI employee, narrow-purpose, that takes a task and closes it without a human in the loop at every step. The research foundation to build it properly came, in large part, from thirteen months of trying to understand what those talent developers actually knew.
Sometimes the job that frustrates you the most is the one that prepares you best.