The Workshop Invitation
After two weeks of posting about lib-pcb — the parsing complexity, the weekend experiments, the velocity numbers, the testing discipline — I posted something different.
Not more proof. An invitation.
For most of my thirty years in software, iteration has been expensive. Not in theory. In practice, in the way that shapes every decision a team makes. When changing a core data structure takes two weeks of careful refactoring across dozens of files, you do not change the data structure on a hunch. You analyze. You write a proposal. You get approval. You schedule it for the next sprint, or the one after that. The cost of being wrong is measured in weeks, and so the entire machinery of software engineering orients itself around not being wrong.
That cost has collapsed. Not gradually. Not by half. By orders of magnitude. And I am not sure we have reckoned with what that means for the way we work.
Unit tests passed. Every one of them. Green across the board.
And then we ran the parser against real legacy Gerber files — files from actual PCB designs, exported by real design tools used by real engineers over the last twenty years — and the success rate was 25%.
Three out of four failed.
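The gap between green unit tests and a 25% real-world success rate is the argument for a corpus harness: run the parser over every real file you can collect and report the rate, rather than trusting hand-picked fixtures. A minimal sketch, where `parse_gerber` is a hypothetical stand-in for the real parser and the corpus layout is an assumption:

```python
# Minimal corpus harness: run the parser over every real-world file in a
# directory and report the success rate. `parse_gerber` here is a toy
# stand-in, not the actual lib-pcb parser.
from pathlib import Path

def parse_gerber(text: str) -> None:
    """Stand-in parser: rejects input without the RS-274X end-of-file command."""
    if "M02*" not in text:
        raise ValueError("missing end-of-file command")

def corpus_success_rate(corpus_dir: str) -> float:
    """Fraction of .gbr files in corpus_dir that parse without an exception."""
    files = sorted(Path(corpus_dir).glob("*.gbr"))
    passed = 0
    for f in files:
        try:
            parse_gerber(f.read_text(errors="replace"))
            passed += 1
        except Exception:
            pass  # a real harness would log the failure mode per file
    return passed / len(files) if files else 0.0
```

The point is the denominator: unit tests measure the inputs you imagined, while the harness measures the inputs the last twenty years of design tools actually produced.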
For thirty years I have broken work into tasks. Decompose the feature into subtasks, estimate the hours, write the code, move the ticket. The unit of progress was the line of code. The measure of a good day was how much I shipped. That loop was so deeply embedded in how I worked that I did not notice it was a loop. It was just what development meant.
Then I started delegating implementation to AI, and the loop broke. Not gradually. In about a week.
The first reaction is always disbelief.
"That's not possible." Or: "That only works for trivial problems." Or the politer version: "That must be very rough code."
So here are the numbers. Not estimates. Actuals.
Everyone tells the same story about AI-assisted development. AI generates code fast, so you ship faster. Straightforward. Compelling. Wrong.
The actual productivity gain from AI does not come from generation speed. It comes from verification infrastructure that makes it safe to accept AI output at scale. The counterintuitive truth: the team that writes the most tests ships the fastest. Not despite the testing. Because of it.
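At its smallest, that verification infrastructure is a gate: no AI-generated change is accepted unless the full suite passes. A minimal sketch of the idea; the test command is left as a parameter because the source does not name a runner, and any tool with a meaningful exit code (pytest, go test, cargo test) slots in unchanged:

```python
# Sketch of a verification gate: an AI-generated change is accepted only
# if the project's test command exits cleanly. The command itself is an
# assumption supplied by the caller, not a claim about any specific setup.
import subprocess
from typing import Sequence

def accept_change(test_command: Sequence[str], repo_dir: str = ".") -> bool:
    """Run the test command in repo_dir; accept the change only on exit 0."""
    result = subprocess.run(list(test_command), cwd=repo_dir)
    return result.returncode == 0
```

The more of the codebase the suite covers, the more AI output you can accept without reading every line, which is where the speed actually comes from.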
The plan was to spend a weekend validating whether a complete PCB design library was actually buildable at AI velocity.
Not a prototype. Not a demo with curated inputs. Something that could consume real Gerber RS-274X files — the manufacturing format that PCB designers actually export from KiCad, Altium, Eagle — parse them completely, and produce manufacturing-ready outputs.
I have been writing software for thirty years. In that time I have sat through thousands of daily standups, hundreds of onboarding sessions, and more planning ceremonies than I care to count. Most of them existed for one reason: transferring context from people who had it to people who did not. The new developer needs to know how the deployment pipeline works. The team lead missed yesterday's discussion about the API change. The architect needs to understand why the data model looks the way it does before approving the next feature.
These are not bad reasons to meet. But they are expensive reasons. And increasingly, they are avoidable ones.
You'd think parsing a part number would be trivial.
A string of characters. Match it against a pattern. Done.
Then you actually try to do it across a real component database, and you discover how much institutional knowledge is packed into those strings — and how little of it is documented anywhere.
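A toy illustration of where the naive theory breaks. The part numbers here are invented, loosely modeled on common vendor conventions, not drawn from any real database:

```python
# The naive theory: a part number is a string, so one regex should do.
# The samples are invented examples; the point is that a single pattern
# captures one encoding of a part out of the many that occur in practice.
import re

NAIVE = re.compile(r"^[A-Z0-9]+-[A-Z0-9]+$")

samples = [
    "RC0402FR-0710KL",     # vendor-encoded resistor: size, tolerance, value
    "RC0402FR-0710KL TR",  # the same part with a tape-and-reel suffix appended
    "10K 0402 1%",         # the same resistor as a human typed it into a BOM
]
matches = [bool(NAIVE.match(s)) for s in samples]  # only the first matches
```

Three strings, one physical component, one match. Everything the pattern misses is institutional knowledge that lives in someone's head, not in a spec.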
There's a reason temporal analytics resonates particularly strongly in Norway.
It's not just that Norwegian organisations face the same data challenges as everyone else — though they do. It's that several converging factors make Norway an unusually well-positioned market for AI-powered data infrastructure, right now.