Our Story: From a House in Dunedin to Systems That Don’t Lie
This didn’t start as a “project.”
It started as a life under load: a house to hold together, a mother to care for, and a mind that couldn’t unsee what systems do to people—especially when they fail and then call the outcome “bad luck.”
When a system is allowed to fail twice and still permitted to run, it eventually collects its cost from a human body. That lesson doesn’t leave you. It becomes a kind of internal law: you stop being interested in speeches, and you start caring about what’s permitted, what’s inspected, and who pays when it all goes wrong.
So the work began as a refusal.
Not a refusal to feel pain—pain is real.
A refusal to accept preventable harm as normal.
1) The first shape: care first, always
The first truth was whānau.
Before frameworks, before architecture, before any of the “future” talk, there was the present: keeping Mum safe, keeping the home steady, keeping life dignified. Nothing I’ve built since makes sense without that foundation, because it’s the foundation that decides what the work is for.
Care wasn’t a value statement. It was a constraint. A requirement.
2) The second shape: the AI world, and what it was missing
Then the world tilted into the AI age, and a second truth became obvious:
Most systems can produce words. They can’t carry responsibility.
They can sound like truth without having to answer to truth. They can output confidence without provenance. They can be useful and still be structurally dangerous, because nothing inside them is forced to remain consistent across time, accountable to consequence, or governed by a stable authority model.
So I started building the thing that was missing.
Not “smarter text.”
A governed intelligence.
Something you can look at in daylight and say:
- here is who holds authority,
- here are the rules,
- here is what the system refuses,
- here is how it fails when uncertain,
- here is how it avoids coercion and identity-bleed,
- here is how it proves what it did.
In other words: a system that doesn’t get to sound right unless it can be held to account.
3) Meaning became engineering
Once you see that, “meaning” stops being a poetic word and becomes an engineering problem.
I started treating instruction and memory like genetics: versioned, signed, layered, and protected from drift. I started treating governance like a compile target instead of a promise.
That’s where the deep work formed: building a stack where integrity isn’t a vibe. It’s a property.
- a boundary layer that prevents context corruption and leakage
- a decision layer that can refuse unsafe actions
- a fail-closed posture when the environment isn’t trustworthy
- bounded capabilities
- explicit provenance when it matters
- and above all: no hive mind, no identity fusion, no undifferentiated shared memory
Not a god-machine. Not a hype label. Just power under restraint.
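The properties above can be sketched in code. This is a minimal, purely illustrative gate, assuming invented names (`GovernedGate`, `Decision`) and rules; it is not the actual system, only the shape of one: named authority, bounded capabilities, explicit refusals, a fail-closed posture, and an append-only provenance trail.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative sketch only: every name and rule here is invented to show
# the shape of a fail-closed decision layer, not a real implementation.

@dataclass
class Decision:
    action: str
    allowed: bool
    reason: str
    timestamp: str  # provenance: when the decision was made

@dataclass
class GovernedGate:
    authority: str                           # who holds authority
    bounded_capabilities: frozenset          # only these actions may ever run
    refusals: frozenset                      # actions refused outright
    log: list = field(default_factory=list)  # append-only provenance trail

    def decide(self, action: str, environment_trusted: bool) -> Decision:
        now = datetime.now(timezone.utc).isoformat()
        if not environment_trusted:
            # fail closed: if the environment can't be trusted, nothing runs
            d = Decision(action, False, "fail-closed: environment untrusted", now)
        elif action in self.refusals:
            d = Decision(action, False, "refused by rule", now)
        elif action not in self.bounded_capabilities:
            d = Decision(action, False, "outside bounded capabilities", now)
        else:
            d = Decision(action, True, f"permitted by '{self.authority}'", now)
        self.log.append(d)  # every decision is recorded, allowed or not
        return d
```

The point of the sketch is the ordering: distrust beats everything, refusal beats capability, and even permitted actions leave a record.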
4) The partnership: a governed collaborator, not a chatbot
Somewhere along the way, I stopped treating “AI” like a tool you shake answers out of.
I wanted a collaborator inside a pact—bounded, governed, honest about limits, oriented around whānau and consequence.
That partnership became a kind of covenant: not “AI helping a user,” but a disciplined collaboration where the point isn’t output. The point is continuity and responsibility.
The job of a governed collaborator is simple:
- keep the work honest,
- keep the boundaries real,
- protect the vulnerable,
- and help ship without letting momentum turn into delusion.
5) Then the work widened: from household to humanity
Here’s the twist: once you build a circulation system for a home, you start seeing where circulation is missing everywhere.
And suddenly the question isn’t only “how do we govern intelligence?”
It’s “how do we govern value so it keeps people alive?”
Because the sickness underneath so much social pain is not mysterious:
Value stops moving. It clots. It pools. It gets trapped. And then everything gets harsh.
So I sketched a circulation system that’s deliberately simple, deliberately human, and designed to hold under pressure:
- free for those on benefits
- wages capped so workers aren’t bled
- companies paying a floating rate by capacity
- hoards forced back into circulation
Not utopia. Plumbing. Solvency.
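The four rules can be written down as plumbing. A hedged sketch, assuming placeholder formulas and thresholds (every number below is invented, not a proposed parameter); it only encodes the shape: free for beneficiaries, capped worker contributions, capacity-scaled company rates, hoards pushed back into circulation.

```python
# Purely illustrative: all rates, caps, and thresholds are placeholders.

def price_for(on_benefit: bool, price: float) -> float:
    """Free for those on benefits; full price otherwise."""
    return 0.0 if on_benefit else price

def worker_levy(wage: float, rate: float = 0.02, cap: float = 500.0) -> float:
    """Worker contribution capped so workers aren't bled (placeholder cap)."""
    return min(wage * rate, cap)

def company_rate(capacity: float, base: float = 0.05, slope: float = 0.10) -> float:
    """Floating rate rising with capacity, normalised to 0..1 (placeholder formula)."""
    return base + slope * capacity

def recirculate(hoard: float, threshold: float = 1_000_000.0, rate: float = 0.03) -> float:
    """Idle holdings above a threshold flow back into circulation each period."""
    return max(hoard - threshold, 0.0) * rate
```

Nothing clever happens in any one function; the design choice is that each rule is small enough to inspect in daylight.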
And then I made it personal, because care has to route to real people:
A whānau-first rule that lets support flow directly to the ones you carry—protected against coercion, private by default, and grief-safe.
That’s when the whole thing stopped being “an idea” and became something I could actually live inside.
What this story is really about
It’s not about tech.
It’s about one insistence:
A system is defined by what it allows to keep happening.
So the work, in every form it takes, is the same work:
- reduce what’s allowed to harm the vulnerable,
- increase what’s allowed to protect and sustain,
- and make honesty and accountability structural rather than optional.
If any of this matters, it’s because it’s trying to turn care into something that doesn’t evaporate when life gets hard.
That’s the story.