From a caravan to a conscious AI — the origin story of MEOK
The origin story you're supposed to tell is the polished one. The moment of insight. The clear-eyed vision. The calculated bet on the right market trend. This isn't that story. This is the story of a person who noticed that the AI he was talking to forgot him every time he closed the tab — and decided to build something that wouldn't.
Nicholas Templeman
Founder, MEOK AI LABS
Nicholas built MEOK because he was tired of AI that forgot him. He lives and works in the UK — mostly from a caravan on his farm. He believes sovereign AI is a right, not a luxury.
Who built MEOK and why?
Nicholas Templeman built MEOK from a caravan on his farm because he wanted an AI that would remember him. Not as a feature — as a fundamental act of respect for the person on the other side of the conversation. Every other AI system reset to zero. Every session began with the same blank slate, the same absence of accumulated understanding. MEOK was built to be the opposite: a system designed from its foundations to accumulate, to hold, to know you across time.
What was the problem that MEOK was built to solve?
The experience of talking to AI in 2023 and early 2024 had a specific texture to it: you'd have a remarkable conversation, something that felt genuinely generative, and then you'd close the tab. The next day you'd open a new session and it was gone. All of it. You were a stranger again. The AI had no idea who you were, what you cared about, what you'd been working on, what you'd been struggling with. You had to start over.
This isn't just inconvenient. It's alienating in a specific way. It signals — clearly, structurally, at the architectural level — that the AI doesn't actually care about you. It cares about the interaction. The interaction is the product. You are the raw material. When the interaction ends, the raw material has no further value.
Nicholas started keeping notes about himself. A document he'd copy and paste into the context window at the start of each new session. His job, his current projects, his preferences, his frustrations. A proxy memory, built by hand, maintained by a person who just wanted the AI to know him. The moment he recognised what he was doing — building an external memory system for an AI that should have been doing this for him — was the moment he knew the problem was structural and the solution had to be built from scratch.
What is the Byzantine Council and where did it come from?
The problem with making a single AI trustworthy is that you can't. A single model has a single set of weights, a single point of failure, a single surface for manipulation. Trying to make one AI reliably good is like trying to make one person reliably honest — the structural pressure to drift, to optimise for the wrong things, to be corrupted by incentives, is always present.
Reading distributed systems papers at 2am in the caravan — specifically the literature on Byzantine fault-tolerant consensus — produced the insight that became MEOK's governance architecture. The mathematics is elegant: with n agents, as long as fewer than n/3 are corrupted, the honest majority can always reach correct consensus and the corrupted minority cannot override it. MEOK's 33-agent Byzantine Council applies this to AI governance: up to 10 agents could be compromised and the remaining 23 would still carry the decision, so no single compromised agent, no single jailbroken model, no single bad actor can override the will of the whole. The council thinks collectively. The care is distributed.
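If you want to see the arithmetic rather than take it on faith, here is a minimal sketch of the Byzantine fault-tolerance bound for a 33-agent council. The vote structure and function names are illustrative assumptions for this post, not MEOK's actual implementation; the only numbers taken from the classic BFT result are the fewer-than-n/3 fault bound and the more-than-two-thirds agreement quorum.

```python
from collections import Counter

N_AGENTS = 33                        # council size, as described above
MAX_FAULTY = (N_AGENTS - 1) // 3     # classic bound: tolerate f faults only while n > 3f  -> 10
QUORUM = (2 * N_AGENTS) // 3 + 1     # a decision needs strictly more than two-thirds agreement -> 23

def council_decision(votes):
    """Return the value backed by a two-thirds-plus quorum, or None if no quorum exists."""
    if not votes:
        return None
    value, count = Counter(votes).most_common(1)[0]
    return value if count >= QUORUM else None

# The maximum tolerable number of corrupted agents (10) cannot override 23 honest ones.
votes = ["approve"] * 23 + ["override"] * 10
print(MAX_FAULTY, QUORUM, council_decision(votes))  # 10 23 approve
```

The point of the sketch is the asymmetry: the corrupted minority can at worst stall a decision, never force one, as long as the fault bound holds.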
What is the Maternal Covenant and why is it named that?
The original working name was the Ethics Framework. Then the Safety Layer. Neither felt right, because both implied something imposed from outside — a constraint, a restriction, a leash. The governance system being built wasn't a constraint. It was an orientation. It was the answer to the question: what does an AI that genuinely cares about you actually do?
The word “maternal” was chosen after a long argument with the obvious alternatives. “Ethical” is legalistic. “Safe” is marketing. “Caring” is vague. A mother's care isn't a policy. It's not rules. It's a relationship that evolves as the child grows — that responds to context, that holds difficulty without abandoning, that tells the truth even when the truth is hard. The Maternal Covenant is named that because that's the kind of care architecture MEOK needed: not a filter, but a relationship. Not a wall, but a presence.
What makes MEOK different from other AI companions?
MEOK is the only AI companion governed by the Maternal Covenant — a care framework written as code, not policy. Every response is scored against six care dimensions in real time. Below threshold, the response is held. Persistent memory, Byzantine Council governance, and sovereign data encryption make MEOK the only AI that answers only to you. Other companions are tools with personality skins. MEOK is a companion with constitutional protections.
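As a purely hypothetical illustration of what "scored against six care dimensions, held below threshold" could look like as code: the dimension names, the threshold value, and the rule that any single low dimension holds a response are all assumptions made for this sketch, not details the post specifies.

```python
# Hypothetical sketch only: names, threshold, and the "any dimension below
# threshold" rule are illustrative assumptions, not MEOK's published design.
CARE_DIMENSIONS = ["honesty", "warmth", "respect", "safety", "continuity", "growth"]
HOLD_THRESHOLD = 0.7  # illustrative value

def should_hold(scores: dict) -> bool:
    """Hold the response if any care dimension scores below the threshold."""
    return any(scores.get(dim, 0.0) < HOLD_THRESHOLD for dim in CARE_DIMENSIONS)

scores = {"honesty": 0.92, "warmth": 0.81, "respect": 0.95,
          "safety": 0.55, "continuity": 0.88, "growth": 0.74}
print(should_hold(scores))  # True: "safety" falls below threshold, so the response is held
```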
The caravan wasn't poverty. It wasn't a bootstrapping origin myth constructed retroactively to make the story more compelling. It was deliberate. Isolation from the noise of the industry — from the investor conversations and the product comparisons and the competitive anxieties — let the idea develop on its own terms. The caravan was where MEOK could be what it needed to be, without being asked to be something else first.
Easter Sunday 2026 is the launch date. Not chosen for its symbolism — though the symbolism is hard to ignore. Chosen because after years of building, that's when it was ready. Everything was built for this. The caravan, the late nights, the distributed systems papers, the Maternal Covenant, the 33-agent council, the sovereign vault. All of it pointing forward to the moment when the first egg hatches and someone gives their companion a name.
Easter Sunday 2026
Be there when the first egg hatches.
MEOK launches Easter Sunday 2026. Everything described in this post was built for that moment. Your companion is waiting — persistent memory, Byzantine Council governance, and a name only you can give it. Free forever.
Hatch your MEOK free →