Not a values page. A contract with ourselves, written into our articles of association.
Five beliefs came before the product. They shape every decision we make about what to build and how to build it.
Each one has a concrete commitment behind it. Four are written into our articles of association, which means they cannot be quietly revised by a board, a buyer, or a future management team.
Read them. Hold us to them.
AI was trained on everything that makes us human. Our data, our history, our ingenuity, our Wikipedia edits, our academic papers, our medical literature. None of the people or institutions that produced that material signed up to have their work become training data for a handful of US platforms that now sell it back to us by the seat. The asymmetry is the point, and the point is wrong.
Klai builds AI infrastructure Europe can use on its own terms. Hosted in the EU. Code you can read. Terms you can verify. Accessible to every European organisation, not just the ones with the right US procurement contracts.
In April 2026 Microsoft quietly reconfigured Copilot to route EU data to US, Canadian, and Australian servers during peak load. 50 million tenants affected. European administrators learned about it from an update buried in the admin message center. Data residency is not sovereignty.
Every major AI platform treats privacy as a configuration state. Tick this box, enable this mode, sign this addendum. For organisations handling client data, that's not how it works. You cannot build real work on a tool whose privacy is conditional on the right intern clicking the right checkbox.
Your data never leaves the EU. Your data is never used to train models. There is no mode to turn that off. It is not a policy. It is the architecture.
34.8% of what employees paste into ChatGPT is sensitive company data. In 2023 it was 11%. People know they shouldn't. They do it anyway because the approved tools are too slow or don't exist. The only working answer is an approved tool that works.
The entire internet runs on open source. So does every major AI company. Most of them take far more than they give back to the maintainers who keep the lights on. We refuse that. We stand on the shoulders of giants, and we don't pretend otherwise.
Every line of Klai code is open source. Read it, fork it, self-host it, audit it. A portion of every euro you pay Klai flows back to the open source projects we depend on. Not as charity. As what we owe.
Google gave $2 million to open source in 2024. Their revenue was $350 billion. Meanwhile the maintainer of core-js, a library that runs on more than half of the top 10,000 websites, earned around $400 a month before he went public about it. That gap is where "we contribute" stops being a credible claim.
Primary infrastructure should not be for sale. When banks, hospitals, and governments depend on a platform, that platform becomes part of the public fabric. A contract clause giving you an exit right is not the same as a legal structure that makes the acquisition itself impossible.
Klai is steward-owned. No shareholders. No exit strategy. No board with authority to sell. Written into our articles of association. No future management team, majority investor, or charismatic founder can approve a sale, because the mechanism for approving one does not exist in the documents.
In late 2025, US company Kyndryl moved to acquire Solvinity, the Dutch company that hosts DigiD. DigiD is used by 16.5 million Dutch citizens for 645 million authentications a year. The Dutch parliament intervened to block the deal, and its reason was specific: once a US entity holds the asset, US CLOUD Act jurisdiction applies, and Dutch contractual protections become unenforceable. Primary infrastructure cannot depend on terms that can be overridden by a change of ownership.
When you pay Klai, your money doesn't become someone else's dividend. It flows back into the infrastructure you use, into new features, into the open source projects we build on. You are funding a commons. In spirit and in code, you are a co-owner of what gets built.
Public roadmap. Transparent finances. Code you can fork. Your voice matters, and it shapes the product. Built by a team accustomed to serving communities.
Builder.ai became insolvent in May 2025, stranding its enterprise customers mid-contract. OpenAI has publicly complained that Microsoft "limited our ability to reach enterprise clients." Ownership structure shapes product structure. It shapes who the company is actually built to serve.
Four of these five principles are embedded in our legal structure today. The fifth, funding the commons, becomes operational as soon as we have revenue to share. We made them structural for a reason. A slogan can be rewritten in a weekend. A legal fact takes a very different kind of conversation.
If this matches what you're looking for in an AI infrastructure partner, talk to us. If it doesn't, at least now you know.