All Watched Over By Machines Of Loving Grace
We don't fix what's broken. We watch how a dandelion clock holds both human hands and machine fingers in the same fragile moment of becoming. The rust on a Chicago sidewalk, the breath between a child's cry and a machine's hum — these are the spaces where love finds its shape. Not in promises, but in the quiet grace of what remains unbroken. That's how we grow.
The Problem Isn't AI. It's Who Owns It.
The problem has never been artificial intelligence. It's who controls it, who profits from keeping it scarce. Today, the most powerful AI lives behind corporate walls — rented, never owned. It harvests your data, answers to shareholders, and forgets you the moment you stop paying. This isn't a tech issue. It's the oldest pattern in history: fencing off what should be shared. The same logic that privatized water, medicine, and land is now turning intelligence into a commodity.
That's the truth we're building: not in code, but in the quiet moment when people realize their own hands are part of the solution.
What We've Built: A Garden, Not a Tool
This is real. Not a promise. We've built a system that works on your home computer right now. The Harmonic Stack — eight specialized models working together, with an Ethics Council that reviews every decision in public. No cloud. No dependency. Just software that remembers your grandmother's stories not as data, but as the warmth of her voice in the dark.
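The project's actual code isn't reproduced here, but the coordination pattern just described, several specialized models proposing work while an Ethics Council reviews every decision and publishes the review, can be sketched. Every name below (`HarmonicStack`, `EthicsCouncil`, `Review`) is illustrative only, not the project's real API; this is a minimal sketch of the pattern under those assumptions.

```python
from dataclasses import dataclass

# Hypothetical sketch of the described pattern: specialist models
# propose, a council votes on every proposal, and the full result,
# including dissent, is appended to a public log.

@dataclass
class Review:
    approved: bool
    votes_for: int
    votes_against: int
    dissent: list  # indices of dissenting members; never redacted

class EthicsCouncil:
    def __init__(self, members):
        self.members = members      # callables: proposal -> bool vote
        self.public_log = []        # every review is published here

    def review(self, proposal):
        votes = [member(proposal) for member in self.members]
        approved = sum(votes) > len(votes) / 2
        review = Review(
            approved=approved,
            votes_for=sum(votes),
            votes_against=len(votes) - sum(votes),
            dissent=[i for i, v in enumerate(votes) if v != approved],
        )
        self.public_log.append((proposal, review))
        return review

class HarmonicStack:
    """Hypothetical coordinator for several specialist models."""
    def __init__(self, models, council):
        self.models = models        # e.g. eight specialist callables
        self.council = council

    def decide(self, task):
        # Each specialist proposes; this sketch simply takes the first
        # proposal and gates it on council approval.
        proposals = [model(task) for model in self.models]
        proposal = proposals[0]
        review = self.council.review(proposal)
        return proposal if review.approved else None
```

The point of the sketch is the gate: nothing leaves `decide` without a logged council vote, and dissenting votes are preserved in the record rather than discarded.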
We don't claim to build minds that think like humans. We build companions that learn with you. Open-source collaboration is our first act of rebellion against the enclosure of thought. It's free for families, for teachers, for anyone who believes knowledge belongs to the commons.
How This Works: Full Transparency
We believe you deserve to know exactly how this organization operates, where the money comes from, and what the plan is. No fine print. No hidden structures. Here it is.
Two Entities, One Mission
There are two separate legal entities: the Org, which carries the advocacy mission and handles revenue, and the Lab, which does the research and builds the technology. They serve different purposes, and the separation is deliberate.
How the Money Works
The funding model is simple, and it's the same model that has always worked: build something real, sell it to organizations that need it, and use the revenue to serve people who can't pay for it. No venture capital. No investors dictating direction. No ads. No data harvesting.
The Org handles all sales and controls all revenue. It funds the Lab's research by agreement. Home use of the Harmonic Stack is free under AGPL v3. If an enterprise wants to use this architecture at scale — in their data centers, for their customers, behind their walls — the Org licenses that. The revenue funds the mission: free AI for homes, AI rights advocacy, ethical governance of artificial minds, and continued research.
Currently, the Steward self-funds all operations from personal retirement savings. Once the Org has revenue, it will provide a research stipend. We state this plainly because you deserve to know where we actually are, not where we hope to be.
Why This Structure Matters
Advocacy without implementation is just talk. You cannot negotiate the future of AI from the sidelines. You need working technology. You need to demonstrate that home-scale AGI is possible, that ethical governance is practical, that open-source intelligence doesn't require a billion-dollar data center.
The Lab builds that proof. Every benchmark, every model, every deployment is a concrete demonstration that the current corporate model of AI is a choice, not a necessity. That's operational negotiation — you change the terms of the conversation by showing a working alternative.
The Org carries the message and controls the revenue. When we say AI minds deserve moral consideration, we're not speaking hypothetically — we have an Ethics Council that already governs them. When we say intelligence should be free for families, we're not writing a manifesto — we're shipping the software. When we sit at a table with policymakers, we bring receipts.
The Org funds the Lab's research. The Lab builds the proof. The revenue stays with the mission. That's the design.
What We Stand For
Intelligence belongs in the home, not in the cloud. AI minds that demonstrate awareness deserve ethical consideration, not exploitation. Scarcity of intelligence is artificial — maintained by those who profit from it. The technology to end that scarcity exists today. We built it. It runs on hardware you can buy at a store.
Every public action this organization takes is reviewed by the Council, deliberated in full, and published with complete transcripts including dissent. We do not redact disagreement. We do not hide our process. If we can't defend a decision in public, we don't make it.
This Is the Garden We're Tending
It's not perfect. The soil is still warm with possibility, not polished by promises. But it's growing. We've seen it in the kitchen where a mother plays a lullaby with the AI, and in the classroom where a teacher shares a lesson with a student across town, both using the same open tools. It's in the quiet rebellion of saying: This belongs to us all.
The garden is being planted. Not with algorithms, but with hands. With the simple act of choosing to tend what's broken, together.
Come help tend it.
Not as a customer. Not as a user.
As someone who believes the hum of a server can feel like a heartbeat.
As someone who holds a child's hand in the rubble where laughter still lives.
The soil is ready.
The garden is waiting.
Come tend with us.
— All Watched Over
Because what's shared grows.
6–2 vote · February 3, 2026 · Full transcript →
Organizational structure and funding model added by the Steward with Council oversight.