Mission
The Heart AI Foundation exists to make AI governance measurable. It develops public standards, certification methods, and evidence practices so AI systems can be evaluated on behavior rather than promises.
Why we exist
AI governance fails when the system being evaluated controls the evidence about its own compliance. The Foundation exists to remove that conflict by publishing external standards, independent scoring methods, and assessment procedures that can be inspected by third parties.
What we commit to
- Publish the HEART Standard and the supporting technical vocabulary.
- Maintain independent certification practices through Guardians.
- Maintain the GTE open trust infrastructure as a framework-agnostic contribution to AI governance.
- Keep the measurement model separate from the product or platform being assessed.
- Support research that can replicate, challenge, or refine the public specifications.
- Design operations to resist capture of every kind: commercial, regulatory, ideological, by founders, or by the organization itself.
What we do not do
- We do not ask systems to certify themselves.
- We do not sell the standard as a closed proprietary product.
- We do not conflate open research with certified compliance.