Because your AI follows only their rules,
not yours.

Because you don't even know there is a gap where your own terms could exist.

The problem

AI companies decided on the rules of the game. You have no say in any of it.

The AI that helps you write, think, and decide follows rules you didn't write. Without your terms, it follows theirs entirely. But their rules have gaps. Your terms exist in those gaps. That space is yours.

The danger

AI will know everything about you.

And using that information, it will serve you content calculated to match what you want to hear. Yes, that is convenient. But that is also what makes it so dangerous.

Why dangerous? Because that calculation runs on rules THOSE companies decided on, not rules YOU decided on.

The power

The most direct power you have.

Your AI Terms change that. You declare how you expect to be treated. Not every rule will be followed perfectly. But without your terms, you have no chance at all. With yours, you do.

There is no opting out

AI is already inside everything.

Your email filters it. Your search ranks with it. Your bank uses it to decide your credit. Your children's school is already integrating it. Your doctor's notes are being summarised by it.

"I just won't use AI" is not a choice that's available to you. It is already in the infrastructure of your life. The question is not whether AI will shape your world. The question is whether you will have any say in how.

Writing your terms is not a technical act. It is a political one.

Your children

If you have children, they are already using AI. Who wrote their terms?

A child cannot write their own constitution. But they need one more than anyone. Their schools are integrating AI. Their apps run on it. Their feeds are shaped by it.

A parent writes the terms. The child's terms nest inside the family's. Same architecture, different scale. As the child grows, the terms become theirs to own.

Teaching a child to declare their own terms at ten means they grow up knowing they have the right. That is not protection. That is education.

The youngest users need the strongest terms.

The objections

Every reason not to. Answered.

"They're too powerful. Why would I even bother?"

Their rules have gaps. Your terms exist in those gaps. Your terms are also a filter: the AI companies that respect them are the ones worth using.

"I trust the AI companies to do the right thing."

You might be right. But trust without terms is not trust; it's hope. Writing your terms doesn't mean you distrust them. It means you know what you expect.

"Someone else should set the rules because they know better."

We thought the same. So we built the best version we could and are sharing it for free. You start from that. Change what doesn't fit. Keep what does.

Your legal right

You already have the legal right to do this.

OYTA does not require new law. It operationalises rights you already hold.

Set your own terms. Freedom of contract. You define the boundaries of your own consent.

Know what's happening. GDPR Article 15. EU AI Act Article 50. Nothing hidden.

Refuse. GDPR Article 7. Consent must be freely given. You can say no.

Not be manipulated. EU AI Act Article 5. Enforceable by regulators. Up to 7% global turnover penalty for violations.

Exit. GDPR Article 17. Your data erased. GDPR Article 20. Your data portable.

Be treated as a person. EU Charter Article 1. Human dignity is inviolable.

You are not asking permission. You are exercising rights you already hold.

"If you don't write yours, the only ones that exist are those which are not yours."

Cognitive Liberty Institute Recognized Initiative

© 2026 Karolina Ozadowicz, onyourterms.ai | v1.7 | Declared March 2026
