SemVer: Communicating Change Significance
Semantic versioning (SemVer) for AI rules works the same way as SemVer for software: the version number (MAJOR.MINOR.PATCH) communicates the significance of the change. Developers read the version and know:
- Patch (v2.4.1 → v2.4.2): safe to update — text fixes, typo corrections, wording clarifications. No AI behavior change.
- Minor (v2.4.2 → v2.5.0): review the changelog — new rules added, existing rules updated. AI behavior changes.
- Major (v2.5.0 → v3.0.0): read carefully — rules removed, fundamental patterns changed. Breaking changes that require developer attention.
Why version rules: without versioning, rule updates are: undifferentiated (every change looks the same), uncontrollable (teams cannot choose when to adopt), and untraceable (which version is this repo using? When was it last updated?). With versioning: the significance is encoded in the version number, teams can pin to a specific version (stability) or follow latest (freshness), and every repo's rule version is visible and trackable.
The versioning rule of thumb: if you deployed this change and nobody noticed (text fix): patch. If you deployed this change and the AI generates slightly different code (new rule, updated convention): minor. If you deployed this change and existing code patterns would need to change (rule removed, pattern fundamentally changed): major. AI rule: 'The test: what would a developer notice after this change? Nothing (patch). Different AI output (minor). Their existing code needs updating (major).'
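The rule of thumb above can be sketched as a small decision function. This is a minimal illustration only — the type and function names are hypothetical, not part of any tool:

```typescript
type Bump = "patch" | "minor" | "major";

interface RuleChange {
  affectsAiOutput: boolean;        // would the AI generate different code?
  removesOrAltersPattern: boolean; // is a rule removed or fundamentally changed?
}

// Apply the test: nothing noticed → patch, different AI output → minor,
// existing code needs updating → major.
function decideBump(change: RuleChange): Bump {
  if (change.removesOrAltersPattern) return "major";
  if (change.affectsAiOutput) return "minor";
  return "patch";
}
```

The order of the checks matters: a change that removes a rule also affects AI output, so the major condition must be tested first.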
Step 1: Patch, Minor, and Major Examples
Patch examples (v2.4.0 → v2.4.1): fix a typo in a rule ('funciton' → 'function'), clarify vague wording ('handle errors' → 'handle errors with the Result pattern' — the pattern was already implied), update a documentation link that was broken, and add a missing comma in a code example. These changes: improve the rule file's quality without changing what the AI generates. Any developer can update without reviewing the changelog.
Minor examples (v2.4.1 → v2.5.0): add a new rule ('all API endpoints must validate query parameters with Zod'), update an existing rule's scope ('the error handling rule now also applies to middleware functions'), change a library reference ('update from lodash to native alternatives'), and add a new section to the rules ('added ## Database Conventions section with 5 new rules'). These changes: cause the AI to generate different code. Developers should review the changelog before updating.
Major examples (v2.5.0 → v3.0.0): remove a rule ('removed the class component rule — all React code is now functional only'), change the fundamental error handling pattern ('switched from try-catch to Result pattern for all business logic'), change the testing framework ('switched from Jest to Vitest — all test imports and APIs change'), and restructure the entire rules file ('reorganized from flat list to hierarchical sections'). These changes: may require existing code to be updated. Developers must review the changelog and plan the migration.
Without versioning: 'We updated the rules. Some rules changed. You should review before updating.' The developer: does not know the significance. Are the changes safe? Should they block their sprint to review? With versioning: 'v2.4.2' — the developer instantly knows: patch, safe, no review needed. 'v2.5.0' — minor, review the changelog. 'v3.0.0' — major, plan migration time. The version number: replaces communication overhead with a universally understood signal.
Step 2: The Versioning Workflow
Before making a rule change: determine the version bump. Ask: does this change affect AI output? (Yes → at least minor. No → patch.) Does this change remove or fundamentally alter an existing rule? (Yes → major. No → minor or patch.) The answer: determines the new version number. Write the version number in the changelog entry before making the change.
Version header in the rules file: add a version header at the top of CLAUDE.md. '# AI Rules v2.5.0 (2026-03-29)\nChangelog: CHANGELOG.md'. The version: visible to the AI (it reads the file), visible to developers (they see the version when they open the file), and visible to tooling (scripts can parse the version for compliance tracking). Update the header: as part of every rule change commit.
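To make the tooling point concrete: a compliance script could read the version out of the header's first line with a simple regex. This is a sketch assuming the exact header format shown above; the names are illustrative:

```typescript
interface RuleVersion {
  major: number;
  minor: number;
  patch: number;
  date: string; // release date from the header, e.g. "2026-03-29"
}

// Parse a header line like "# AI Rules v2.5.0 (2026-03-29)".
// Returns null if the line does not match the expected format.
function parseVersionHeader(firstLine: string): RuleVersion | null {
  const m = firstLine.match(/v(\d+)\.(\d+)\.(\d+)\s+\((\d{4}-\d{2}-\d{2})\)/);
  if (!m) return null;
  return { major: +m[1], minor: +m[2], patch: +m[3], date: m[4] };
}
```

A compliance job could run this across every repo's CLAUDE.md and report which versions are in use.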
Git tags for releases: tag each version in git. git tag v2.5.0 && git push --tags. The tag: marks the exact commit for each version. Teams that pin to a specific version: reference the tag. Rollback: checkout the previous tag. The tag history: a complete record of every rule version. AI rule: 'Version header in the file + git tag for the commit + changelog entry for the description = complete version traceability.'
The rule: a 'patch' should not change AI behavior. If a 'patch' adds a new rule: developers who auto-accept patches are surprised by different AI output. Trust in the versioning system: eroded. 'They said it was a patch but my AI generates different code.' The discipline: never label a behavior-changing update as a patch. If in doubt between patch and minor: choose minor. Under-labeling: damages trust. Over-labeling: at worst, causes an unnecessary changelog review.
Step 3: Controlled Adoption with Versions
Teams control their update pace: a team can: follow latest (always on the newest version — for teams that want the latest improvements immediately), pin to a minor version (v2.4.x — accept patches automatically, adopt minor versions manually after review), or pin to a major version (v2.x — accept patches and minors, adopt major versions only during planned migration windows). The adoption strategy: each team's choice, based on their risk tolerance and update capacity.
The adoption window: when a new minor or major version is released, teams have a defined window to adopt. Minor: 2-week adoption window (update within 2 weeks of release). Major: 1-month adoption window (update within 1 month, allowing time for migration). After the window: the compliance dashboard flags non-updated repos. The window: balances freshness (rules should be current) with stability (teams need time to adopt).
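A minimal sketch of how a compliance dashboard might flag overdue repos, assuming the windows above (2 weeks for minor, 1 month approximated as 30 days); the function and constant names are illustrative:

```typescript
// Adoption windows in days: minor = 2 weeks, major = 1 month (~30 days).
const WINDOW_DAYS: Record<"minor" | "major", number> = { minor: 14, major: 30 };

// A repo is overdue when more days have elapsed since the release
// than its bump type's adoption window allows.
function isOverdue(bump: "minor" | "major", releasedAt: Date, now: Date): boolean {
  const elapsedDays = (now.getTime() - releasedAt.getTime()) / 86_400_000;
  return elapsedDays > WINDOW_DAYS[bump];
}
```

Patches need no entry here: they are safe to auto-update, so there is nothing to flag.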
Version pinning in practice: for npm-distributed rules: the package.json version range controls the pin. '@org/ai-rules': '^2.4.0' (accepts patches and minors within v2.x). '@org/ai-rules': '~2.4.0' (accepts patches within v2.4.x). '@org/ai-rules': '2.4.0' (exact version — no automatic updates). For RuleSync: project settings → version pinning (latest, minor, or exact). AI rule: 'Version pinning: gives teams control. Latest for early adopters. Exact for stability-critical projects. Minor for the balance between freshness and stability.'
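The three range styles can be illustrated with a simplified satisfaction check. This is a sketch for plain x.y.z versions only — real npm resolution uses the full semver grammar (prerelease tags, the ^0.x special case, compound ranges), which this deliberately ignores:

```typescript
// Does `version` satisfy `range`? Supports only "^x.y.z", "~x.y.z", and "x.y.z".
function satisfies(version: string, range: string): boolean {
  const [vM, vm, vp] = version.split(".").map(Number);
  const [rM, rm, rp] = range.replace(/^[\^~]/, "").split(".").map(Number);
  const atLeast = vM > rM || (vM === rM && (vm > rm || (vm === rm && vp >= rp)));
  if (range.startsWith("^")) return vM === rM && atLeast;              // same major
  if (range.startsWith("~")) return vM === rM && vm === rm && atLeast; // same minor
  return vM === rM && vm === rm && vp === rp;                          // exact pin
}
```

The three branches map directly to the three adoption strategies: caret accepts patches and minors, tilde accepts only patches, and an exact pin accepts nothing automatically.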
Team A: follows latest. They want every improvement immediately. Risk tolerance: high. Team B: pins to minor (~2.4.0). They accept patches automatically but review minors before adopting. Risk tolerance: moderate. Team C: pins exact (2.4.0). They update only during planned maintenance windows. Risk tolerance: low. All three: valid strategies. Versioning: enables each team to adopt at their own pace. Without versioning: everyone gets every update immediately (no control) or nothing (no updates).
SemVer Summary
Summary of versioning AI rules with semantic versioning.
- Patch (x.y.Z): text fixes, typo corrections. No AI behavior change. Safe to auto-update
- Minor (x.Y.0): new rules, updated rules. AI behavior changes. Review changelog before updating
- Major (X.0.0): rules removed, patterns fundamentally changed. Plan migration before updating
- The test: nothing noticed (patch), different AI output (minor), existing code needs updating (major)
- Version header: top of CLAUDE.md with version number and date. Visible to AI, developers, and tooling
- Git tags: mark each version commit. Enable pinning, rollback, and version comparison
- Adoption window: minor = 2 weeks, major = 1 month. Flagged after window expires
- Pinning: latest (all updates), minor range (patches auto, minor manual), exact (no auto updates)