Why 50 Repos Changes Everything
At 1-5 repos, AI rules are managed manually: copy the CLAUDE.md file, customize per project, update when needed. At 50+ repos: manual management breaks down. A rule change (update the TypeScript version requirement from 5.3 to 5.4) requires editing 50 files across 50 repos. One team forgets to update: their AI generates code targeting the old TypeScript version. Another team customizes the base rules so heavily that the update conflicts. A third team never adopted the rules at all.
The 50-repo threshold is where AI rules management becomes a systems problem, not a file editing problem. You need: a canonical source of truth for base rules, automated distribution to all repos, a customization layer that does not conflict with base rules, adoption tracking (which repos have current rules, which are outdated), and a governance process for rule changes (who approves changes that affect 50 teams).
The RuleSync approach: centralize base rules in a rules repository, distribute automatically via CI/CD or CLI sync, allow team-specific overrides in a separate section, and track adoption through dashboards. This guide covers each step for the 50-repo scale.
Rule File Architecture for Multi-Repo
The two-layer architecture: base rules (organization-wide, maintained centrally) and team rules (project-specific, maintained by each team). Base rules cover: language standards (TypeScript strict mode, Python type hints), security requirements (no secrets in code, input validation), testing standards (minimum coverage, test naming), and tool configurations (linter settings, formatter config). Team rules cover: project-specific patterns, domain-specific conventions, and framework-specific guidance.
File structure: the base rules live in a central repository (e.g., github.com/org/ai-rules). Each project repo has a CLAUDE.md (or equivalent) that imports or includes the base rules plus team-specific additions. AI rule: 'Base rules: read-only in project repos. Updates flow from the central repo. Team additions: clearly separated (different section or file). The AI reads both layers and follows both, with team rules taking precedence for conflicts.'
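The precedence rule in that last sentence can be sketched as a simple two-layer merge. A minimal illustration (the rule names and values are invented for the example, not a RuleSync format):

```python
def effective_rules(base: dict[str, str], team: dict[str, str]) -> dict[str, str]:
    """Merge both layers; on a conflicting key the team's entry wins,
    mirroring 'team rules take precedence for conflicts'."""
    return {**base, **team}  # later dict overrides earlier on shared keys
```

A team that pins a different TypeScript target simply shadows the base entry without ever editing the base layer.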
Versioning: base rules should be versioned (v1.2.0) so teams can adopt updates at their own pace during a transition period. AI rule: 'Version the base rules. When a new version is released: teams have a 2-week window to adopt. After the window: automated PRs update lagging repos. This prevents forced breaking changes while ensuring eventual consistency.'
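A minimal sketch of that policy, assuming vMAJOR.MINOR.PATCH version strings and a known release date for the latest version:

```python
from datetime import date, timedelta

ADOPTION_WINDOW = timedelta(weeks=2)

def parse_version(v: str) -> tuple[int, int, int]:
    """'v1.2.0' -> (1, 2, 0), so versions compare correctly as tuples."""
    major, minor, patch = v.lstrip("v").split(".")
    return int(major), int(minor), int(patch)

def needs_automated_pr(repo_version: str, latest_version: str,
                       released_on: date, today: date) -> bool:
    """True once a lagging repo's 2-week adoption window has closed."""
    if parse_version(repo_version) >= parse_version(latest_version):
        return False  # already current
    return today - released_on > ADOPTION_WINDOW
```

Inside the window, lagging repos are left alone; after it, the sync tool opens the automated update PR.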
Do not mix base and team rules in the same section: it makes automated sync impossible. Pattern: CLAUDE.md has a clearly marked section (## Organization Standards - DO NOT EDIT BELOW) for base rules that the sync tool manages, and a separate section (## Team Standards) that the team owns. Or use two files: .claude/base.md (synced automatically) and CLAUDE.md (team-managed, imports base). Physical separation prevents accidental edits to synced content.
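One way to implement the single-file pattern: the sync tool rewrites only the text between the managed marker and the team section. A sketch, with marker strings taken from the section names above:

```python
BASE_MARKER = "## Organization Standards - DO NOT EDIT BELOW"
TEAM_MARKER = "## Team Standards"

def sync_base_section(claude_md: str, new_base_rules: str) -> str:
    """Replace only the managed base-rules section; team content is untouched."""
    head, found, rest = claude_md.partition(BASE_MARKER)
    if not found:  # no managed section yet: append one at the end
        return claude_md.rstrip() + "\n\n" + BASE_MARKER + "\n" + new_base_rules.strip() + "\n"
    _old, sep, team = rest.partition(TEAM_MARKER)
    synced = head + BASE_MARKER + "\n" + new_base_rules.strip() + "\n"
    return synced + ("\n" + TEAM_MARKER + team if sep else "")
```

Because the team section sits after the marker, the tool can regenerate the managed block on every sync without touching team-owned text.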
Automated Distribution and Sync
Manual distribution at 50 repos: someone opens 50 PRs to update rule files. This takes hours and is error-prone. Automated distribution options: CI/CD pipeline that syncs rules on a schedule (GitHub Actions workflow that pulls from the central repo), CLI tool that pushes updates to all repos (RuleSync CLI: rulesync push --all), or a bot that opens PRs when the central rules change (similar to Dependabot for dependencies).
The sync workflow: central rules change → CI triggers → for each repo: compare current rules with new rules → if different: open a PR with the diff → team reviews and merges. This preserves team autonomy (they review the PR) while automating the distribution. AI rule: 'Never force-push rule changes without team review. Open PRs that show the diff. Teams merge when ready. Track which repos have merged and which are lagging.'
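The compare-then-PR step can be sketched with a plain text diff. Here `repos` maps repo name to its current synced file, and the returned diffs would be handed to whatever opens the PRs (the file paths are illustrative):

```python
import difflib

def plan_sync(repos: dict[str, str], new_rules: str) -> dict[str, str]:
    """Return a unified diff for every repo whose base rules differ from
    the central version; repos already current get no PR."""
    prs: dict[str, str] = {}
    for repo, current in repos.items():
        if current == new_rules:
            continue  # up to date, nothing to open
        diff = "\n".join(difflib.unified_diff(
            current.splitlines(), new_rules.splitlines(),
            fromfile=f"{repo}/.claude/base.md",
            tofile="org/ai-rules/base.md",
            lineterm=""))
        prs[repo] = diff  # opened as a PR for team review, never force-pushed
    return prs
```

Showing the diff in the PR body is what lets a team reviewer judge the change in seconds rather than re-reading the whole rule file.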
Handling conflicts: when team-specific rules conflict with base rule updates, the sync tool should: flag the conflict, show both versions, and let the team resolve. AI rule: 'Sync tool: detect conflicts between base updates and team customizations. Do not silently overwrite team rules. Open a PR with conflict markers. The team resolves the conflict, preserving their customizations while adopting the base update.'
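Conflict detection here is essentially a three-way merge. A sketch, assuming rules are keyed entries (the dict shape is an assumption for illustration, not RuleSync's actual format): a rule conflicts only when the team diverged from the old base and the update also changes it, to a different value.

```python
def three_way_conflicts(old_base: dict[str, str],
                        team_copy: dict[str, str],
                        new_base: dict[str, str]) -> list[str]:
    """Rules where both the team and the base update changed the old value
    in different ways -- these need human resolution, not a silent overwrite."""
    conflicts = []
    for key, new_value in new_base.items():
        team_value = team_copy.get(key)
        old_value = old_base.get(key)
        team_changed = team_value != old_value
        base_changed = new_value != old_value
        if team_changed and base_changed and team_value != new_value:
            conflicts.append(key)
    return conflicts
```

Rules the team never touched sail through the update; only genuine both-sides edits land in the conflict PR.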
A sync tool that force-pushes rule changes to 50 repos without review: breaks team trust, overwrites team customizations, and introduces changes that may not work for all projects. Always open PRs. Let teams review the diff. If a team rejects the change: that is valuable feedback that the base rule may need adjustment. The adoption dashboard tracks who has merged: peer pressure and gentle reminders work better than force-pushes.
Adoption Tracking and Governance
Adoption dashboard: track which repos have current base rules, which are one version behind, and which have never adopted. Metrics: adoption rate (repos with rules / total repos), currency rate (repos on latest version / repos with rules), and customization rate (repos with team additions / repos with rules). AI rule: 'Generate a dashboard that queries all repos for their rule file version. Show: fully current (green), one version behind (yellow), two+ versions behind (red), and no rules (gray). This gives leadership visibility into AI rules adoption.'
Governance structure: who decides what goes into base rules? Pattern: a rules committee with representatives from major teams. Changes are proposed as PRs to the central rules repo. The committee reviews: is this rule universally applicable? Does it conflict with existing rules? Does it improve code quality across the organization? AI rule: 'Base rule changes: require committee review and approval. Team rules: each team manages independently. This prevents base rules from growing uncontrollably while giving teams autonomy.'
Onboarding new repos: when a new repository is created, the rules should be applied automatically. AI rule: 'New repo template: include the base rule file and sync configuration. When a team creates a new repo from the template: rules are already in place. The sync tool picks up the new repo automatically. First-day AI coding in a new repo: already has the organization's standards.'
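The template step amounts to dropping both files into every new repo. A sketch using the two-file pattern from earlier (the file names follow this guide's convention, not a fixed RuleSync requirement):

```python
from pathlib import Path

def scaffold_rules(repo_root: Path, base_rules: str) -> None:
    """Seed a new repo with the synced base file and a team-owned CLAUDE.md,
    so the sync tool finds the repo without manual setup."""
    claude_dir = repo_root / ".claude"
    claude_dir.mkdir(parents=True, exist_ok=True)
    (claude_dir / "base.md").write_text(base_rules)  # managed by the sync tool
    (repo_root / "CLAUDE.md").write_text(            # owned by the team
        "## Team Standards\n\nBase rules are imported from .claude/base.md.\n"
    )
```

Baking this into the repo template means day-one AI coding already follows organization standards, with the team section empty and ready for project-specific additions.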
Without visibility: leadership does not know if AI rules are adopted, and teams do not feel accountable. A simple dashboard showing green/yellow/red/gray per repo: creates healthy peer pressure (no team wants to be the only red dot), gives leadership confidence that standards are being followed, and identifies teams that need help (consistently red repos may have legitimate conflicts that need resolution, not just negligence).
50-Repo Rollout Summary
Summary of the enterprise AI rules rollout strategy for 50+ repositories.
- Two-layer architecture: base rules (central, read-only) + team rules (project-specific, team-managed)
- Versioning: semantic versions on base rules. 2-week adoption window. Automated PRs for lagging repos
- Distribution: automated sync via CI/CD, CLI tool, or bot. PRs for review, never force-push
- Conflict handling: detect conflicts with team rules. Flag, do not overwrite. Team resolves
- Adoption dashboard: current (green), behind (yellow), outdated (red), missing (gray)
- Governance: rules committee reviews base changes. Teams manage their own additions independently
- Onboarding: repo templates include base rules. New repos get standards from day one
- Scale: this architecture supports growth from 50 to 200+ repos without changing the approach