When Teams Merge, Rules Must Merge Too
Merge scenarios: team reorganization (Team A and Team B merge into one team — their rules must combine), monorepo migration (three separate repos with three CLAUDE.md files become one monorepo), ruleset consolidation (the organization has 15 technology rulesets with significant overlap — consolidate to 5), and acquisition integration (the acquired company's rules merge with the parent company's rules). Each scenario: requires combining rules from multiple sources without losing conventions or creating conflicts.
The merge challenge: Team A uses try-catch for error handling. Team B uses the Result pattern. The two approaches cannot coexist unscoped in the same rule file (the AI would receive contradictory instructions and apply them inconsistently). The merge: must choose one approach (or scope both — try-catch for Express routes, Result pattern for service functions). This conflict resolution: the most important part of the merge process. The rules: cannot be blindly concatenated.
The merge principle: the merged ruleset should be an improvement over any individual source, not a lowest-common-denominator compromise. If Team A has a great testing convention and Team B has a great error handling convention: the merged ruleset should have both. The merge: is an opportunity to create the best possible ruleset from the combined expertise of both teams.
Step 1: Inventory and Overlap Analysis (20 Minutes)
Create a side-by-side inventory: list every rule from each source, categorized by topic (naming, error handling, testing, security, etc.). For each topic: identify agreements (both sources have the same or compatible rules), differences (the sources address the topic differently), gaps (one source has a rule the other does not), and conflicts (the sources have contradictory rules for the same topic). The inventory: a spreadsheet or table with columns for each source and rows for each topic.
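The inventory lends itself to a small script. A minimal TypeScript sketch (the topic names, rule texts, and `classify` helper are illustrative, not from any real rule file):

```typescript
// One row per topic; each column holds that source's rule text, if any.
type InventoryEntry = { teamA?: string; teamB?: string };
type Category = "agreement" | "gap" | "difference" | "conflict";

// Naive first pass: identical text is an agreement, a missing side is a gap.
// Telling a benign difference from a true conflict requires human judgment,
// so everything non-identical is flagged as "difference" for review here.
function classify(entry: InventoryEntry): Category {
  if (entry.teamA !== undefined && entry.teamB !== undefined) {
    return entry.teamA === entry.teamB ? "agreement" : "difference";
  }
  return "gap";
}

const inventory: Record<string, InventoryEntry> = {
  naming: { teamA: "camelCase", teamB: "camelCase" },
  testing: { teamA: "Vitest, colocated *.test.ts files" },
  errorHandling: { teamA: "try-catch", teamB: "Result pattern" },
};

for (const [topic, entry] of Object.entries(inventory)) {
  console.log(`${topic}: ${classify(entry)}`);
}
```

The script only triages. The "difference" rows still go to the human reviewers, who decide which are benign and which are the true conflicts.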
The typical distribution: 60% agreements (both teams follow the same conventions — merge is trivial). 25% gaps (one team has rules the other does not — adopt the rule if it improves the merged set). 10% differences (different approaches to the same topic — evaluate and choose the better one). 5% true conflicts (contradictory rules — requires discussion and resolution). AI rule: 'Most rules merge without conflict. Focus effort on the 5% that truly conflict. The rest: straightforward.'
Tools for the analysis: diff the two rule files (fast, but it only surfaces textual differences; semantically equivalent rules worded differently still show as diffs). Ask the AI to compare: 'Here are two rule files. Identify: agreements, differences, gaps, and conflicts.' The AI: efficient at comparing structured text and identifying overlaps. The AI's analysis: a starting point that the human reviewers refine. AI rule: 'Use the AI to do the initial comparison. It identifies overlaps and conflicts faster than manual reading. Human review: refines the AI's analysis and resolves the true conflicts.'
Example: both teams already use TypeScript strict mode, camelCase naming, Vitest for testing, and no secrets in code. These agreements: 60% of the typical rule inventory. They merge trivially: copy once into the merged file. No discussion needed. No conflict resolution. The merge effort: reserved for the rules that need evaluation, above all the 5% that truly conflict. Do not spend time on the 60% that already agree — they are free wins.
Step 2: Resolving Conflicts and Choosing Approaches
For each conflict: evaluate both approaches. Which produces better AI output? Which has better rationale? Which is more widely adopted in the team? Which aligns better with the organization's direction? Example: Team A uses try-catch. Team B uses Result pattern. Evaluation: the Result pattern is more explicit (errors are return values, not thrown exceptions), aligns with the org's functional programming direction, and was adopted more recently (represents newer thinking). Decision: adopt the Result pattern for the merged ruleset. Document: the rationale in the merged rule.
Involve both teams: conflicts should be resolved by representatives from both teams (the tech leads or senior engineers). Not by one team dictating to the other. The resolution session: each side presents their approach and rationale. The group: evaluates on merit. The decision: documented. Both teams: accept the decision because they participated. AI rule: 'Conflict resolution: collaborative. Both teams present their approach. The group decides on merit. Nobody is overruled without being heard.'
Scoping as resolution: some conflicts can be resolved by scoping rather than choosing. Team A's try-catch: appropriate for Express middleware (framework requirement). Team B's Result pattern: appropriate for service functions (composability requirement). Scoped resolution: 'Express middleware: try-catch with next(error). Service and repository functions: Result pattern.' Both approaches: survive, each in its appropriate scope. No team loses their convention. AI rule: 'Scoping: the win-win resolution. Both approaches survive in their appropriate contexts. Nobody loses. Evaluate scoping before choosing.'
Anti-pattern: Team A's tech lead writes the merged rules unilaterally: 'We are keeping all of Team A's conventions. Team B adopts our standards.' Team B: resists. They feel their conventions (which worked well for them) are being dismissed. Adoption: reluctant. Quality: lower than either team's individual rules because Team B does not believe in the conventions. Fix: collaborative resolution. Both teams present. Both evaluate on merit. The decision: collective. Both teams: invested in the outcome.
Step 3: Execute the Merge and Validate
Write the merged ruleset: start with the agreements (copy directly — both teams already follow these). Add the adopted gaps (rules from one team that improve the merged set). Apply the conflict resolutions (the chosen approach or the scoped resolution). Structure: organize the merged rules into clear sections with the same hierarchy as the individual sources. The merged file: should be an improvement over both sources — more comprehensive, better organized, and conflict-free.
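As a sketch of what the merged file's structure might look like (section names, wording, and version number are illustrative):

```markdown
# Merged Engineering Rules (v3.0.0)

## Naming (agreement: both teams)
- camelCase for variables and functions

## Error Handling (conflict: resolved by scoping)
- Express middleware: try-catch with next(error) (framework requirement)
- Service and repository functions: Result pattern (composability)
- Rationale: Express routes errors through next(); services favor explicit
  error values. Decided jointly by both tech leads.

## Testing (gap: adopted from Team A)
- Vitest; tests colocated as *.test.ts
```

Note how each section records where the rule came from and, for resolved conflicts, why the decision went the way it did.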
Validate with test prompts: run the 5-prompt benchmark suite against the merged rules. Verify: the AI generates correct code for conventions from Team A, correct code for conventions from Team B, and correct code for the conflict resolutions (scoped patterns applied to the right contexts). If any prompt produces incorrect output: the merge introduced an issue. Investigate and fix before deploying.
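Part of that validation can be automated with a crude lint over the AI's output. A hypothetical TypeScript sketch (the path convention and regexes are illustrative heuristics, not a substitute for reviewing the benchmark outputs by hand):

```typescript
// Flags obvious violations of the scoped error-handling resolution:
// service files should use the Result pattern, not try-catch.
function violatesScopedRule(filePath: string, code: string): boolean {
  const isServiceFile = /\/services\//.test(filePath);
  return isServiceFile && /\btry\s*\{/.test(code);
}

console.log(violatesScopedRule("src/services/user.ts", "try { save(); } catch (e) {}")); // true
console.log(violatesScopedRule("src/middleware/auth.ts", "try { check(); } catch (e) { next(e); }")); // false
```

A check like this catches regressions cheaply on every revision of the ruleset, but the judgment calls (is the generated Result usage idiomatic?) remain with the reviewers.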
Deploy and communicate: deploy the merged ruleset to all projects. Communicate: 'The rules from Team A and Team B have been merged into a single ruleset. Key changes: [list the conflict resolutions and new additions]. The merged rules: take effect on your next rulesync pull.' The communication: ensures nobody is surprised. The merged rules: deployed as a new version (not as an update to either team's existing ruleset — it is a new artifact). AI rule: 'The merged ruleset: a new version, not a patch to either source. Deploy as a major version (v3.0.0) to signal the significance of the change.'
Team A: try-catch. Team B: Result pattern. Choosing one: the other team loses their preferred approach. Scoping: 'Express middleware: try-catch (framework requirement). Service functions: Result pattern (composability requirement).' Both approaches: survive in their appropriate contexts. Team A: keeps try-catch where they need it. Team B: keeps Result pattern where they need it. Nobody loses. The merged ruleset: more precise than either individual ruleset (it specifies which pattern for which context).
Ruleset Merge Summary
Summary of merging multiple team AI rulesets.
- Scenarios: team reorg, monorepo migration, ruleset consolidation, acquisition integration
- Principle: the merged ruleset should be better than any individual source, not a compromise
- Inventory: side-by-side comparison. 60% agreements, 25% gaps, 10% differences, 5% conflicts
- AI-assisted: ask the AI to compare rule files. Human review refines the analysis
- Conflict resolution: evaluate on merit. Involve both teams. Scoping often beats choosing
- Scoping: both approaches survive in appropriate contexts. The win-win resolution
- Validation: 5-prompt benchmark. Verify AI generates correct patterns from both source teams
- Deploy: as a new major version. Communicate the key changes and conflict resolutions