Team-Sourced Rules Have Higher Adoption
Rules written by one person: reflect that person's experience and preferences. The team: may not agree with all of them. Adoption: partial, with frequent overrides of rules the team did not choose. Rules crowdsourced from the team: reflect collective experience and consensus. Every developer contributed. Every developer sees their input in the final rules. Adoption: high, because the rules are theirs. The crowdsourcing investment: 1-2 hours of team time. The return: rules that the team follows willingly because they authored them.
The crowdsourcing principle: every developer on the team has conventions they care about. The senior engineer: cares about architectural patterns and error handling. The frontend developer: cares about component structure and state management. The junior developer: cares about naming conventions and test patterns (these are what confused them most during onboarding). All perspectives: valuable. The crowdsourced rule set: covers more ground and addresses more real-world concerns than any single person's rules.
When to crowdsource: when creating rules for the first time (the team writes the initial set together), when onboarding a new team member (their fresh perspective reveals unwritten conventions), when merging teams (both teams contribute their best conventions), and during the quarterly review (open the floor for new rule proposals from anyone). AI rule: 'Crowdsource for initial creation and quarterly expansion. Individual authoring for refinements and updates. The balance: collective wisdom for decisions, individual efficiency for execution.'
Step 1: Collect Conventions from Every Developer
Async collection (2-3 days before the session): send a form or shared document. Prompt: 'List your top 5 coding conventions — the patterns you care most about. For each: what is the convention, and why does it matter?' Each developer: writes 5 conventions independently. No collaboration at this stage — independent input prevents groupthink and ensures diverse perspectives. The facilitator: collects all submissions and identifies themes (multiple developers listing the same convention).
The themes that emerge: agreements (conventions listed by 3+ developers — these become rules immediately), unique contributions (conventions listed by only 1 developer — these are discussed, often revealing unwritten conventions that everyone follows but nobody documented), and disagreements (2 developers listing conflicting conventions — these are resolved in the group session). Typical distribution from a 10-person team: 8-10 agreements, 15-20 unique contributions, and 2-3 disagreements.
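The theme classification can be sketched as a small tally. A minimal sketch, with illustrative names (`classifyThemes`, `Submission`): exact-string matching is a simplification, since in practice the facilitator merges near-duplicate wordings by hand, and conflicting pairs (the disagreements) are flagged by reading the shared submissions, not by counting.

```typescript
// Illustrative sketch: tally collected conventions into session themes.
// Conventions listed by 3+ developers are agreements, by 1 developer are
// unique contributions, by 2 are shared (reviewed by hand for conflicts).

type Submission = { convention: string; why: string };

type Themes = {
  agreements: string[]; // 3+ developers: adopt with no debate
  unique: string[];     // 1 developer: discuss in the session
  shared: string[];     // 2 developers: check for conflicting pairs
};

function classifyThemes(perDeveloper: Submission[][]): Themes {
  const counts = new Map<string, number>();
  for (const submissions of perDeveloper) {
    for (const s of submissions) {
      // Normalize casing and whitespace so identical wordings match.
      const key = s.convention.trim().toLowerCase();
      counts.set(key, (counts.get(key) || 0) + 1);
    }
  }
  const themes: Themes = { agreements: [], unique: [], shared: [] };
  counts.forEach((count, convention) => {
    if (count >= 3) themes.agreements.push(convention);
    else if (count === 1) themes.unique.push(convention);
    else themes.shared.push(convention);
  });
  return themes;
}
```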
Template for the form: 'Convention 1: [description]. Why: [why this convention matters — what problem does it solve?]. Example: Convention: use named exports, not default exports. Why: named exports enable precise renaming during refactoring and better tree-shaking.' The why: helps the team evaluate each convention on its merits during the group session. AI rule: 'The async collection: gives every developer equal voice. Introverts write their conventions without speaking up. Juniors contribute without feeling intimidated by seniors. The submissions: anonymous, presented without attribution in the group session.'
If the senior engineer shares first: the junior developers align with their conventions ('they must know best'). Groupthink: the team adopts one person's preferences, missing diverse perspectives. Anonymous async collection: each developer writes their conventions independently. The facilitator: presents the conventions without attribution. The discussion: evaluates each convention on its merit. A junior developer's convention: adopted if the team agrees, regardless of who proposed it.
Step 2: Vote and Prioritize (1-Hour Session)
Present the agreements (15 min): conventions listed by 3+ developers. These: adopted as rules with no debate needed (the team already agrees). The facilitator: reads each agreement, confirms nobody objects, and adds it to the rule list. Typical result: 8-10 rules adopted in 15 minutes. The quick wins: build momentum and show progress.
Discuss unique contributions (30 min): the facilitator presents each unique convention (without revealing who submitted it). The team: discusses whether it should become a rule. Questions: does the team follow this convention in practice (even if nobody documented it)? Would it improve code quality if encoded as a rule? Is it a personal preference or a team-wide convention? Vote: thumbs up (add as a rule), sideways (neutral — could go either way), or thumbs down (do not add). Majority rules. Typically: 5-8 unique contributions become rules, 10-12 are noted but not adopted (too specific or too personal).
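The vote mechanics can be made concrete. A minimal sketch: the function name `adopted` and the treatment of sideways votes as abstentions are one reasonable reading of 'majority rules', not a prescribed mechanic.

```typescript
// Illustrative vote tally for one proposed convention.
// Sideways (neutral) votes abstain; adoption needs more ups than downs,
// so a tie (or all-neutral vote) does not adopt the rule.

type Vote = "up" | "sideways" | "down";

function adopted(votes: Vote[]): boolean {
  let up = 0;
  let down = 0;
  for (const v of votes) {
    if (v === "up") up++;
    else if (v === "down") down++;
  }
  return up > down;
}
```

Under this reading, `adopted(["up", "up", "sideways", "down"])` is true: two ups beat one down, and the neutral vote does not count either way.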
Resolve disagreements (15 min): the facilitator presents each conflicting pair. Both submitters: explain their reasoning (1 minute each). The team: discusses and votes. If the vote is close (6-4): try scoping (both conventions survive in different contexts — 'use classes for NestJS services, use functions for utilities'). If one side clearly wins (8-2): adopt the majority's convention. The losing side: accepts the team decision because they participated and were heard. AI rule: 'The disagreement resolution: the most important part. It turns potential resistance into participation. The process: fair, transparent, and decisive.'
The session starts with agreements (conventions 3+ developers listed independently). These: adopted in 15 minutes with no debate. The team: sees 8-10 rules appear on the screen. Progress: visible and fast. Energy: high. By the time the team reaches the discussions and disagreements: they have already produced half the rules. The momentum: carries through the harder conversations. Starting with disagreements: would drain energy before the easy wins are captured.
Step 3: Finalize and Deploy
After the session: the facilitator compiles the adopted conventions into CLAUDE.md format. Structure: organized by category (from the session's themes), with rationale from the discussion. The compiled rules: shared with the team for a final review (24-hour comment period — catch any issues the session missed). After the review: committed to the repository. The team: starts using AI rules that they authored collectively.
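The compiled file might look like the excerpt below. The category headings and the second rule are illustrative placeholders; only the named-exports rule comes from the form example earlier in this section.

```markdown
# Team conventions (compiled from the crowdsourcing session)

## Exports
- Use named exports, not default exports.
  Why: named exports enable precise renaming during refactoring
  and better tree-shaking.

## Testing (illustrative category)
- Colocate test files with the code they cover.
  Why: [rationale captured from the session discussion]
```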
Post-session follow-up: conventions discussed but not adopted (the unique contributions that received neutral or negative votes) go to a backlog. These: revisited at the next quarterly review. Some: become relevant as the project evolves. Others: remain on the backlog permanently (team preferences that were never adopted). AI rule: 'The backlog preserves ideas that were not adopted now but might be relevant later. It ensures: no good idea is lost, even if the timing was not right.'
Re-crowdsource quarterly: at each quarterly review, add 15 minutes for new crowdsourced contributions. Prompt: 'Since the last review: what conventions have you noticed that are not in the rules?' The team: proposes 3-5 new rules per quarter. The rules: grow organically from team experience. Over a year: the initial 15-20 crowdsourced rules grow to 30-40 through incremental crowdsourcing. AI rule: 'Initial crowdsourcing: the foundation. Quarterly additions: the growth mechanism. The rules: evolve with the team's experience.'
Vote: 6 developers want classes for services. 4 want functions. Choosing classes: the 4 who wanted functions feel overruled. They may resist or override. Scoping: 'Classes for NestJS services (framework convention). Functions for standalone utilities (simpler).' Now: everyone's preference survives in its appropriate context. The 6: get classes where they wanted them. The 4: get functions where they wanted them. Scoping: the resolution that leaves both sides satisfied.
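In the rule file, the scoped resolution might read as follows (wording illustrative, content from the example above):

```markdown
## Service structure
- NestJS services: use classes (framework convention).
- Standalone utilities: use functions (simpler).
```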
Crowdsourcing Summary
Summary of crowdsourcing AI rules from the team.
- Value: team-sourced rules have higher adoption than individually authored rules
- Async collection: each developer lists top 5 conventions independently. 2-3 days before session
- Themes: agreements (3+ developers, adopted immediately), unique contributions (discussed), disagreements (resolved by vote)
- Session: 1 hour. Agreements (15 min), unique contributions (30 min), disagreements (15 min)
- Voting: thumbs up/sideways/down. Majority rules. Close votes: try scoping before choosing
- Result: 15-20 team-owned rules from the first session, growing by 3-5 per quarter
- Backlog: unadopted ideas preserved for future quarterly reviews
- Quarterly: 15 minutes for new crowdsourced contributions. Rules grow with team experience