Different Motivations for Writing Rules
Solo developers write AI rules for: their future self ("I always forget which ORM pattern this project uses — the rule file reminds me and the AI"), consistency across sessions (the AI generates the same patterns whether you coded yesterday or three months ago), and reducing decision fatigue (the rule file pre-decides conventions so you can focus on the problem, not the style). The audience is: you and your AI. The rules are: personal conventions codified as AI instructions. Nobody reviews them. Nobody needs to agree with them.
Teams write AI rules for: cross-developer consistency ("every developer's AI generates the same patterns, regardless of who is coding"), onboarding speed (new developers read the rule file and immediately generate convention-matching code), code review efficiency (the AI follows the conventions, so reviews focus on logic, not style), and quality assurance (CI enforces rule compliance, so convention violations do not reach the codebase). The audience is: every developer on the team and their AI tools. The rules are: team agreements codified as shared instructions.
The motivation shapes the content: solo rules are personal and pragmatic ("I prefer async/await" — no justification needed). Team rules are collaborative and justified ("We use async/await because: consistent error handling, better stack traces, and most team members are more fluent with it" — the reasoning helps new members understand and accept the convention). Solo rules are: shorter (only you need to understand them). Team rules are: more explicit (they must be understood by people who did not write them).
Solo Developer Rules: AI Memory for Personal Projects
A solo developer CLAUDE.md contains: the technology stack ("Next.js 16, TypeScript, Tailwind, Drizzle, Neon" — the AI remembers which versions and libraries this project uses), personal patterns ("I use Zustand for state. TanStack Query for data fetching. React Hook Form for complex forms." — the AI follows your preferences without you re-explaining every session), project-specific context ("This is a SaaS for managing AI coding rules. The main entities are: users, rulesets, projects, and API keys." — domain context helps the AI generate relevant code), and quick reminders ("Use pnpm, not npm. Use Vitest, not Jest. The database is Neon serverless via @neondatabase/serverless." — prevents the AI from guessing wrong).
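Assembled into one file, those pieces might look like the following minimal sketch (the stack, libraries, and entities are taken from the examples above; the section headings are illustrative):

```markdown
# CLAUDE.md — solo project

## Stack
- Next.js 16, TypeScript, Tailwind
- Drizzle ORM on Neon serverless (@neondatabase/serverless)
- pnpm, not npm. Vitest, not Jest.

## Patterns
- Zustand for state
- TanStack Query for data fetching
- React Hook Form for complex forms

## Domain
- SaaS for managing AI coding rules
- Main entities: users, rulesets, projects, API keys
```

Note how informal this can be: short fragments, no rationale, no exceptions. Only you and your AI need to parse it.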
Solo rules are often: less formal (you write for yourself, not for a PR review), more context-heavy (you include project-specific knowledge that a team file might keep in documentation), and faster to evolve (you change a convention, you update the file, no approval needed). The solo CLAUDE.md is: a living document that evolves with your project. Change a library: update the rule. Discover a better pattern: update the rule. The file is: your project memory + your AI instructions in one document.
The solo advantage: zero governance overhead. Every rule change is: immediate (no PR, no review, no approval). Every convention is: your choice (no compromise, no committee). The file reflects: your current thinking, updated in real-time. The disadvantage: if you collaborate with anyone in the future (a freelancer, a contributor, a co-founder), the rules need to evolve from personal conventions to team agreements. The transition is: easier if the solo rules are already well-written.
- Content: stack + personal patterns + project context + quick reminders
- Audience: you and your AI. No justification needed for conventions
- Evolves: change a library, update the file immediately. Zero governance
- Project memory: the AI remembers your stack and conventions across sessions
- Transition risk: solo rules need evolution to team rules if collaborators join
Solo CLAUDE.md: 'Next.js 16, Drizzle, Neon, pnpm. Users have rulesets, rulesets have versions.' The AI remembers your stack and domain across sessions. Three months later: open the project, the AI generates code matching YOUR conventions, not generic patterns. The rule file is your project memory.
Team Rules: Shared Standards for Consistency
A team CLAUDE.md contains: agreed conventions ("We use async/await for all asynchronous code. Exception: legacy callback APIs that require .then()" — the convention + the documented exception), rationale for non-obvious choices ("We chose Drizzle over Prisma because: edge compatibility, TypeScript-first schema, and SQL transparency" — helps new members understand why, not just what), enforcement expectations ("These rules are enforced by CI. PRs that violate conventions will be flagged by linting and rule checks" — sets expectations for the team), and contribution guidelines ("To propose a rule change: open a PR to CLAUDE.md with the proposed change and rationale" — the governance process).
Team rules need: explicit language ("use X" not "I prefer X" — the impersonal voice signals team agreement), exception documentation ("unless Y" — without documented exceptions, developers argue about edge cases), and stability (rules do not change weekly — the team needs time to internalize and follow conventions before they change). Team rules are: a social contract between developers, codified as AI instructions. Everyone agreed to follow them. The AI enforces them consistently.
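A hedged sketch of how those elements might read together in a team CLAUDE.md (the rationale, exceptions, and governance wording are drawn from the examples above; the headings are illustrative):

```markdown
# CLAUDE.md — team conventions

## Async code
Use async/await for all asynchronous code.
Exception: legacy callback APIs that require .then().

## ORM
Use Drizzle. Rationale: edge compatibility, TypeScript-first
schema, SQL transparency.

## Enforcement
These rules are enforced by CI. PRs that violate conventions
will be flagged by linting and rule checks.

## Changing a rule
Open a PR against CLAUDE.md with the proposed change and
its rationale.
```

Every convention uses the impersonal voice, documents its exception, and the governance section tells a new developer how to challenge a rule rather than silently ignore it.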
The team advantage: onboarding speed. A new developer clones the repo, reads CLAUDE.md (5 minutes), and their AI generates convention-matching code from the first interaction. Without the file: the new developer generates generic patterns, receives convention feedback in code review (days of iteration), and spends weeks learning the team's unwritten conventions. The rule file is: the fastest onboarding tool for AI-assisted development. Five minutes of reading replaces weeks of osmotic learning.
- Content: agreed conventions + rationale + enforcement expectations + contribution process
- Language: impersonal ("use X" not "I prefer"), with documented exceptions ("unless Y")
- Stability: rules do not change weekly — team needs time to internalize before changes
- Onboarding: new developer reads CLAUDE.md in 5 minutes, AI generates matching code immediately
- Social contract: team agreed to follow. AI enforces. PRs verify. CI catches violations
New team member: clones repo, reads CLAUDE.md (5 minutes), their AI generates convention-matching code from the first interaction. Without the file: generic patterns, convention feedback in code review, weeks of osmotic learning. The rule file is the fastest onboarding tool for AI-assisted development.
Content Differences: What Each Includes
Solo-only content: project-specific domain knowledge ("Users have rulesets. Rulesets have versions. Projects link to rulesets." — too specific for a team file, but invaluable for the solo developer's AI to understand the domain), personal workflow preferences ("Always create a git branch before editing. Commit frequently with small changes." — personal workflow, not a team standard), and experimental conventions ("Trying: using Zod for form validation instead of React Hook Form. May revert." — experiments belong in personal rules, not team agreements).
Team-only content: contribution guidelines ("To change a rule: PR to CLAUDE.md with rationale" — governance that solo does not need), CI enforcement notes ("Rules enforced by: ESLint, TypeScript strict, rulesync check" — tells the team what is automated), cross-repo references ("Shared types in @org/shared. API client in @org/api-client." — multi-repo context), and security compliance requirements ("All PII encrypted at rest. Audit logging on all data access." — compliance requirements that individuals may not track).
Both include: technology stack, framework patterns, testing requirements, and naming conventions. The overlap is: 60-70% of the content. The difference is: solo adds domain context and personal preferences. Team adds governance, enforcement, and cross-team references. When transitioning from solo to team: keep the overlapping 60-70%, move domain context to documentation (not the rule file), remove personal preferences, and add governance and enforcement sections.
- Solo-only: domain knowledge, personal workflow, experiments — too specific/volatile for team
- Team-only: contribution guidelines, CI enforcement, cross-repo refs, compliance — team governance
- Both: stack, patterns, testing, naming — 60-70% content overlap
- Transition: keep overlap, move domain to docs, remove personal, add governance
- Solo → team: the rule file evolves, not restarts. The core conventions carry over
When Solo Rules Become Team Rules
The transition trigger: someone else will write code in this project. A co-founder joins, a freelancer is hired, or the project is open-sourced. At this moment, the solo rules need to become team rules. The transition steps:
1. Review each rule — is it a personal preference or a team convention? Personal preferences ("I like verbose error messages") move to global personal settings. Team conventions ("Use Zustand for state") stay in the committed file.
2. Add rationale to non-obvious rules ("We chose Zustand because: simpler than Redux, sufficient for our state complexity").
3. Add exception documentation ("Use Zustand unless: the feature requires middleware, in which case evaluate Redux Toolkit").
4. Remove domain-specific knowledge (move it to the README or documentation).
5. Add contribution guidelines ("To propose a rule change: PR to CLAUDE.md").
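Applied to a single rule, those steps look roughly like this before/after sketch (the wording reuses the Zustand example from this section; the comments mark what moved where):

```markdown
<!-- Before (solo) -->
I use Zustand for state. I like verbose error messages.

<!-- After (team) -->
Use Zustand for state because: simpler than Redux, sufficient
for our state complexity. Unless: the feature requires
middleware, in which case evaluate Redux Toolkit.

<!-- Moved out: "verbose error messages" went to global
     personal settings; domain notes went to the README. -->
```

The core convention survives unchanged; only the voice, rationale, and exception are added around it.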
The transition effort: 30-60 minutes for a well-written solo rule file (the conventions are already documented and just need team-oriented language). 2-4 hours for a poorly written solo file (conventions are scattered, mixed with personal preferences, and missing rationale). The easiest transition: write solo rules as if a team will read them (even if the team is just you). Use impersonal language ("use X" not "I use X"). Include brief rationale ("because: Y"). Document exceptions ("unless: Z"). This way, the transition is a review, not a rewrite.
The advice: write solo rules with team-quality language from the start. The cost: 5 extra minutes when writing the rule. The benefit: zero transition effort when the team grows. The format is: the same (CLAUDE.md, .cursorrules). The content is: the same (conventions, patterns, security). The only addition for team: governance (contribution process, CI enforcement). Solo rules written with team-quality language: are team-ready from day one.
'Use Zustand because: simpler than Redux, sufficient for our complexity. Unless: feature requires middleware.' This solo rule is: team-ready. 'I use Zustand' requires rewriting later. The extra 5 minutes of team-quality language: saves 30-60 minutes of transition when collaborators join.
Comparison Summary
Summary of solo developer vs team AI rules.
- Solo motivation: AI memory + personal consistency. Team motivation: cross-developer consistency + onboarding
- Solo content: stack + patterns + domain context + personal preferences. Team: stack + patterns + governance + compliance
- Solo language: 'I prefer' (personal). Team language: 'Use X unless Y' (impersonal, with rationale)
- Solo management: immediate changes, zero governance. Team: PRs, reviews, CI enforcement
- Content overlap: 60-70%. Solo adds: domain + personal. Team adds: governance + enforcement
- Onboarding: solo = not applicable. Team = 5-minute CLAUDE.md read replaces weeks of osmotic learning
- Transition: 30-60 minutes if solo rules are well-written. 2-4 hours if not
- Best practice: write solo rules with team-quality language from day one. Zero transition cost later