AI Rules for Distributed Teams

Distributed teams have developers in multiple offices, countries, and time zones working on the same codebase. AI rules must encode cross-team conventions, handoff patterns, and the shared standards that prevent codebase fragmentation.

5 min read·July 5, 2025

Berlin, San Francisco, Singapore — same codebase, same quality. AI rules make location invisible in the code.

Cross-team standards, interface contracts, follow-the-sun handoffs, consistency metrics, and architecture alignment

Distributed = Multiple Teams, One Codebase

Distributed teams differ from remote teams: remote teams are individuals working from different locations; distributed teams are entire teams in different locations — the payments team in Berlin, the platform team in San Francisco, the mobile team in Singapore. The challenge: each team develops local conventions that diverge from the others'. Over time, Berlin's code looks different from San Francisco's, developers cannot move between teams' areas of the codebase without relearning conventions, and AI-generated code varies by team.

AI rules for distributed teams enforce: universal standards (apply to all teams regardless of location), team-specific rules (each team adds domain-specific conventions without violating universal standards), and cross-team interfaces (APIs between teams follow shared contracts). The AI rules platform must support: global rules (maintained by the architecture team), team rules (maintained by each local team), and interface rules (maintained jointly by the teams that interact).

The goal: a developer in Berlin and a developer in Singapore should produce code that is indistinguishable in style, patterns, and quality — because the AI generates code from the same rules regardless of location.

Cross-Team Standards and Ownership

Code ownership model: AI rule: 'Every file and directory has an owner (defined in CODEOWNERS). PRs that modify owned code: require approval from the owning team. Cross-team changes: require approval from both teams. The AI generates code in the correct team's directory and follows that team's conventions. When the AI generates code that crosses team boundaries: flag for multi-team review.'
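The ownership model above maps directly to a CODEOWNERS file. A minimal sketch, assuming GitHub's syntax and hypothetical team handles and paths:

```
# Each directory is owned by one team; GitHub requires a review
# from the owning team for any PR that touches these paths.
/services/payments/   @org/payments-team
/services/platform/   @org/platform-team
/apps/mobile/         @org/mobile-team

# Shared schemas are jointly owned: a change needs approval
# from every listed team, which implements multi-team review.
/schemas/             @org/payments-team @org/platform-team
```

Listing multiple teams on one path is how the "cross-team changes require approval from both teams" rule becomes mechanically enforced rather than a convention.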

Interface contracts: when Team A's service calls Team B's service, the API contract must be explicit. AI rule: 'Cross-team APIs: defined in a shared schema repository (OpenAPI, Protobuf, or GraphQL schema). Changes to shared schemas: require review from all consuming teams. The AI generates API calls using the shared client (generated from the schema), not ad-hoc HTTP calls. This ensures: type safety, version compatibility, and change notification.'
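The contrast between a generated client and an ad-hoc HTTP call can be sketched in TypeScript. `PaymentsClient`, `Charge`, and the injected transport below are illustrative stand-ins for what a generator like openapi-typescript or protoc would emit, not a real org API:

```typescript
// Types the generator emits from the shared schema: when the schema
// changes, these change, and consumers break at compile time.
interface Charge {
  id: string;
  amountCents: number;
  currency: string;
}

type CreateChargeRequest = { amountCents: number; currency: string };

// Transport is injected so the client is testable without a network.
type Transport = (path: string, body: unknown) => Promise<unknown>;

class PaymentsClient {
  constructor(private transport: Transport) {}

  async createCharge(req: CreateChargeRequest): Promise<Charge> {
    // An ad-hoc caller would hand-write this request and its types,
    // which silently drift from the schema; the generated client
    // keeps both in sync.
    return (await this.transport("/v1/charges", req)) as Charge;
  }
}

// Example with a stubbed transport standing in for HTTP:
const stub: Transport = async (_path, body) => ({
  id: "ch_123",
  ...(body as CreateChargeRequest),
});

const client = new PaymentsClient(stub);
const charge = await client.createCharge({ amountCents: 500, currency: "EUR" });
console.log(charge.currency); // "EUR"
```

The design point is that the typed method signature, not discipline, carries the contract: a consumer cannot compile against a stale shape of `Charge`.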

Shared libraries: distributed teams often duplicate utility functions across codebases. AI rule: 'Before creating a utility function: check the shared packages (@org/ scope in npm, internal Go modules). If a similar function exists: use it. If not: consider whether the function is generally useful enough for the shared package. The AI should prefer shared packages over local implementations for cross-cutting concerns.'

💡 Generated API Clients Prevent Integration Bugs

Team A changes their API. Team B's hand-written HTTP client does not know about the change. Production breaks. With generated clients (from OpenAPI or Protobuf): Team A updates the schema. The client generator runs. Team B's build fails with a type error at compile time — before deployment. The schema repo is the contract. Generated clients enforce it. The AI should always use generated clients for cross-team APIs, never hand-written fetch calls.

Time Zone Handoff and Follow-the-Sun

Follow-the-sun development: work progresses as the earth rotates. Berlin works during its day, pushes its progress, and Singapore picks up where Berlin left off. AI rule: 'End-of-day: commit all work-in-progress to a branch with a descriptive PR. Include: what was done, what remains, any blockers, and how to run/test the current state. The AI generates PR descriptions that enable another team to continue the work without a synchronous meeting.'
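An end-of-day PR description that enables a cold pickup might follow a template like this sketch (the feature, flag, and command names are hypothetical, and the section headings are suggestions, not a mandated format):

```markdown
## Handoff: add retry logic to payment webhooks

**Done today (Berlin):** webhook handler extracted; retry queue wired
up behind a feature flag (`webhook_retries`, off in prod).

**Remaining:** backoff policy not implemented; integration test for
the dead-letter path still failing (see TODO in `webhook_retry_test`).

**Blockers:** waiting on platform team to confirm queue quota.

**How to run/test:** `make test-webhooks`; enable the flag via local
config.

**Decisions made:** retries capped at 5; after that, dead-letter queue.
```

Each section answers a question the next team would otherwise have to ask synchronously, which is exactly what the handoff is meant to avoid.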

Handoff documentation: AI rule: 'For multi-day features that span time zones: maintain a running document (in the PR description or linked doc) with: current state, next steps, decisions made, and open questions. When a team picks up work from another team: they read the handoff document instead of waiting for a meeting. The AI generates handoff summaries at natural breakpoints in the development process.'

On-call rotation: distributed teams can provide 24/7 on-call coverage without overnight shifts. AI rule: 'On-call follows the sun: Berlin covers European hours, SF covers Americas, Singapore covers Asia-Pacific. Runbooks must be written for any on-call engineer to follow — not just the team that wrote the code. The AI generates runbooks that assume the reader is not the author: explicit steps, no assumptions about local knowledge.'
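A runbook written for any on-call engineer, not just the authoring team, might open like this sketch; the service name, dashboards, and commands are hypothetical placeholders:

```markdown
# Runbook: payments-api elevated 5xx rate

## Preconditions
- You have VPN access and `kubectl` configured for the prod cluster.
- No local knowledge assumed: every command below is copy-pasteable.

## Steps
1. Check the dashboard: Grafana > payments-api > "Error rate" panel.
2. Identify failing pods:
   `kubectl -n payments get pods | grep -v Running`
3. If a recent deploy correlates with the spike, roll back:
   `kubectl -n payments rollout undo deployment/payments-api`
4. If errors persist, escalate (below) — do not debug application
   logic alone at 3am.

## Escalation
- Owning team's business hours: #payments-oncall channel
- Otherwise: follow-the-sun rotation in the paging tool
```

The tell of a good cross-location runbook is step 2: it names the exact command rather than saying "check the pods", because the reader may have never deployed this service.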

⚠️ End-of-Day PRs Enable Follow-the-Sun

Without end-of-day PRs: Berlin stops work at 6pm with the code on a local branch. Singapore starts at 9am (the small hours of the morning in Berlin) and cannot access Berlin's progress — a full workday of potential development is lost. With end-of-day PRs: Berlin pushes to a branch with a descriptive PR at 6pm, and Singapore picks up at 9am with full context. Follow-the-sun only works when handoffs are explicit and well-documented.

Measuring Cross-Location Consistency

Code consistency metrics: measure how similar code is across teams. Metrics: lint rule violation rate per team (should be similar), test coverage per team (should be similar), dependency version consistency (all teams on the same major versions), and architecture pattern adherence (all teams using the same patterns for the same problems). AI rule: 'Generate consistency reports per team. Flag divergence: if one team's lint violation rate is 3x higher than another's, investigate (are the rules different? Is enforcement weaker?).'
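The divergence flag described above is a small computation. A minimal sketch in TypeScript, where the team names, the per-KLOC metric, and the 3x threshold are illustrative assumptions:

```typescript
// Flag any team whose lint violation rate exceeds `factor` times
// the best (lowest) team's rate — the trigger for investigating
// whether rules or enforcement differ between locations.

interface TeamMetrics {
  team: string;
  lintViolationsPerKloc: number; // violations per 1,000 lines of code
}

function flagDivergentTeams(metrics: TeamMetrics[], factor = 3): string[] {
  const baseline = Math.min(...metrics.map((m) => m.lintViolationsPerKloc));
  return metrics
    .filter((m) => m.lintViolationsPerKloc > baseline * factor)
    .map((m) => m.team);
}

const report: TeamMetrics[] = [
  { team: "berlin", lintViolationsPerKloc: 2.1 },
  { team: "sf", lintViolationsPerKloc: 1.8 },
  { team: "singapore", lintViolationsPerKloc: 7.5 },
];

console.log(flagDivergentTeams(report)); // ["singapore"]: 7.5 > 1.8 * 3
```

The same shape works for coverage or dependency-version skew: pick a baseline, define a tolerated ratio, and report only the outliers so the monthly review starts from data rather than impressions.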

Shared code review standards: AI rule: 'PR review checklist: consistent across all teams. Every PR: reviewed by at least one team member. Cross-team PRs: reviewed by both teams. The checklist includes: code follows conventions (AI rules), tests included, documentation updated, security considerations addressed. Same checklist regardless of which team authored the PR.'
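One way to make the checklist identical everywhere is to put it in a shared PR template so every team sees the same items; a sketch, with wording as a suggestion only:

```markdown
<!-- .github/pull_request_template.md — same file for all teams -->
## Review checklist
- [ ] Code follows org conventions (AI rules pass)
- [ ] Tests included and passing
- [ ] Documentation updated
- [ ] Security considerations addressed
- [ ] Cross-team change? Approval obtained from every owning team
```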

Regular cross-team sync: AI rule: 'Monthly architecture sync across locations: review shared conventions, discuss proposed changes, align on new patterns. AI rules changes proposed at sync: reviewed by all teams before adoption. This is the one meeting where synchronous communication is worth the timezone pain — it prevents convention drift.'

ℹ️ Consistency Metrics Reveal Hidden Divergence

Without measurement: you assume all teams follow the same standards. With measurement: you discover Team A has 95% test coverage, Team B has 40%. Team C uses async/await everywhere, Team D still uses callbacks. Team E pins dependency versions, Team F uses ranges. These divergences accumulate silently. Monthly consistency reports surface them. The conversation shifts from 'everyone should follow standards' to 'here is where we differ and why.'

Distributed Team AI Rules Summary

Summary of AI rules for distributed engineering teams across multiple locations.

  • Universal standards: same rules for all teams. Location should not affect code style or quality
  • Code ownership: CODEOWNERS file. Cross-team changes require multi-team review
  • Interface contracts: shared schema repo (OpenAPI/Protobuf). Generated clients for type safety
  • Shared libraries: @org/ packages. Check shared before creating local utilities
  • Follow-the-sun: descriptive PRs at end-of-day. Handoff docs for multi-day features
  • Runbooks: written for any on-call engineer. No assumptions about author's local knowledge
  • Consistency metrics: lint rates, coverage, dependency versions compared across teams
  • Monthly sync: architecture alignment across locations. Prevent convention drift