Tutorials

How to Create an Onboarding Rule Guide

A new developer's first day: the onboarding guide that gets them from zero to AI-assisted coding in 30 minutes. Setup steps, rule verification, first prompt, and the quick-reference card they keep on their desk.

5 min read·July 5, 2025

New developer. Day 1. 30 minutes: tool installed, rules verified, first AI-generated code submitted. The onboarding guide that works.

30-minute setup path, rule verification, first prompt exercise, quick-reference card, and self-improving guide

First Day Goal: AI-Assisted Coding in 30 Minutes

A new developer's first day without an onboarding guide: they clone the repo, see CLAUDE.md, wonder what it is, ask in Slack, get a link to a wiki page written 6 months ago, spend 2 hours figuring out which AI tool to install and how to configure it, and finally generate their first AI-assisted code — with inconsistent results because the setup was incomplete. With an onboarding guide: they follow a step-by-step path that takes 30 minutes: install the AI tool, verify the rules are loaded, run their first prompt, and start coding with full AI rule compliance.

The onboarding guide: a single document (or page in the wiki/README) that a new developer follows on their first day. It contains: exactly the steps needed (no more), in the order they should be done, with verification at each step (so the developer knows if something went wrong), and a troubleshooting section for common issues. The guide: tested by every new hire. If a new hire encounters an issue: the guide is updated immediately. The guide: self-improving.

The target: a new developer submits their first AI-assisted PR within the first 48 hours. Not because they are rushed — because the setup is frictionless and the AI generates correct code from the first prompt. The onboarding guide: eliminates the 1-2 week ramp-up period where new developers learn the team's unwritten conventions. The rules: teach the conventions through AI-generated code.

Step 1: The 30-Minute Onboarding Path

Minute 0-5 — Install the AI tool: 'Install [Claude Code / Cursor / Copilot — whichever the team uses]. Follow the setup guide: [link to the tool's setup tutorial from the blog]. Verify: [tool-specific verification command].' The instruction: tool-specific and precise. Not 'install an AI coding tool' but 'run npm install -g @anthropic-ai/claude-code and verify with claude --version.' The new developer: follows the exact command and sees the expected output.

Minute 5-10 — Verify rules are loaded: 'Open the project in your AI tool. Prompt: What coding conventions does this project follow? The AI should reference: [list 3-4 key conventions from the CLAUDE.md]. If the AI does not reference these conventions: check [troubleshooting section].' The verification: confirms the rules are loaded before the developer starts coding. Without verification: they might code for hours with unloaded rules, producing generic AI output.

Minute 10-25 — First prompt exercise: 'Complete this exercise to verify everything works: Create a new function in src/utils/format-currency.ts that formats a number as USD currency (e.g., 1299 cents → $12.99). The AI should: use the project's naming convention, include TypeScript types, use the project's error handling pattern, and generate a co-located test file.' The exercise: a real, small task that exercises multiple rules. The developer: sees the AI generate convention-compliant code. The experience: confidence-building. Minute 25-30: commit and submit a PR.
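One plausible shape of the code this exercise should elicit, as a sketch: the Result type and the error message below are assumptions standing in for the project's actual error handling pattern.

```typescript
// Sketch of the exercise output (src/utils/format-currency.ts).
// The Result shape here is assumed; substitute the project's own pattern.
export type Result<T, E = Error> =
  | { ok: true; value: T }
  | { ok: false; error: E };

// Formats an integer amount of cents as USD, e.g. 1299 -> "$12.99".
export function formatCurrency(cents: number): Result<string> {
  if (!Number.isInteger(cents) || cents < 0) {
    return { ok: false, error: new Error(`Invalid cent amount: ${cents}`) };
  }
  return { ok: true, value: `$${(cents / 100).toFixed(2)}` };
}
```

A co-located test file (format-currency.test.ts, name illustrative) asserting that 1299 cents formats as $12.99 would complete the exercise.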

💡 The Verification Step Catches Setup Issues Before Coding Begins

Without verification: the new developer codes for 3 hours. The AI generates generic code. They think: 'This AI is not very good.' After 3 hours: someone notices the rules are not loaded (wrong file name, wrong directory). With verification at minute 5: 'Prompt the AI: What conventions does this project follow? If it does not mention Result pattern and Zod validation: check the troubleshooting section.' The issue: caught in 30 seconds. Fixed in 2 minutes. The developer: never experiences the frustration of un-ruled AI output.

Step 2: The Quick-Reference Card

After the onboarding path: give the new developer a quick-reference card. The card: a one-page summary they keep visible (printed or pinned in their IDE). Contents:

  • AI tool shortcut: the command to start the AI (claude, Cmd+L, etc.)
  • Verification prompt: 'What conventions does this project follow?'
  • Key rules: the 5 most important (naming, error handling, testing, security, and the primary framework pattern)
  • How to report an AI issue: 'If the AI generates wrong code: check the rules. If the rules are missing: report in #ai-standards.'
  • Useful links: rules file, changelog, FAQ, #ai-standards Slack

The card format: concise. 5 sections. No section longer than 3 lines. The card: a reference, not a guide. The developer: glances at it when they need a reminder rather than reading it end-to-end. Print-friendly: black and white, large font, no decorative elements. AI rule: 'The quick-reference card: the artifact the new developer uses daily for the first 2 weeks. After 2 weeks: the conventions are internalized and the card is no longer needed. But for those 2 weeks: the card is their lifeline.'
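A sketch of what the printed card might look like. Every entry is a placeholder drawn from the examples in this guide; replace each with the team's actual conventions.

```
AI QUICK REFERENCE: [Project]

Start the AI:    claude (terminal) / Cmd+L (IDE)
Verify rules:    prompt "What conventions does this project follow?"
Key rules:       camelCase naming · Result error pattern · Zod validation ·
                 co-located tests · [primary framework pattern]
AI issue?        Check the rules first. Rules missing? Report in #ai-standards.
Links:           rules file · changelog · FAQ · #ai-standards Slack
```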

Digital version: a pinned message in the team's Slack channel or a bookmarked wiki page. The same content as the printed card but: accessible from any device. For remote developers: the digital version is the primary reference. For in-office developers: the printed card on the desk + the digital version for reference when not at the desk. AI rule: 'Both physical and digital. The physical card: visible reminder on the desk. The digital card: accessible from anywhere. Both: the same content.'

ℹ️ The Quick-Reference Card: Used Daily for 2 Weeks, Then Internalized

Week 1: the new developer glances at the card 10 times per day. 'What is our naming convention? camelCase. What is our error handling? Result pattern. What is the Slack channel? #ai-standards.' Week 2: they glance at it 3-4 times per day. By week 3: they have internalized the conventions and the card sits unused. The card: a bridge from day 1 (knows nothing) to week 3 (knows everything). It is not permanent documentation — it is a temporary learning aid.

Step 3: Testing and Improving the Guide

Test with every new hire: the onboarding guide is tested by real users — every new developer who joins the team. After each new hire completes the guide: 5-minute feedback conversation. Questions: did you encounter any issues? How long did the setup actually take? Was anything confusing? What would you add or change? The feedback: applied immediately. The guide: improves with every new hire. After 5 new hires: the guide is polished and covers every common issue.

Common improvements from feedback:

  • 'The install command assumed I had Node.js 20 — I had Node.js 18 and it failed.' Add: 'Requires Node.js 20+. Check with node --version.'
  • 'The verification prompt returned different conventions than listed.' The rules were updated but the guide was not: sync the guide with the current rules.
  • 'I did not know where to find the #ai-standards Slack channel.' Add the direct Slack link.

Each improvement: prevents the next new hire from encountering the same issue.
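The Node.js prerequisite from that first feedback item can even become a small pre-flight check, as a sketch: the function name and the 20+ threshold are illustrative, taken from the example above.

```typescript
// Pre-flight check for the guide's prerequisite (Node.js 20+):
// fail fast with a clear message instead of a cryptic npm install error.
export function meetsNodeRequirement(
  version: string, // e.g. process.version -> "v20.11.1"
  requiredMajor = 20,
): boolean {
  const major = Number(version.replace(/^v/, "").split(".")[0]);
  return Number.isFinite(major) && major >= requiredMajor;
}
```

In the guide this stays a manual step (node --version); a team could also wire it into a setup script that calls meetsNodeRequirement(process.version) before running the install command.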

Freshness: update the guide when: the AI tool changes (new version, new setup process), the rules change significantly (the verification section references specific rules), the team's workflow changes (new Slack channel, new tool, new conventions), or a new hire reports a stale section. The guide: a living document. If the last update was more than 3 months ago: review it proactively. AI rule: 'The onboarding guide: updated after every new hire's feedback and reviewed every 3 months even without feedback. Stale guides: the #1 cause of frustrating onboarding experiences.'
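The 3-month review rule reduces to a one-line check, sketched here with '3 months' approximated as 90 days (an assumption; the function name is illustrative).

```typescript
// Flags the onboarding guide for proactive review when its "last updated"
// date is older than roughly 3 months (approximated as 90 days).
export function guideNeedsReview(lastUpdated: Date, now = new Date()): boolean {
  const ninetyDaysMs = 90 * 24 * 60 * 60 * 1000;
  return now.getTime() - lastUpdated.getTime() > ninetyDaysMs;
}
```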

⚠️ A Stale Guide Is Worse Than No Guide

The guide says: 'Install: npm install -g claude-code.' The package was renamed to @anthropic-ai/claude-code 3 months ago. The new developer: runs the old command. Fails. Googles. Finds the new name. Installs. But: trust in the guide is damaged. They question every subsequent step. A stale guide: actively harmful. Update after every tool change, every rule update, and every new hire's feedback. If the guide was last updated 3+ months ago: it is likely stale.

Onboarding Guide Summary

Summary of creating an onboarding guide for the team's AI rules.

  • Goal: new developer AI-assisted coding in 30 minutes. First PR within 48 hours
  • Path: install tool (5 min) → verify rules (5 min) → first prompt exercise (15 min) → commit (5 min)
  • Verification: prompt the AI about conventions. If it does not reference them: troubleshoot before coding
  • Exercise: a real small task that exercises multiple rules. Confidence-building first experience
  • Quick-reference card: 5 sections, one page. AI shortcut, verification prompt, key rules, issue reporting, links
  • Physical + digital: printed card on desk + pinned Slack message. Same content both formats
  • Testing: 5-minute feedback from every new hire. Apply improvements immediately
  • Freshness: update after feedback, after rule changes, and proactively every 3 months