Thank you for contributing! This project follows Kanban + XP + Lean methodologies with BDD/TDD practices and is optimized for GitHub Copilot assistance.
- Check the board: Review GitHub Projects for available work
- Pick an issue: Pull from "Ready" column (respect WIP limits)
- Create branch: `git checkout -b feature/issue-number-description`
- Write tests first: Follow the TDD/BDD approach
- Implement: Make tests pass
- Open PR: Use the PR template
- Review: Address feedback promptly
Create issues with clear acceptance criteria to enable Copilot assistance:
## User Story
As a [role]
I want [capability]
So that [benefit]
## Acceptance Criteria (BDD)
Scenario: [description]
Given [context]
When [action]
Then [expected outcome]
## Technical Implementation Hints
- Files to modify: src/components/MyComponent.tsx
- Related functions: handleSubmit(), validateForm()
- Test framework: Jest
- API endpoints: POST /api/users

Best Practices:
- ✅ Include specific file paths
- ✅ Mention related functions/classes
- ✅ Specify test framework
- ✅ Define expected behavior clearly
- ✅ Keep issues small (< 1 day of work)
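Gherkin acceptance criteria map directly onto test cases. A minimal sketch of that mapping, using a hypothetical `validateForm` helper and plain assertions rather than Jest matchers (the real project code will differ):

```typescript
// Hypothetical form validator, used only to illustrate how a
// Given/When/Then scenario becomes an executable test.
interface FormData {
  email: string;
  password: string;
}

function validateForm(form: FormData): { valid: boolean; errors: string[] } {
  const errors: string[] = [];
  if (!form.email.includes("@")) errors.push("email is invalid");
  if (form.password.length < 8) errors.push("password is too short");
  return { valid: errors.length === 0, errors };
}

// Scenario: rejecting a signup form with a bad email
// Given a form with an invalid email address
const form: FormData = { email: "not-an-email", password: "hunter2pass" };
// When the form is validated
const result = validateForm(form);
// Then validation fails with a helpful message
if (result.valid) throw new Error("expected validation to fail");
if (!result.errors.includes("email is invalid")) throw new Error("missing error message");
```

In a Jest suite, each Given/When/Then line typically becomes a comment or a `describe`/`it` label, so the test reads like the acceptance criterion it verifies.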
Red → Green → Refactor

- Red: Write a failing test

```shell
# Write a test that describes the expected behavior
npm test -- MyComponent.test.ts  # Test should fail (Red)
```

- Green: Make the test pass with minimum code

```shell
# Implement just enough to pass
npm test -- MyComponent.test.ts  # Test should pass (Green)
```

- Refactor: Improve while keeping tests green

```shell
# Clean up the code
npm test  # All tests still passing
```
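As a concrete sketch of one trip around the cycle, consider a hypothetical `cartTotal()` function developed test-first (the function and its test are illustrative, not project code):

```typescript
// Red: the assertion at the bottom is written first, before cartTotal()
// exists, so the initial run fails.
// Green: a minimal implementation makes it pass.
// Refactor: reduce() replaces an initial hand-rolled loop; behavior is
// unchanged, so the test stays green.

interface Item {
  price: number;
  quantity: number;
}

function cartTotal(items: Item[]): number {
  return items.reduce((sum, item) => sum + item.price * item.quantity, 0);
}

// The failing test from the Red step, now passing:
const total = cartTotal([
  { price: 2.5, quantity: 2 },
  { price: 1.0, quantity: 3 },
]);
if (total !== 8.0) throw new Error(`expected 8, got ${total}`);
```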
Assigning Issues to Copilot:
- Ensure issue has clear acceptance criteria (Gherkin format)
- Add technical implementation hints
- Label issue as `copilot-ready`
- Assign to @github-copilot (if available)
Requesting Copilot PR Review:
- Open your PR
- Click "Reviewers" → Add "github/copilot"
- Address Copilot's feedback
- Then request human review
When Copilot Works Best:
- ✅ Well-defined features with clear criteria
- ✅ Bug fixes with reproduction steps
- ✅ Refactoring with specific goals
- ✅ Test coverage improvements
- ❌ Vague requirements
- ❌ Complex architectural decisions
- ❌ Issues requiring business context
Before Opening PR:
- All tests passing locally
- Code follows project standards
- Self-review completed
- Commit messages are clear
PR Size Guidelines:
- 🟢 XS/S (< 250 lines): Ideal
- 🟡 M (250-500 lines): Good
- 🟠 L (500-1000 lines): Consider splitting
- 🔴 XL (> 1000 lines): Must split
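The size bands above can be encoded in a small helper, for example in a CI labeling script. A sketch whose thresholds simply mirror the table (lower bounds treated as inclusive):

```typescript
// Classifies a PR by total changed lines, matching the bands above.
function classifyPrSize(changedLines: number): string {
  if (changedLines < 250) return "XS/S: ideal";
  if (changedLines < 500) return "M: good";
  if (changedLines < 1000) return "L: consider splitting";
  return "XL: must split";
}
```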
PR Review Process:
- Automated checks run (CI)
- Copilot review (optional but recommended)
- Human review (required)
- Address feedback
- Merge after approval
- YAGNI: You Aren't Gonna Need It - Don't add unused features
- DRY: Don't Repeat Yourself - Extract common logic
- KISS: Keep It Simple, Stupid - Avoid unnecessary complexity
- SRP: Single Responsibility Principle
- Unit Tests: Test individual functions/classes
- BDD Tests: Test user scenarios (Given/When/Then)
- Coverage: Maintain or improve test coverage
- Edge Cases: Test boundary conditions
```
type(scope): brief description

Longer description if needed

Closes #123
```
Types: feat, fix, refactor, test, docs, chore
Examples:
```
feat(auth): add login functionality
fix(api): handle null response from user service
test(cart): add BDD scenarios for checkout flow
```
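The `type(scope): description` format can also be checked mechanically, for example in a commit-msg hook. A sketch in which the regex and type list mirror the rules above (hook wiring not shown):

```typescript
const COMMIT_TYPES = ["feat", "fix", "refactor", "test", "docs", "chore"];

// Validates the first line of a commit message against
// "type(scope): brief description"; the scope is optional.
function isValidCommitSubject(subject: string): boolean {
  const pattern = new RegExp(
    `^(${COMMIT_TYPES.join("|")})(\\([a-z0-9-]+\\))?: .+`
  );
  return pattern.test(subject);
}
```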
- Backlog: Prioritized work awaiting capacity
- Ready: Issues with clear acceptance criteria
- In Progress: Active development (WIP limit: 1-2 per person)
- Review: PR open, awaiting review
- Done: Merged and verified
- Individual: 1-2 issues maximum
- Team: 3-5 total in "In Progress"
- Why: Maintain flow, reduce context switching
- Link blocking issues: "Blocked by #123"
- Use task lists for sub-tasks
- Resolve dependencies before starting work
Morning:
- Check Projects board
- Review PR notifications
- Pull new work if under WIP limit
Throughout Day:
- Commit frequently
- Run tests before push
- Update issue status
- Review others' PRs promptly
Before End of Day:
- Push work in progress
- Note any blockers on issues
- Update PR if feedback received
- Focus on: Tests, simplicity, maintainability
- Avoid: Nitpicking, premature optimization
- Response time: Within hours, not days
- Be kind: Constructive feedback only
- Respond quickly: Address feedback promptly
- Explain decisions: Add context when needed
- Don't take personally: Feedback improves code
- Thank reviewers: Appreciate their time
We track:
- ✅ Cycle time (Ready → Done)
- ✅ Test coverage percentage
- ✅ Build success rate
- ❌ Individual productivity (we don't measure this)
- ❌ Velocity/story points (Scrum metrics)
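Cycle time is simply the elapsed time between an issue entering "Ready" and reaching "Done". A sketch of the calculation, assuming ISO-8601 timestamps such as those in Projects board events:

```typescript
// Cycle time in whole hours between two ISO-8601 timestamps,
// e.g. the "moved to Ready" and "moved to Done" events for an issue.
function cycleTimeHours(readyAt: string, doneAt: string): number {
  const ms = new Date(doneAt).getTime() - new Date(readyAt).getTime();
  return Math.round(ms / (1000 * 60 * 60));
}
```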
```shell
# Install dependencies (example for Node.js)
npm install

# Run tests
npm test

# Run linter
npm run lint

# Start development server
npm run dev
```

Copy .env.example to .env and configure:

```shell
cp .env.example .env
```

- Process questions: Check DEVELOPMENT_PROCESS.md
- Technical questions: Ask in issue comments
- Blockers: Update issue and notify team
Remember: Small increments, frequent delivery, continuous improvement! 🚀