Part 2: Code Review Process and Best Practices
Series Navigation: ← Part 1: Foundations and Principles | Part 3: Refactoring Techniques and Patterns →
Introduction
After establishing the foundations in Part 1, let's dive into the practical aspects of code review. Over years of maintaining TypeScript microservices, I've participated in thousands of code reviews, both as a reviewer and as an author. I've seen reviews that caught critical bugs before production and reviews that turned into unproductive debates about tabs vs. spaces.
In my current team, we process 50-100 pull requests weekly across our microservices. Each service has different owners, but we all review each other's code. This cross-pollination has been invaluable for maintaining consistency and sharing knowledge across the organization.
The Code Review Workflow
Our Pull Request Process
Here's the workflow we've refined over two years:
Developer creates feature branch: Branching off `main` or `develop`
Implementation with self-review: Review your own diff before creating the PR
Create pull request: With clear description and context
Automated checks: CI/CD runs tests, linting, type checking
Peer review: Minimum one approval, typically 1-2 reviewers
Address feedback: Discussion and code updates
Final approval: Reviewer approves changes
Merge: Squash and merge into main branch
Deploy: Automatic deployment to staging/production
PR Size Guidelines
From experience, I've learned that PR size dramatically affects review quality:
Ideal: 200-400 lines changed
Acceptable: Up to 600 lines
Large: 600-1000 lines (needs extra care)
Too large: 1000+ lines (should be split)
When I was implementing a new payment provider, my initial PR was 2,500 lines. The review took three days and missed several issues. I split it into five focused PRs:
Add payment provider interface and types (150 lines)
Implement core payment logic (300 lines)
Add error handling and retries (250 lines)
Integration tests (400 lines)
Update documentation and examples (200 lines)
Each PR was reviewed within hours, and the quality improved significantly.
What to Look For During Review
1. Correctness and Logic
Does the code do what it's supposed to do?
Review Comment I Would Leave:
Should the second condition be `>=` instead of `>`? Currently, an order of exactly $50 gets no discount. Is this intentional?
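The snippet this comment refers to was lost in publishing; here is a minimal reconstruction of the kind of code it targets (the thresholds, rates, and function name are assumptions, not the original):

```typescript
// Hypothetical discount rule: 10% off over $100, 5% off from $50 up.
// The review comment flags the second condition: with ">", an order
// of exactly $50 falls through and gets no discount.
function discountRate(orderTotal: number): number {
  if (orderTotal > 100) return 0.1;
  if (orderTotal > 50) return 0.05; // flagged in review: should this be >= ?
  return 0;
}

discountRate(50); // returns 0 — the boundary case the comment asks about
```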
2. Error Handling
Are errors properly handled?
Review Comment:
We should handle the case where the API returns a non-404 error. What happens if we get a 500 or 503? Also, we should validate the response structure before assuming it matches UserProfile.
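To make that comment concrete, here's a sketch of the handling it asks for; the endpoint URL, the `UserProfile` shape, and the helper names are assumptions for illustration, not the actual service code:

```typescript
interface UserProfile { id: string; name: string; }

// Validate the response structure instead of assuming it matches UserProfile.
function isUserProfile(body: unknown): body is UserProfile {
  if (typeof body !== 'object' || body === null) return false;
  const rec = body as Record<string, unknown>;
  return typeof rec.id === 'string' && typeof rec.name === 'string';
}

async function getUserProfile(userId: string): Promise<UserProfile | null> {
  const res = await fetch(`https://api.example.com/users/${userId}`); // URL assumed
  if (res.status === 404) return null;           // the one "expected" failure
  if (!res.ok) {                                 // 500, 503, etc. are surfaced
    throw new Error(`User API responded with ${res.status}`);
  }
  const body: unknown = await res.json();
  if (!isUserProfile(body)) {
    throw new Error('User API returned an unexpected response shape');
  }
  return body;
}
```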
3. Security Vulnerabilities
In my order service, I once caught a serious vulnerability during review.
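The actual diff isn't reproduced in this post, so here is a hypothetical illustration of the class of bug worth hunting for in review: user input interpolated into SQL, with `Db` standing in for whatever driver the service uses:

```typescript
// Hypothetical driver interface standing in for the real one.
interface Db { query(text: string, values?: unknown[]): Promise<unknown>; }

// ❌ The kind of code to flag: user input interpolated into the SQL string.
function findOrderUnsafe(db: Db, orderId: string) {
  return db.query(`SELECT * FROM orders WHERE id = '${orderId}'`);
}

// ✅ Parameterized version: the driver escapes the value for us.
function buildOrderQuery(orderId: string): { text: string; values: string[] } {
  return { text: 'SELECT * FROM orders WHERE id = $1', values: [orderId] };
}

function findOrder(db: Db, orderId: string) {
  const { text, values } = buildOrderQuery(orderId);
  return db.query(text, values);
}
```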
Other Security Checks:
Sensitive data in logs
Authentication/authorization checks
Input validation and sanitization
Secrets in code (should use environment variables)
Rate limiting for public endpoints
4. Performance Issues
Review Comment:
This will cause N+1 queries. For 100 orders, we'd make 101 database calls. We should batch load the customers.
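A sketch of the batched alternative that comment suggests; `Order`, `Customer`, and the repository interface are illustrative stand-ins, not the service's real types:

```typescript
interface Order { id: string; customerId: string; }
interface Customer { id: string; name: string; }
interface CustomerRepo { findByIds(ids: string[]): Promise<Customer[]>; }

// One batched query for all customers instead of one query per order.
async function attachCustomers(orders: Order[], repo: CustomerRepo) {
  const ids = [...new Set(orders.map(o => o.customerId))]; // distinct IDs
  const customers = await repo.findByIds(ids);             // 1 call, not N
  const byId = new Map(customers.map(c => [c.id, c] as const));
  return orders.map(o => ({ ...o, customer: byId.get(o.customerId) }));
}
```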
5. Testing Coverage
During review, I check:
Are edge cases tested?
Are error paths tested?
Are mocks used appropriately?
Is there integration test coverage for critical paths?
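As a small illustration of the first two points, here's an invented function with assertions covering the boundary and the error path rather than only the happy case:

```typescript
// Splits an amount in cents across n payers, spreading the remainder
// one cent at a time so the total is preserved.
function splitTotal(totalCents: number, ways: number): number[] {
  if (!Number.isInteger(ways) || ways <= 0) {
    throw new Error('ways must be a positive integer');
  }
  const base = Math.floor(totalCents / ways);
  const remainder = totalCents % ways;
  return Array.from({ length: ways }, (_, i) => base + (i < remainder ? 1 : 0));
}

// Happy path, edge case (remainder), and error path all get coverage:
// splitTotal(100, 4) -> [25, 25, 25, 25]
// splitTotal(100, 3) -> [34, 33, 33]
// splitTotal(100, 0) -> throws
```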
6. Code Organization and Architecture
Giving Effective Feedback
The Feedback Framework I Use
1. Be Specific and Actionable
❌ Poor feedback:
"This code is messy"
✅ Good feedback:
"This function has three different responsibilities: validation, business logic, and persistence. Consider extracting the validation into a separate method."
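The example that originally accompanied this feedback isn't shown; a hypothetical sketch of the suggested split (input type and names assumed):

```typescript
interface CreateUserInput { email: string; name: string; }

// Validation pulled out of the original do-everything function,
// so it can be tested and reused on its own.
function validateCreateUser(input: CreateUserInput): void {
  if (!input.email.includes('@')) throw new Error('invalid email');
  if (input.name.trim().length === 0) throw new Error('name is required');
}

// The remaining function keeps business logic and persistence.
async function createUser(
  input: CreateUserInput,
  save: (u: CreateUserInput) => Promise<void>,
): Promise<void> {
  validateCreateUser(input);                          // 1. validation
  const user = { ...input, name: input.name.trim() }; // 2. business rule
  await save(user);                                   // 3. persistence
}
```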
2. Explain the "Why"
❌ Poor feedback:
"Use async/await instead of promises"
✅ Good feedback:
"Consider using async/await instead of promise chains here. It's easier to read, and errors can be handled consistently with try/catch."
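The example that went with this comment isn't shown; this sketch contrasts the two styles, with `fetchOrder`/`fetchCustomer` as invented stub lookups:

```typescript
// Stub lookups standing in for real API calls.
async function fetchOrder(id: string): Promise<{ customerId: string }> {
  return { customerId: 'c-' + id };
}
async function fetchCustomer(id: string): Promise<{ name: string }> {
  return { name: 'Customer ' + id };
}

// Before: promise chain, error handling hangs off the chain.
function customerNameChained(orderId: string): Promise<string> {
  return fetchOrder(orderId)
    .then(order => fetchCustomer(order.customerId))
    .then(customer => customer.name)
    .catch(err => { throw new Error(`lookup failed: ${err}`); });
}

// After: linear flow, one try/catch for everything.
async function customerName(orderId: string): Promise<string> {
  try {
    const order = await fetchOrder(orderId);
    const customer = await fetchCustomer(order.customerId);
    return customer.name;
  } catch (err) {
    throw new Error(`lookup failed: ${err}`);
  }
}
```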
3. Distinguish Must-Fix from Suggestions
I use these prefixes in my reviews:
🚨 Critical: Must be fixed (security, bugs, breaking changes)
⚠️ Important: Should be fixed (performance, error handling)
💡 Suggestion: Nice to have (naming, minor refactoring)
❓ Question: Seeking clarification
Example review:
🚨 Critical: This endpoint doesn't validate that the user has permission to access the order. We need to add an authorization check.
⚠️ Important: This will cause N+1 queries under load. Consider using a DataLoader or batch fetching.
💡 Suggestion: The variable name `data` is generic; `orderDetails` would be more descriptive.
❓ Question: Why are we using a 30-second timeout here? Our normal timeout is 5 seconds.
4. Praise Good Work
Code review isn't just about finding problems:
✅ Nice use of the builder pattern here! This makes the test data creation much more readable.
✅ Good catch adding the null check. That would have caused issues in production.
✅ Excellent error messages. These will make debugging much easier.
Receiving Feedback Gracefully
As a PR author, I've learned to:
1. Don't Take It Personally
Feedback is about the code, not about you as a developer. I used to get defensive when reviewers found issues. Now I view every piece of feedback as a learning opportunity.
2. Ask for Clarification
If feedback isn't clear:
"Thanks for the suggestion! Can you clarify what you mean by 'this could cause race conditions'? I want to make sure I understand the scenario you're concerned about."
3. Explain Your Reasoning
If you disagree:
"I considered using a Set here, but chose an Array because we need to maintain insertion order for the UI display. The collection size is capped at 10 items, so the O(n) lookup isn't a concern. What do you think?"
4. Accept You'll Make Mistakes
I recently pushed code that had an obvious bug. The reviewer caught it:
"Line 45: You're comparing the string 'true' instead of the boolean true"
My response:
"Oof, good catch! Fixed in the latest commit. Added a test case for this too."
No excuses, just acknowledgment and fix.
Common Code Review Patterns
The Checklist Approach
Before marking a PR as ready for review, I run through a self-review checklist; as a reviewer, I work from a separate reviewer checklist covering the areas in "What to Look For" above.
Time-Boxing Reviews
I limit reviews to 60 minutes at a time. After that, my effectiveness drops. For large PRs, I do multiple review sessions.
The Two-Pass Review
First Pass (10 minutes): High-level
Does the overall approach make sense?
Is the PR size reasonable?
Are tests present?
Second Pass (30-45 minutes): Detailed
Line-by-line code review
Logic verification
Edge case analysis
Real-World Review Example
Here's an actual review I did on a colleague's PR for our notification service:
Original Code:
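The PR's code isn't reproduced in this post; the sketch below (URLs and payload shapes invented) recreates its shape so the review comments have something concrete to point at:

```typescript
// No auth, no error handling, sequential calls, hardcoded URLs —
// each of these shows up in the review comments.
async function notifyUser(user: { email: string; phone: string }, message: string) {
  await fetch('https://email-api.example.com/send', {
    method: 'POST',
    body: JSON.stringify({ to: user.email, body: message }),
  });
  await fetch('https://sms-api.example.com/send', { // runs only after email completes
    method: 'POST',
    body: JSON.stringify({ to: user.phone, body: message }),
  });
}
```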
My Review Comments:
🚨 Critical: No error handling. If the email API is down, the SMS won't be sent. These calls should be wrapped in try/catch.
🚨 Critical: API keys are missing. How are we authenticating with these services?
⚠️ Important: These API calls are sequential. They could run in parallel with Promise.all() to improve performance.
⚠️ Important: No retry logic. Transient network failures will cause notifications to be lost.
💡 Suggestion: The API URLs should come from configuration, not be hardcoded.
💡 Suggestion: Consider extracting email and SMS sending into separate services for better testability.
❓ Question: What happens if the user doesn't exist? Should we throw an error or fail silently?
Revised Code After Discussion:
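The actual revision isn't shown either; here is a hypothetical version addressing each comment, with config injected, channels run in parallel via Promise.allSettled, a simple retry wrapper, and the transport injected so it can be stubbed in tests:

```typescript
interface NotifyConfig { emailUrl: string; smsUrl: string; apiKey: string; }

// Retry transient failures a few times before giving up.
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try { return await fn(); } catch (err) { lastError = err; }
  }
  throw lastError;
}

async function notifyUser(
  user: { email: string; phone: string },
  message: string,
  config: NotifyConfig,                                  // URLs and key from config
  send: (url: string, payload: object) => Promise<void>, // injected transport
): Promise<{ email: boolean; sms: boolean }> {
  const payload = (to: string) => ({ to, body: message, apiKey: config.apiKey });
  // Channels run in parallel, and one failing no longer blocks the other.
  const [email, sms] = await Promise.allSettled([
    withRetry(() => send(config.emailUrl, payload(user.email))),
    withRetry(() => send(config.smsUrl, payload(user.phone))),
  ]);
  return { email: email.status === 'fulfilled', sms: sms.status === 'fulfilled' };
}
```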
Building a Review Culture
What's Worked in My Team
No code in production without review: Hard rule, no exceptions
Review within 24 hours: We prioritize reviews to avoid blocking teammates
Limit reviewers to 2-3: More reviewers = diffusion of responsibility
Author can't approve their own PR: Obvious, but needs to be enforced
Celebrate good reviews: Recognize reviewers who catch important issues
Handling Disagreements
Sometimes reviewers and authors disagree. Our process:
Discussion in PR comments: Most disagreements resolve here
Video call if needed: Complex issues benefit from real-time discussion
Team decision: If still unresolved, bring to team meeting
Document the decision: Update architectural decision records (ADRs)
Review Metrics We Track
Average PR size (lines changed)
Time to first review
Number of review cycles per PR
Review approval rate
Issues found in production vs. in review
These metrics help us improve our process over time.
Conclusion
Effective code review is a skill that requires practice and empathy. Key takeaways:
Size matters: Keep PRs small and focused
Be thorough: Check correctness, security, performance, and tests
Be constructive: Explain the "why" and suggest solutions
Be timely: Don't block teammates waiting for reviews
Learn from reviews: Both as author and reviewer
In Part 3, we'll dive into specific refactoring techniques and patterns you can apply during code review or as follow-up improvements.
Series Navigation: ← Part 1: Foundations and Principles | Part 3: Refactoring Techniques and Patterns →