Concurrency - Goroutines and Channels
The Microservice That Processed 10,000 Records in 12 Minutes
The original Python version processed users one at a time:

    def sync_users():
        user_ids = get_all_user_ids()  # 10,000 IDs
        for user_id in user_ids:
            user_data = fetch_user_from_api(user_id)  # ~70ms per call
            update_database(user_data)  # ~5ms per call
        print(f"Synced {len(user_ids)} users")

At roughly 75ms per user, 10,000 users take about 12 minutes. The Go rewrite fans the work out across goroutines, capped at 50 in flight:

    func syncUsers() error {
        userIDs, err := getAllUserIDs()
        if err != nil {
            return err
        }

        var wg sync.WaitGroup
        results := make(chan error, len(userIDs))

        // Cap concurrency at 50 in-flight requests with a semaphore channel.
        sem := make(chan struct{}, 50)

        for _, id := range userIDs {
            wg.Add(1)
            go func(userID string) {
                defer wg.Done()
                sem <- struct{}{}        // acquire semaphore
                defer func() { <-sem }() // release semaphore

                userData, err := fetchUserFromAPI(userID)
                if err != nil {
                    results <- err
                    return
                }
                if err := updateDatabase(userData); err != nil {
                    results <- err
                }
            }(id)
        }

        wg.Wait()
        close(results)

        // Surface worker errors instead of silently discarding them.
        for err := range results {
            if err != nil {
                return err
            }
        }
        return nil
    }

What Are Goroutines?
Starting a Goroutine
Basic Example
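A minimal runnable sketch of starting goroutines and waiting for them. The `doubleAll` helper is an illustrative name, not from the original; it shows the two essentials: launching work with `go` and joining with `sync.WaitGroup`.

```go
package main

import (
	"fmt"
	"sync"
)

// doubleAll doubles each input in its own goroutine, then waits for all of them.
func doubleAll(nums []int) []int {
	out := make([]int, len(nums))
	var wg sync.WaitGroup
	for i, n := range nums {
		wg.Add(1)
		go func(i, n int) { // pass loop variables as arguments to avoid capture bugs
			defer wg.Done()
			out[i] = n * 2 // each goroutine writes a distinct index: no race
		}(i, n)
	}
	wg.Wait() // block until every goroutine has called Done
	return out
}

func main() {
	fmt.Println(doubleAll([]int{1, 2, 3})) // [2 4 6]
}
```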
Channels: Communicating Between Goroutines
Creating and Using Channels
Simple Example
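A small sketch of sending a result between goroutines over a channel; `sum` is an illustrative helper, not from the original.

```go
package main

import "fmt"

// sum sends the total of nums over the channel so the caller can receive it.
func sum(nums []int, ch chan<- int) {
	total := 0
	for _, n := range nums {
		total += n
	}
	ch <- total // send the result to the channel
}

func main() {
	ch := make(chan int) // unbuffered channel of int
	go sum([]int{1, 2, 3, 4}, ch)
	result := <-ch // receive blocks until the goroutine sends
	fmt.Println(result) // 10
}
```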
Real Example: Concurrent URL Checker
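One possible shape for a concurrent URL checker, assuming only the standard library. The `checkURLs` function and the `status` struct are illustrative names; the local `httptest` server stands in for real URLs so the example runs offline.

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"sync"
	"time"
)

type status struct {
	url string
	ok  bool
}

// checkURLs probes every URL concurrently and reports whether each responded.
func checkURLs(urls []string) map[string]bool {
	client := &http.Client{Timeout: 2 * time.Second}
	results := make(chan status)
	var wg sync.WaitGroup
	for _, u := range urls {
		wg.Add(1)
		go func(u string) {
			defer wg.Done()
			resp, err := client.Get(u)
			if err == nil {
				resp.Body.Close()
			}
			results <- status{url: u, ok: err == nil && resp.StatusCode < 400}
		}(u)
	}
	go func() { // close results once all checkers finish
		wg.Wait()
		close(results)
	}()
	up := make(map[string]bool)
	for s := range results {
		up[s.url] = s.ok
	}
	return up
}

func main() {
	// A local test server stands in for a real URL so the example runs offline.
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	}))
	defer srv.Close()

	for url, ok := range checkURLs([]string{srv.URL, "http://127.0.0.1:1"}) {
		fmt.Println(url, ok)
	}
}
```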
Buffered vs Unbuffered Channels
Unbuffered Channels
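With an unbuffered channel, a send blocks until a receiver is ready: the two goroutines meet at the exchange. A tiny sketch (`ping` is an illustrative name):

```go
package main

import "fmt"

// ping sends on an unbuffered channel; the send completes only when it is
// paired with a receive on the other side.
func ping(ch chan string) {
	ch <- "ping" // blocks here until someone receives
}

func main() {
	ch := make(chan string) // no buffer: send and receive must rendezvous
	go ping(ch)
	fmt.Println(<-ch) // ping
}
```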
Buffered Channels
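A buffered channel, by contrast, accepts up to `cap(ch)` sends before any receiver shows up, and receives come out in FIFO order. A minimal sketch:

```go
package main

import "fmt"

func main() {
	// A buffered channel accepts up to cap(ch) sends without a waiting receiver.
	ch := make(chan string, 2)
	ch <- "first"  // does not block: buffer has room
	ch <- "second" // does not block: buffer is now full
	// ch <- "third" would block here until a receive frees a slot.

	fmt.Println(<-ch) // first: channel receives are FIFO
	fmt.Println(<-ch) // second
}
```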
Channel Directions
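Channel parameters can be restricted to send-only (`chan<-`) or receive-only (`<-chan`), letting the compiler catch misuse. A sketch with illustrative `produce`/`consume` helpers:

```go
package main

import "fmt"

// produce only sends: chan<- restricts the parameter to the send side.
func produce(out chan<- int, n int) {
	for i := 1; i <= n; i++ {
		out <- i
	}
	close(out) // the sender closes; receivers must not
}

// consume only receives: <-chan restricts the parameter to the receive side.
func consume(in <-chan int) int {
	total := 0
	for v := range in {
		total += v
	}
	return total
}

func main() {
	ch := make(chan int)
	go produce(ch, 4)
	fmt.Println(consume(ch)) // 1+2+3+4 = 10
}
```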
Closing Channels
Checking if a Channel Is Closed
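The comma-ok form of receive distinguishes a real value from the zero value a closed, drained channel yields. A minimal sketch:

```go
package main

import "fmt"

func main() {
	ch := make(chan int, 1)
	ch <- 7
	close(ch)

	v, ok := <-ch // ok is true: a buffered value was still pending
	fmt.Println(v, ok) // 7 true

	v, ok = <-ch // ok is false: channel is closed and drained, v is the zero value
	fmt.Println(v, ok) // 0 false
}
```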
Ranging Over Channels
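`range` over a channel keeps receiving until the channel is closed, so the producer's `close` is what terminates the loop. A sketch with an illustrative `squares` generator:

```go
package main

import "fmt"

// squares streams n squares over the returned channel and then closes it,
// which is what makes range on the channel terminate.
func squares(n int) <-chan int {
	ch := make(chan int)
	go func() {
		defer close(ch)
		for i := 1; i <= n; i++ {
			ch <- i * i
		}
	}()
	return ch
}

func main() {
	for sq := range squares(4) { // loop ends when squares closes the channel
		fmt.Println(sq)
	}
}
```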
The Select Statement
Select with Default
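A `default` case makes `select` non-blocking: it runs immediately when no other case is ready. A sketch with an illustrative `tryReceive` helper:

```go
package main

import "fmt"

// tryReceive does a non-blocking receive: the default case runs immediately
// when no value is ready on the channel.
func tryReceive(ch <-chan int) (int, bool) {
	select {
	case v := <-ch:
		return v, true
	default:
		return 0, false
	}
}

func main() {
	ch := make(chan int, 1)
	if _, ok := tryReceive(ch); !ok {
		fmt.Println("nothing ready")
	}
	ch <- 42
	if v, ok := tryReceive(ch); ok {
		fmt.Println("got", v)
	}
}
```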
Select with Timeout
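Pairing a receive case with `time.After` bounds how long you wait. A sketch, with `receiveWithin` as an illustrative helper name:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// receiveWithin waits for a value but gives up after the timeout elapses.
func receiveWithin(ch <-chan string, timeout time.Duration) (string, error) {
	select {
	case v := <-ch:
		return v, nil
	case <-time.After(timeout):
		return "", errors.New("timed out")
	}
}

func main() {
	ch := make(chan string, 1)
	ch <- "prompt reply"
	v, _ := receiveWithin(ch, 50*time.Millisecond)
	fmt.Println(v)

	_, err := receiveWithin(ch, 50*time.Millisecond) // nothing arrives this time
	fmt.Println(err)
}
```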
Real Example: Worker with Timeout
Real Example: Concurrent Data Processor
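One common shape for this kind of processor is a worker pool: a fixed number of goroutines pulling jobs from a shared channel. The `process` function below is an illustrative sketch, not the chapter's exact code.

```go
package main

import (
	"fmt"
	"sync"
)

// process runs fn over items with nWorkers goroutines pulling from a shared
// jobs channel: a classic worker-pool shape.
func process(items []int, nWorkers int, fn func(int) int) []int {
	jobs := make(chan int)
	results := make(chan int, len(items))

	var wg sync.WaitGroup
	for w := 0; w < nWorkers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for item := range jobs { // each worker drains jobs until closed
				results <- fn(item)
			}
		}()
	}

	for _, item := range items {
		jobs <- item
	}
	close(jobs) // lets the workers' range loops finish

	wg.Wait()
	close(results)

	out := make([]int, 0, len(items))
	for r := range results {
		out = append(out, r)
	}
	return out
}

func main() {
	out := process([]int{1, 2, 3, 4, 5}, 3, func(n int) int { return n * n })
	fmt.Println(len(out), "results") // completion order is not guaranteed
}
```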
Common Pitfalls and How to Avoid Them
1. Goroutine Leaks
2. Channel Deadlocks
3. Closing Channels Multiple Times
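Closing an already-closed channel panics. When several code paths may need to close, one common pattern is to guard the close with `sync.Once`; the `closer` wrapper below is an illustrative sketch:

```go
package main

import (
	"fmt"
	"sync"
)

// closer wraps a channel so close is safe to request from multiple places:
// sync.Once guarantees the underlying close runs exactly once.
type closer struct {
	ch   chan struct{}
	once sync.Once
}

func (c *closer) close() {
	c.once.Do(func() { close(c.ch) })
}

func main() {
	c := &closer{ch: make(chan struct{})}
	c.close()
	c.close() // a second plain close(c.ch) would panic; this is a no-op
	_, open := <-c.ch
	fmt.Println("open:", open) // open: false
}
```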
4. Sending to Closed Channel
5. Data Races
Your Challenge
Key Takeaways
What I Learned