Part 4: Async Programming with Tokio - Concurrent Execution
Introduction
Why Async in Rust?
The Problem: Waiting on I/O
// Synchronous code - blocking
fn test_payloads_sync(payloads: &[Payload]) {
    for payload in payloads {
        let response = make_http_request(payload); // Blocks here
        process_response(response);
    }
}
// 280 payloads × 100ms per request = 28 seconds!

The Solution: Async/Await
// Asynchronous code - concurrent
async fn test_payloads_async(payloads: &[Payload]) {
    let mut tasks = Vec::new();
    for payload in payloads {
        let payload = payload.clone(); // spawned tasks need owned data (requires Payload: Clone)
        let task = tokio::spawn(async move {
            let response = make_http_request(&payload).await; // Doesn't block
            process_response(response).await
        });
        tasks.push(task);
    }
    for task in tasks {
        task.await.unwrap();
    }
}
// 280 payloads running concurrently = ~2 seconds!

Setting Up Tokio
Add to Cargo.toml
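A minimal dependency set for the snippets in this part might look like the following; the exact versions, and the reqwest and futures crates, are assumptions (the "full" feature enables every Tokio component):

[dependencies]
tokio = { version = "1", features = ["full"] }
reqwest = "0.11"
futures = "0.3"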
Async Main Function
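Tokio's #[tokio::main] attribute builds the runtime and lets main itself be async. A minimal sketch:

use std::time::Duration;

#[tokio::main]
async fn main() {
    // The attribute expands into code that creates a Tokio runtime
    // and blocks on this async body.
    tokio::time::sleep(Duration::from_millis(100)).await;
    println!("runtime is running");
}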
Async/Await Basics
Async Functions Return Futures
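Calling an async fn does not execute its body; it returns a value implementing Future, and the body runs only when that future is awaited. A small illustration (fetch_status is a hypothetical helper):

async fn fetch_status() -> u16 {
    // Imagine an HTTP call here; we just return a value.
    200
}

async fn caller() {
    let future = fetch_status(); // nothing has run yet
    let status = future.await;   // the body executes here
    assert_eq!(status, 200);
}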
Futures Are Lazy
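Because futures are lazy, creating one has no side effects until it is awaited or spawned. A sketch that makes the ordering visible:

async fn say_hello() {
    println!("hello");
}

#[tokio::main]
async fn main() {
    let greeting = say_hello(); // nothing printed yet
    println!("before await");
    greeting.await;             // "hello" prints only now
}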
Real Example: HTTP Requests
Single Async Request
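Assuming the reqwest crate, a single request is a chain of awaits (the URL parameter is a placeholder):

async fn fetch_one(url: &str) -> Result<String, reqwest::Error> {
    let response = reqwest::get(url).await?; // send the request
    let body = response.text().await?;       // read the body
    Ok(body)
}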
Concurrent Requests
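To run many requests at once, spawn one task per URL and then await the handles. A sketch, again assuming reqwest:

async fn fetch_all(urls: Vec<String>) -> Vec<Result<String, reqwest::Error>> {
    let mut handles = Vec::new();
    for url in urls {
        handles.push(tokio::spawn(async move {
            reqwest::get(url.as_str()).await?.text().await
        }));
    }

    let mut results = Vec::new();
    for handle in handles {
        // The outer unwrap surfaces task panics; the inner Result is the HTTP outcome.
        results.push(handle.await.unwrap());
    }
    results
}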
Controlling Concurrency
Problem: Too Many Concurrent Requests
Solution: Semaphore for Limiting
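tokio::sync::Semaphore caps how many tasks are past the acquire point at any moment: each task takes a permit before sending and releases it when the permit drops. A sketch with an assumed limit of 10:

use std::sync::Arc;
use tokio::sync::Semaphore;

async fn fetch_with_limit(urls: Vec<String>) {
    let semaphore = Arc::new(Semaphore::new(10)); // at most 10 requests in flight
    let mut handles = Vec::new();

    for url in urls {
        let sem = Arc::clone(&semaphore);
        handles.push(tokio::spawn(async move {
            // Wait for a free permit; it is returned when _permit is dropped.
            let _permit = sem.acquire_owned().await.unwrap();
            let _ = reqwest::get(url.as_str()).await;
        }));
    }

    for handle in handles {
        handle.await.unwrap();
    }
}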
Better Solution: Stream with Concurrency Limit
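The futures crate's buffer_unordered drives a fixed number of futures at a time without manual permit bookkeeping. A sketch with an assumed limit of 10:

use futures::stream::{self, StreamExt};

async fn fetch_buffered(urls: Vec<String>) -> Vec<Result<String, reqwest::Error>> {
    stream::iter(urls)
        .map(|url| async move {
            reqwest::get(url.as_str()).await?.text().await
        })
        .buffer_unordered(10) // at most 10 requests in flight
        .collect()
        .await
}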
Rate Limiting
Adding Delays Between Requests
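The simplest form of rate limiting is a fixed pause between requests with tokio::time::sleep. A sketch with an assumed 100 ms gap:

use std::time::Duration;
use tokio::time::sleep;

async fn fetch_politely(urls: &[String]) {
    for url in urls {
        let _ = reqwest::get(url.as_str()).await; // one request at a time
        sleep(Duration::from_millis(100)).await;  // then pause before the next
    }
}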
Token Bucket Rate Limiter
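A token bucket allows short bursts while enforcing an average rate: tokens refill continuously, each request spends one, and callers wait when the bucket is empty. A minimal single-task sketch; the capacity and refill rate are assumptions, not any particular library's API:

use std::time::Duration;
use tokio::time::{sleep, Instant};

struct TokenBucket {
    capacity: f64,
    tokens: f64,
    refill_per_sec: f64,
    last_refill: Instant,
}

impl TokenBucket {
    fn new(capacity: f64, refill_per_sec: f64) -> Self {
        Self { capacity, tokens: capacity, refill_per_sec, last_refill: Instant::now() }
    }

    // Wait until a token is available, then spend it.
    async fn acquire(&mut self) {
        loop {
            // Credit tokens for the time elapsed since the last refill.
            let elapsed = self.last_refill.elapsed().as_secs_f64();
            self.tokens = (self.tokens + elapsed * self.refill_per_sec).min(self.capacity);
            self.last_refill = Instant::now();

            if self.tokens >= 1.0 {
                self.tokens -= 1.0;
                return;
            }
            // Sleep roughly until the next token should arrive.
            sleep(Duration::from_secs_f64(1.0 / self.refill_per_sec)).await;
        }
    }
}

Callers would await acquire() before each request; sharing one bucket across spawned tasks would additionally need something like a tokio::sync::Mutex around it.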
Real Implementation from My Scanner
Async Error Handling
Propagating Errors in Async
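The ? operator works inside async fn exactly as in synchronous code, provided the function returns a Result. A sketch reusing reqwest's error type:

async fn fetch_and_check(url: &str) -> Result<u16, reqwest::Error> {
    let response = reqwest::get(url).await?; // network errors bubble up here
    Ok(response.status().as_u16())
}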
Timeout for Async Operations
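tokio::time::timeout wraps any future and returns an error if it does not complete in time. A sketch with an assumed 5-second cap:

use std::time::Duration;
use tokio::time::timeout;

async fn fetch_with_timeout(url: &str) -> Option<String> {
    match timeout(Duration::from_secs(5), reqwest::get(url)).await {
        Ok(Ok(response)) => response.text().await.ok(), // finished in time
        Ok(Err(_)) => None,                             // request failed
        Err(_) => None,                                 // timed out
    }
}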
Channels for Communication
Sending Results Between Tasks
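A tokio::sync::mpsc channel lets worker tasks push results to a single consumer. A sketch with an assumed buffer size of 32:

use tokio::sync::mpsc;

#[tokio::main]
async fn main() {
    let (tx, mut rx) = mpsc::channel::<String>(32);

    for i in 0..5 {
        let tx = tx.clone();
        tokio::spawn(async move {
            // Each worker sends one result; send only fails if the receiver is gone.
            let _ = tx.send(format!("result {}", i)).await;
        });
    }
    drop(tx); // close the channel so recv() returns None once the workers finish

    while let Some(result) = rx.recv().await {
        println!("{}", result);
    }
}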
Select - Racing Futures
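tokio::select! polls several futures and runs the branch of whichever finishes first, dropping the rest. A sketch racing a request against a timer; the URL and the 2-second limit are placeholders:

use std::time::Duration;
use tokio::time::sleep;

async fn race(url: &str) {
    tokio::select! {
        result = reqwest::get(url) => {
            println!("request finished first: {:?}", result.map(|r| r.status()));
        }
        _ = sleep(Duration::from_secs(2)) => {
            println!("timer won; the request future was dropped");
        }
    }
}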
Join - Waiting for Multiple Futures
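tokio::join! runs a fixed set of futures concurrently on the same task and waits for all of them. A sketch joining two requests with placeholder URLs:

async fn fetch_pair() {
    let (a, b) = tokio::join!(
        reqwest::get("https://example.com/a"),
        reqwest::get("https://example.com/b"),
    );
    println!("{:?} / {:?}", a.map(|r| r.status()), b.map(|r| r.status()));
}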
Testing Async Code
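The #[tokio::test] attribute gives each test its own runtime, so the test body can await directly. A sketch with a hypothetical double helper:

async fn double(x: u32) -> u32 {
    x * 2
}

#[tokio::test]
async fn doubles_the_input() {
    // The attribute builds a runtime and runs this async test on it.
    assert_eq!(double(21).await, 42);
}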
Key Takeaways
Common Patterns
Next in Series
Part 5: HTTP Clients and Real-World Integration - Complete Application