Build Custom Stores and Destinations in 30 minutes
Learn ETL's extension patterns by implementing working custom components
What You'll Build
By the end of this tutorial, you'll have:
- A working custom in-memory store that logs all operations for debugging
- A custom HTTP destination that sends data with automatic retries
- A complete pipeline using your custom components that processes real data
Time required: 30 minutes
Prerequisites: Advanced Rust knowledge, a running Postgres instance, basic HTTP knowledge
Step 1: Create Project Structure
Create a new Rust project for your custom ETL components:
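Based on the expected output below, the commands are presumably:

```shell
cargo new etl-custom --lib
cd etl-custom
```
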
Result: You should see "Created library `etl-custom` package" in the output.
Step 2: Add Dependencies
Replace your Cargo.toml with the required dependencies:
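A plausible manifest is sketched below. The etl dependency's Git source and the exact versions are assumptions; adjust them to the release you are following:

```toml
[package]
name = "etl-custom"
version = "0.1.0"
edition = "2021"

[dependencies]
# Core ETL crate (source is an assumption; point this at the version you use)
etl = { git = "https://github.com/supabase/etl" }
tokio = { version = "1", features = ["full"] }      # async runtime
reqwest = { version = "0.12", features = ["json"] } # HTTP client for the destination
serde = { version = "1", features = ["derive"] }    # (de)serialization
serde_json = "1"
async-trait = "0.1"                                  # async methods in trait impls
tracing = "0.1"                                      # structured logging
tracing-subscriber = "0.3"
```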
Result: Running cargo check should download dependencies without errors.
Step 3: Create Custom Store Implementation
Create src/custom_store.rs with a dual-storage implementation and cleanup primitives:
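The pattern this file implements can be sketched in a self-contained form. The names below (DebugStore, PersistentBackend) are illustrative, not the etl crate's actual trait API; in the real file you implement the crate's store traits on top of this structure:

```rust
use std::collections::HashMap;
use std::sync::{Arc, Mutex};

/// Simulated persistent layer (stands in for Postgres-backed storage).
#[derive(Default)]
pub struct PersistentBackend {
    rows: Mutex<HashMap<String, String>>,
}

impl PersistentBackend {
    pub fn write(&self, key: &str, value: &str) {
        self.rows.lock().unwrap().insert(key.to_string(), value.to_string());
    }
    pub fn read_all(&self) -> HashMap<String, String> {
        self.rows.lock().unwrap().clone()
    }
}

/// In-memory store that logs every operation and mirrors writes
/// to a persistent backend (the dual-write pattern).
pub struct DebugStore {
    cache: Arc<Mutex<HashMap<String, String>>>,
    backend: Arc<PersistentBackend>,
}

impl DebugStore {
    /// Startup loading: warm the cache from the persistent layer.
    pub fn new(backend: Arc<PersistentBackend>) -> Self {
        let initial = backend.read_all();
        println!("[store] loaded {} entries from persistent storage", initial.len());
        Self { cache: Arc::new(Mutex::new(initial)), backend }
    }

    /// Cache-first read: never touches the backend on the hot path.
    pub fn get(&self, key: &str) -> Option<String> {
        let hit = self.cache.lock().unwrap().get(key).cloned();
        println!("[store] get {key:?} -> hit={}", hit.is_some());
        hit
    }

    /// Dual-write: update the cache and the backend together.
    pub fn put(&self, key: &str, value: &str) {
        println!("[store] put {key:?}");
        self.cache.lock().unwrap().insert(key.to_string(), value.to_string());
        self.backend.write(key, value);
    }

    /// Cleanup primitive: drop cached state (the backend keeps its copy).
    pub fn clear_cache(&self) {
        println!("[store] clearing cache");
        self.cache.lock().unwrap().clear();
    }
}

fn main() {
    let backend = Arc::new(PersistentBackend::default());
    let store = DebugStore::new(Arc::clone(&backend));
    store.put("table_1", "schema_v1");
    assert_eq!(store.get("table_1"), Some("schema_v1".to_string()));

    // After a cache wipe, a restart (new store) recovers from the backend.
    store.clear_cache();
    let restarted = DebugStore::new(backend);
    assert_eq!(restarted.get("table_1"), Some("schema_v1".to_string()));
}
```

The Arc/Mutex wrapping keeps the store safely shareable across the pipeline's worker tasks.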
Result: Your file should compile without errors when you run cargo check.
Step 4: Create HTTP Destination Implementation
Create src/http_destination.rs with retry logic and proper error handling:
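The core of this file, the retry and error-classification logic, can be sketched independently of the HTTP client. The names (send_with_retry, SendError) and the backoff constants are illustrative; in the real file this wraps a reqwest call and implements the etl crate's destination trait:

```rust
use std::thread::sleep;
use std::time::Duration;

#[derive(Debug, PartialEq)]
pub enum SendError {
    /// 4xx: the request itself is wrong; retrying cannot help.
    Permanent(u16),
    /// 5xx / transport errors: retries exhausted without success.
    Exhausted(u16),
}

/// Retry a send operation with exponential backoff.
/// `send` returns the HTTP status code (in real code, a reqwest call).
pub fn send_with_retry<F>(mut send: F, max_attempts: u32) -> Result<(), SendError>
where
    F: FnMut() -> u16,
{
    let mut delay = Duration::from_millis(10); // illustrative base delay
    for attempt in 1..=max_attempts {
        match send() {
            200..=299 => return Ok(()),
            // Smart classification: 4xx is a client error, fail fast.
            status @ 400..=499 => return Err(SendError::Permanent(status)),
            status => {
                println!("[dest] attempt {attempt} failed with {status}, backing off {delay:?}");
                if attempt == max_attempts {
                    return Err(SendError::Exhausted(status));
                }
                sleep(delay);
                delay *= 2; // exponential backoff
            }
        }
    }
    unreachable!("loop always returns")
}

fn main() {
    // Succeeds on the third attempt after two 503s.
    let mut statuses = vec![503, 503, 200].into_iter();
    assert_eq!(send_with_retry(|| statuses.next().unwrap(), 5), Ok(()));

    // A 400 fails immediately, with no retries.
    let mut calls = 0;
    let result = send_with_retry(|| { calls += 1; 400 }, 5);
    assert_eq!(result, Err(SendError::Permanent(400)));
    assert_eq!(calls, 1);
}
```

Keeping the classification in one match arm makes the retry policy easy to audit and unit-test without a live server.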
Result: Run cargo check again; it should compile successfully with both your store and destination implementations.
Step 5: Create Working Pipeline Example
Create src/main.rs that demonstrates your custom components in action:
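The data flow this file wires up can be captured in a miniature, self-contained form. All names here are illustrative stand-ins, not the etl crate's pipeline API; the real main.rs constructs the crate's pipeline with your store and destination instead:

```rust
// Miniature of the pipeline wiring: rows flow from a source, state is
// recorded in the store, and data ships to the destination in batches.
struct MemoryStore(Vec<String>);
struct HttpDestination { sent_batches: usize }

impl HttpDestination {
    fn write_batch(&mut self, batch: &[String]) {
        if batch.is_empty() {
            return; // skip empty batches instead of issuing a request
        }
        println!("[dest] sending batch of {} rows", batch.len());
        self.sent_batches += 1;
    }
}

fn run_pipeline(rows: Vec<String>, batch_size: usize) -> (MemoryStore, HttpDestination) {
    let mut store = MemoryStore(Vec::new());
    let mut dest = HttpDestination { sent_batches: 0 };
    for batch in rows.chunks(batch_size) {
        store.0.extend(batch.iter().cloned()); // record state
        dest.write_batch(batch);               // ship to the API
    }
    (store, dest)
}

fn main() {
    let rows: Vec<String> = (1..=5).map(|i| format!("row-{i}")).collect();
    let (store, dest) = run_pipeline(rows, 2);
    println!("stored {} rows in {} batches", store.0.len(), dest.sent_batches);
    // prints: stored 5 rows in 3 batches
}
```
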
Result: Running cargo run should now start your pipeline and show detailed logs from your custom components.
Step 6: Test Your Implementation
Verify your custom components work correctly:
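Given the expected output below, this is presumably a compile check followed by a build:

```shell
cargo check
cargo build
```
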
Result: You should see "Finished dev [unoptimized + debuginfo] target(s)" in the output.
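Presumably a run with logging enabled; the RUST_LOG filter is an assumption, but it is the standard way to surface tracing output through tracing-subscriber's env filter:

```shell
RUST_LOG=info cargo run
```
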
Result: You should see logs from your custom store being created and HTTP destination being configured.
Checkpoint: What You've Built
You now have working custom ETL components:
✅ Custom Store: Implements dual-layer caching with detailed logging
✅ HTTP Destination: Sends data via HTTP with automatic retry logic
✅ Complete Pipeline: Integrates both components with ETL's core engine
✅ Proper Error Handling: Follows ETL's error patterns and logging
Key Patterns You've Mastered
Store Architecture:
- Cache-first reads for performance
- Dual-write pattern for data consistency
- Startup loading from persistent storage
- Thread-safe concurrent access with Arc/Mutex
Destination Patterns:
- Exponential backoff retry logic
- Smart error classification (retry 5xx, fail 4xx)
- Efficient batching and empty batch handling
- Clean data transformation from ETL to API formats
Next Steps
- Connect to real Postgres → Configure Postgres for Replication
- Understand the architecture → ETL Architecture
- Contribute thoughtfully → Open an issue before proposing a new destination; we currently accept new destinations only when there is clear, broad demand due to the maintenance cost.
See Also
- ETL Architecture - Understanding the system design
- API Reference - Complete trait documentation
- Build your first pipeline - Start with the basics if you haven't yet