---
title: "Batch Insert"
description: "High-performance bulk insert operations for large datasets."
canonical: "https://roomsharp.dev/docs/v0.4.7/batch-insert"
source: "src/content/v0.4.7/batch-insert.mdx"
---

# Batch Insert

RoomSharp provides high-performance batch insert operations optimized for each database provider.

## Basic Batch Insert

Use `[Insert]` with collection parameters for batch operations:

```csharp
[Dao]
public interface ITodoDao
{
    [Insert]
    Task<long[]> InsertAsync(IEnumerable<Todo> todos);

    [Insert]
    Task<long[]> InsertAsync(Todo[] todos);

    [Insert]
    long[] Insert(List<Todo> todos);
}
```

## Provider-Specific Optimizations

Each provider uses an optimized strategy for bulk inserts:

| Provider | Strategy | Notes |
|----------|----------|-------|
| **SQLite** | Multi-value INSERT | `INSERT INTO t (a,b) VALUES (?,?),(?,?),...` |
| **SQL Server** | Table-valued parameters | Uses structured TVPs for best performance |
| **PostgreSQL** | COPY / unnest | Uses `COPY` or `unnest` for bulk loading |
| **MySQL** | Multi-value INSERT | Extended insert syntax |

## Async Streaming Insert

For very large datasets, use `IAsyncEnumerable<Todo>`:

```csharp
[Dao]
public interface ITodoDao
{
    [Insert]
    Task<long[]> InsertAsync(IAsyncEnumerable<Todo> todos);
}

// Usage
async IAsyncEnumerable<Todo> GenerateTodos()
{
    for (int i = 0; i < 10_000; i++)
    {
        yield return new Todo { Title = $"Task {i}" };
    }
}

var ids = await todoDao.InsertAsync(GenerateTodos());
```

`IAsyncEnumerable<T>` is ideal for streaming data from files, APIs, or other sources without loading everything into memory.

## Transaction Batching

Batch inserts within a transaction:

```csharp
await db.RunInTransactionAsync(async () =>
{
    var todos = Enumerable.Range(0, 1000)
        .Select(i => new Todo { Title = $"Batch item {i}" });

    await db.TodoDao.InsertAsync(todos);
});
```

## Performance Tips

1. **Use arrays or lists** - Pre-sized collections are faster than lazy `IEnumerable<T>` sequences
2. **Batch appropriately** - SQLite works best with 500-1000 items per batch
3. **Disable constraints temporarily** - For large imports, consider dropping non-essential indexes and rebuilding them afterward
4. **Use transactions** - Batch inserts are faster within a transaction
5. **Monitor memory** - Very large batches may cause memory pressure

## Batch Size Configuration

For very large datasets, batch internally to avoid memory and SQL limits:

```csharp
public async Task ImportLargeDataset(IEnumerable<Todo> todos)
{
    const int batchSize = 1000;

    await db.RunInTransactionAsync(async () =>
    {
        foreach (var batch in todos.Chunk(batchSize))
        {
            await db.TodoDao.InsertAsync(batch);
        }
    });
}
```

## Return Values

Batch insert methods return arrays of generated IDs:

```csharp
var todos = new[]
{
    new Todo { Title = "Task 1" },
    new Todo { Title = "Task 2" },
    new Todo { Title = "Task 3" }
};

var ids = await todoDao.InsertAsync(todos);
// ids = [1, 2, 3] (generated primary keys)
```

Not all providers support returning IDs for batch inserts. Check the provider documentation for specifics.

## Comparison with Single Insert

| Method | Records | Time | Speedup |
|--------|---------|------|---------|
| Single Insert (loop) | 5,000 | ~2.5s | 1x |
| Batch Insert | 5,000 | ~50ms | **~50x** |

Batch inserts can be 30-100x faster than individual inserts in a loop, especially with SQLite and WAL mode enabled.
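
Much of that speedup comes from the multi-value INSERT strategy shown in the provider table: the statement is prepared and the round trip paid once per batch instead of once per row. The sketch below illustrates the idea in isolation; `MultiValueInsertSketch` and `BuildMultiValueInsert` are hypothetical names for illustration, not part of the RoomSharp API.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical helper showing how a batch of values can be flattened into
// a single parameterized multi-value INSERT (the SQLite/MySQL strategy).
public static class MultiValueInsertSketch
{
    // Builds one statement covering the whole batch, e.g. for three rows:
    //   INSERT INTO todos (title) VALUES (?),(?),(?)
    // with one bound parameter per row.
    public static (string Sql, object[] Parameters) BuildMultiValueInsert(
        string table, string column, IReadOnlyList<string> values)
    {
        var placeholders = string.Join(",", Enumerable.Repeat("(?)", values.Count));
        var sql = $"INSERT INTO {table} ({column}) VALUES {placeholders}";
        return (sql, values.Cast<object>().ToArray());
    }
}
```

A 1,000-item batch thus executes as one statement with 1,000 bound parameters rather than 1,000 separate statements. Real providers also have to respect engine limits (maximum bound parameters per statement, maximum statement length), which is another reason to chunk very large inputs as shown in Batch Size Configuration above.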