The Dapper project is a database tool that makes working with SQL in .NET projects clean and understandable, provided you know SQL.

As an open source project it lends itself to understanding and extension, but there are areas it has deliberately stayed out of to avoid scope creep and inconsistencies. The two main ones are the per-database bulk loading features and the SQL Data Definition statements of each database it supports.

In my experience, large scale bulk insertion can cause table locking that escalates to deadlocks on the entire table, so using a temp table as the insertion target is a cautious approach to loading the data.

The point of using the DataReader and the DataTable (the call to AcceptChanges is optional, but it allows you to check the integrity of the loaded table values before opening a SQL transaction) is to load the inputs without having to dive into the details of the structures and statements. As stated above, the limit of roughly 100,000 records on this approach works fine, should fit within your database server's memory, and performs reasonably well (please test against your own data to confirm).


// Requires: Dapper, Newtonsoft.Json, System.Data, System.Data.SqlClient

// Quickly convert the list to a DataTable; this serialize -> deserialize hack
// makes the bulk copy possible and works with record sets of around 100,000 rows.
var asTable = JsonConvert.DeserializeObject<DataTable>(JsonConvert.SerializeObject(mySetOfRecords));
asTable.AcceptChanges(); // optional: lets you verify the loaded values before opening a transaction

using (var connection = new SqlConnection(connectionString))
{
	connection.Open();
	using (var trans = connection.BeginTransaction("MyTestTransaction"))
	{
		// Clone the target table's structure into an empty temp table
		connection.Execute(
			@"SELECT TOP (0) [Id],[Name]
			INTO [#TempInsert]
			FROM [dbo].[MyTargetTable]", null, trans);

		// Quick load with bulk insert to then merge fast
		using (var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, trans))
		{
			bulkCopy.BulkCopyTimeout = (int) TimeSpan.FromSeconds(30).TotalSeconds;
			bulkCopy.BatchSize = 100;
			bulkCopy.DestinationTableName = "#TempInsert";
			bulkCopy.WriteToServer(asTable);
		}

		// Do update, insert, or merge here against the original target table
		// from the bulk loaded temp table, then commit
		trans.Commit();
	}
}
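For the final commented step, a T-SQL MERGE executed on the same transaction is one option. This is only a sketch: the column names [Id] and [Name] and the table names follow the example above, and you should adjust the match condition and column list to your actual schema.

```sql
-- Upsert from the bulk loaded temp table into the original target table
MERGE [dbo].[MyTargetTable] AS target
USING [#TempInsert] AS source
    ON target.[Id] = source.[Id]
WHEN MATCHED THEN
    UPDATE SET target.[Name] = source.[Name]
WHEN NOT MATCHED THEN
    INSERT ([Id], [Name]) VALUES (source.[Id], source.[Name]);
```

Because the temp table only lives for the duration of the connection, running this via the same Dapper Execute call pattern (passing the open transaction) keeps the whole load-and-merge atomic.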