SQL Server 2008 INSERT Optimization

I have to INSERT a lot of rows (more than …) into a SQL Server database. The table has an auto-increment identity Id, two varchar(80) columns and a smalldatetime column with GETDATE() as its default value. That last column is only there for auditing, but it is necessary.

I’d like to know the best (fastest) way to INSERT the rows. I’ve been reading about BULK INSERT, but if possible I’d like to avoid it, because the app does not run on the same server where the database is hosted and I’d like to keep them as isolated as possible.

Thanks!


    3 Solutions collected from the web for “SQL Server 2008 INSERT Optimization”

    Another option would be bcp, SQL Server’s command-line bulk copy utility.

    Alternatively, if you’re using .NET you can use the SqlBulkCopy class to bulk insert the data. I’ve blogged about its performance, comparing SqlBulkCopy with another way of bulk loading data into SQL Server from .NET (using SqlDataAdapter), which you may find interesting. In a basic example, loading 100,000 rows took 0.8229s with SqlBulkCopy vs. 25.0729s with the SqlDataAdapter approach.
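
    To make that concrete, here is a minimal SqlBulkCopy sketch in C#. The connection string, the dbo.MyTable name and the ColA/ColB column names are placeholders made up for this example rather than anything from the question, and the identity and smalldatetime audit columns are deliberately left unmapped so the server fills them in with its own values.

        using System.Collections.Generic;
        using System.Data;
        using Microsoft.Data.SqlClient; // or System.Data.SqlClient on older stacks

        class BulkLoader
        {
            static void Load(string connectionString, IEnumerable<(string A, string B)> rows)
            {
                // Stage the rows in a DataTable whose columns match the two varchar(80) columns.
                var table = new DataTable();
                table.Columns.Add("ColA", typeof(string));
                table.Columns.Add("ColB", typeof(string));
                foreach (var (a, b) in rows)
                    table.Rows.Add(a, b);

                using var connection = new SqlConnection(connectionString);
                connection.Open();

                using var bulkCopy = new SqlBulkCopy(connection)
                {
                    DestinationTableName = "dbo.MyTable", // hypothetical target table
                    BatchSize = 10_000                    // send rows in batches rather than one huge chunk
                };
                // Map only the two varchar columns; the identity column and the
                // smalldatetime default (GETDATE()) are filled in by the server.
                bulkCopy.ColumnMappings.Add("ColA", "ColA");
                bulkCopy.ColumnMappings.Add("ColB", "ColB");

                bulkCopy.WriteToServer(table);
            }
        }

    WriteToServer also accepts an IDataReader, so if the source data can be streamed you can avoid materialising everything in a DataTable first.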

    Create an SSIS package that copies the file to the SQL Server machine and then uses a data flow task to import the data from the file into the SQL Server database.

    There is no faster or more efficient way than BULK INSERT, and when you’re dealing with such a large amount of data, do not even think about anything from .NET, because thanks to the GC, managing millions of objects in memory causes massive performance degradation.
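
    For reference, a minimal sketch of what such a BULK INSERT could look like, submitted here from C# purely so the statement has a runnable home; the table name, file path and terminators are hypothetical, and the file has to be readable from the SQL Server machine itself, which is exactly the isolation concern raised in the question.

        using Microsoft.Data.SqlClient; // or System.Data.SqlClient on older stacks

        class BulkInsertRunner
        {
            static void Run(string connectionString)
            {
                // The statement executes on the server, so the file path below must be
                // visible to the SQL Server service on the database machine, not to the client.
                const string sql = @"
                    BULK INSERT dbo.MyTable
                    FROM 'D:\loads\rows.csv'
                    WITH (
                        FIELDTERMINATOR = ',',
                        ROWTERMINATOR   = '\n',
                        TABLOCK            -- table-level bulk update lock for a faster load
                    );";

                using var connection = new SqlConnection(connectionString);
                connection.Open();
                using var command = new SqlCommand(sql, connection);
                command.CommandTimeout = 0; // large loads can blow past the default 30-second timeout
                command.ExecuteNonQuery();
            }
        }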
