Delete a huge amount of data from a SQL Server table

31 August, 2014 by Jack Vamvas

Data volumes are growing, and with them the need to process DELETE operations faster. How can you speed up a DELETE against a table that holds a very large amount of data?

The DELETE DML operation is fully logged and cannot take advantage of minimal (bulk) logging, so the following pattern exploits the minimal logging available to SELECT … INTO and INSERT INTO … SELECT instead (a sketch of the three steps follows below).

Step 1 – Identify the data which won't be deleted and isolate it with SELECT … INTO

Step 2 – Use TRUNCATE TABLE to empty the original table. TRUNCATE removes all rows from the table without logging each delete individually

Step 3 – Complete the operation by moving the preserved rows back with INSERT INTO … SELECT and the TABLOCK table hint
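Here is a minimal T-SQL sketch of the three steps. The table dbo.SalesHistory, the staging table dbo.SalesHistory_Keep and the SaleDate filter are hypothetical names used only for illustration; minimal logging for steps 1 and 3 also assumes the database is in the SIMPLE or BULK_LOGGED recovery model and that the other minimal-logging conditions for INSERT … SELECT WITH (TABLOCK) are met.

-- Step 1: copy the rows you want to keep into a new table
-- (SELECT ... INTO can be minimally logged)
SELECT *
INTO dbo.SalesHistory_Keep
FROM dbo.SalesHistory
WHERE SaleDate >= '20140101';

-- Step 2: empty the original table; TRUNCATE TABLE deallocates pages
-- rather than logging every row delete
TRUNCATE TABLE dbo.SalesHistory;

-- Step 3: move the preserved rows back; the TABLOCK hint allows
-- minimal logging for INSERT ... SELECT where the conditions are met
INSERT INTO dbo.SalesHistory WITH (TABLOCK)
SELECT *
FROM dbo.SalesHistory_Keep;

-- Remove the staging table once the data has been verified
DROP TABLE dbo.SalesHistory_Keep;

Before using this pattern in production, check for foreign keys, triggers and replication on the table, as TRUNCATE TABLE is not permitted in those situations.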

For further tips on optimising DELETE in different scenarios, read SQL Server – Optimize delete from table - delete in sql

Read More

SQL Server Query Optimizer and Statistics - SQL Server DBA

Find queries causing Lock Timeouts per - SQL Server DBA

SQL Server measure I/O transaction rates and sys.dm_io_virtual_file_stats

Data Loading Performance Guide - SQL Server DBA


Author: Jack Vamvas (http://www.sqlserver-dba.com)



