Minimizing table locking on a SqlDataReader loop?

.net c# dapper sql-server

Question

I have the following scenario (exception handling removed, code simplified):

SqlDataReader rdr = executeQuery(query);
while (rdr.Read()) {
   //There are thousands of records, so this happens thousands of times
   handleRecord(rdr); //Takes a couple of seconds
}
rdr.Close();

As a result, the query holds shared locks on the table (on an overloaded SQL Server 2000 instance) for a couple of hours or more, depending on the number of records.

These locks sometimes affect other applications querying this database, so I've been asked to release the locks as soon as possible.

So, short of the obvious

List<Record> rList = new List<Record>();
SqlDataReader rdr = executeQuery(query);
while (rdr.Read()) {
   //There are thousands of records, so this happens thousands of times
   storeRecord(rList,rdr); //Takes milliseconds
}
rdr.Close();
foreach (Record r in rList) {
    handleRecord(r);
}

which would limit the number of records I can handle to what fits in the machine's memory, is there any other alternative?

(I'm slowly moving this app to Dapper.NET, so it's already a dependency for other parts of the code, in case there's something in Dapper that could help with this scenario.)

Popular Answer

It's an old topic, but here's a trick I sometimes use:

  1. Order your SQL query by the primary key.
  2. Select the top N records.
  3. Read/buffer those N records.
  4. Process the buffered records.
  5. Remember the highest id seen.

Then go back to step 2 with one modification: select the top N records WHERE id > the highest id.

In this way you can read and process batches of data.
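The steps above can be sketched in C#. The database read is stubbed here with an in-memory list so the batching logic stands alone; in the real code each `FetchBatch` call would run a short-lived query such as `SELECT TOP (@batchSize) * FROM MyTable WHERE Id > @lastId ORDER BY Id` (the table and column names are placeholders), for example via Dapper's `connection.Query<Record>(...)`:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Stand-in for the table; in production this is the SQL Server table.
var table = Enumerable.Range(1, 25)
    .Select(i => new Record(i, "row" + i))
    .ToList();

// Simulates: SELECT TOP (@batchSize) * FROM MyTable WHERE Id > @lastId ORDER BY Id
// With Dapper this would be roughly:
//   conn.Query<Record>(
//       "SELECT TOP (@n) * FROM MyTable WHERE Id > @last ORDER BY Id",
//       new { n = batchSize, last = lastId }).ToList();
List<Record> FetchBatch(int lastId, int batchSize) =>
    table.Where(r => r.Id > lastId).OrderBy(r => r.Id).Take(batchSize).ToList();

int lastId = 0, processed = 0, batches = 0;
while (true)
{
    // Each query is short-lived, so locks are held only while a batch is read.
    var batch = FetchBatch(lastId, 10);
    if (batch.Count == 0) break;     // no more rows
    batches++;
    foreach (var r in batch)
        processed++;                 // handleRecord(r) would go here
    lastId = batch[^1].Id;           // step 5: remember the highest id seen
}
Console.WriteLine($"{processed} records in {batches} batches");

record Record(int Id, string Data);
```

Because each batch fits in memory and the slow per-record processing happens after the reader for that batch is closed, memory use stays bounded and the table is never locked for the whole run. Ordering by the primary key makes the `WHERE Id > @lastId` seek cheap and guarantees no row is skipped or read twice between batches.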




Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow