When I retrieve data from SQL Server, a bulk query (2,000,000 rows) takes about 2 minutes in the SQL query window. But it takes a long time (approximately 20 minutes or more) to map the results to a list using the Dapper ORM.
Is there any way to improve the mapping performance?
This is my code:
public List<T> DapperToList<T>(string SqlQuery, object param = null, CommandType? commandType = null) where T : class
{
    if (Con.State == ConnectionState.Open)
        Con.Close();
    Con.Open();

    // Query<T> executes the SQL / stored procedure with the supplied parameters
    // and maps every returned row to T (buffered, effectively no timeout).
    var result = this.Con.Query<T>(SqlQuery, param, null, true, int.MaxValue, commandType).ToList();

    Con.Close();
    return result;
}
public List<LOG_REPORTS> REPORT_LOG(DateTime? FROMDATE, DateTime? TODATE)
{
    DynamicParameters param = new DynamicParameters();
    param.Add("@FROMDATE", FROMDATE, DbType.DateTime);
    param.Add("@TODATE", TODATE, DbType.DateTime);

    var lstt = db.DapperToList<LOG_REPORTS>("PROC_LOG_REPORTS", param, CommandType.StoredProcedure);
    return lstt;
}
As you explained in the question, the var result = this.Con.Query<T>(SqlQuery, ...) line takes 20 minutes to execute. Based on your own measurements, roughly 2 minutes go to executing the query and the remaining 18 minutes go to mapping.
Mapping such a huge amount of data will take time (and memory, and CPU, and so on), and there is next to no scope for speeding up the mapping itself. Consider paging on the database side using ROW_NUMBER
and pumping one page at a time; a sketch is shown below. Since you said you are writing the data to Excel, paging should not be a problem for the consumer. Note, however, that paging will actually increase the total time, even though the time per batch is reduced and the application stays responsive.
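For illustration, this is roughly what ROW_NUMBER paging could look like with Dapper. The table name (LOG_REPORTS_TABLE), the ordering column (LOG_DATE) and the page size are assumptions made up for the sketch, not taken from your schema:

// requires: using Dapper; using System; using System.Collections.Generic; using System.Data; using System.Linq;
public IEnumerable<List<LOG_REPORTS>> ReadLogReportsInPages(
    IDbConnection con, DateTime? fromDate, DateTime? toDate, int pageSize = 50000)
{
    // ROW_NUMBER() gives each row a stable position so a single page can be sliced out on the server.
    const string pagedSql = @"
        WITH Numbered AS
        (
            SELECT *, ROW_NUMBER() OVER (ORDER BY LOG_DATE) AS RowNum
            FROM   LOG_REPORTS_TABLE                -- assumed table name
            WHERE  LOG_DATE BETWEEN @FROMDATE AND @TODATE
        )
        SELECT * FROM Numbered
        WHERE  RowNum BETWEEN @StartRow AND @EndRow;";

    int pageIndex = 0;
    while (true)
    {
        var param = new
        {
            FROMDATE = fromDate,
            TODATE   = toDate,
            StartRow = pageIndex * pageSize + 1,
            EndRow   = (pageIndex + 1) * pageSize
        };

        // Each call maps only one page, so memory use and mapping time per batch stay small.
        var page = con.Query<LOG_REPORTS>(pagedSql, param).ToList();
        if (page.Count == 0)
            yield break;

        yield return page;
        pageIndex++;
    }
}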
You want to write to Excel, right? Then bypass Dapper and use ADO.NET: fill a DataTable and pass it
to the Excel writer module. Skipping the per-row object mapping entirely will be a huge gain.
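If the destination is Excel anyway, something along these lines should work. The stored procedure and parameter names are the ones from your question; the Excel writer call at the end is only a placeholder for whatever export module you already use:

// requires: using System; using System.Data; using System.Data.SqlClient;
public DataTable ReadLogReportsAsDataTable(string connectionString, DateTime? fromDate, DateTime? toDate)
{
    var table = new DataTable();

    using (var con = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("PROC_LOG_REPORTS", con))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.CommandTimeout = 0; // 0 = no timeout; adjust to your needs
        cmd.Parameters.AddWithValue("@FROMDATE", (object)fromDate ?? DBNull.Value);
        cmd.Parameters.AddWithValue("@TODATE", (object)toDate ?? DBNull.Value);

        using (var adapter = new SqlDataAdapter(cmd))
        {
            // SqlDataAdapter streams the result set straight into the DataTable,
            // with no per-row object materialization like Dapper does.
            adapter.Fill(table);
        }
    }

    return table;
}

// Usage: hand the DataTable to your existing Excel writer.
// var table = ReadLogReportsAsDataTable(connStr, fromDate, toDate);
// excelWriter.Write(table);   // placeholder for your Excel export call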