Timeout Problem
-
Hi, I am getting an error like this: "The timeout period elapsed prior to completion of the operation or the server is not responding." (Here I am retrieving more than 60,000 records.) Any idea? Known is a drop, unknown is an ocean
-
Did you try Google? Have a look.
Cheers!! Brij. Check my latest article: URL Routing with ASP.NET 4.0
-
Wow, what logic! Thanks, keep posting answers like this. Known is a drop, unknown is an ocean
-
Did you bother to go through any of the links? This is a very common problem with lots of answers on Google. As the exception says, the timeout is expiring, so there are two things to try. First, increase the CommandTimeout (say, to 120 seconds) and try again. Second, look into why the query is taking so much time. One more thing: since I assume you are not displaying that many records in the front end, it would be better to do paging and the like at the DB level. One last thing: before posting any question, TRY GOOGLE.
Cheers!! Brij. Check my latest article: URL Routing with ASP.NET 4.0
-
OK, thanks, but I did try Google; before posting a question here, I always do. That didn't work. I tried the CommandTimeout approach as well, but no use. This is my stored procedure:

CREATE PROCEDURE [dbo].[C_ToRemoveLocalDataDuplicateRecords]
AS
BEGIN TRANSACTION

-- Insert the duplicate records into the duplicate table
INSERT INTO tblLocalDataDuplicate
    ([AddressPK],[SlNo],[CategoryID],[ProjectID],[WebUrl],[Name],[StreetAddress],
     [City],[State],[ZipCode],[Telephone],[SecondaryPhone],[EmailID],
     [EnteredDate],[UpdatedDate],[UserId])
SELECT DISTINCT
    T1.[AddressPK],T1.[SlNo],T1.[CategoryID],T1.[ProjectID],T1.[WebUrl],T1.[Name],
    T1.[StreetAddress],T1.[City],T1.[State],T1.[ZipCode],T1.[Telephone],
    T1.[SecondaryPhone],T1.[EmailID],T1.[EnteredDate],T1.[UpdatedDate],T1.[UserId]
FROM tblLocalData T1, tblLocalData T2
WHERE T1.name = T2.name
  AND T1.streetaddress = T2.streetaddress
  AND T1.city = T2.city
  AND T1.state = T2.state
  AND T1.zipcode = T2.zipcode
  AND T1.telephone = T2.telephone
  AND T1.SecondaryPhone = T2.SecondaryPhone
  AND T1.addresspk < T2.addresspk
ORDER BY T1.name, T1.streetaddress, T1.city, T1.state, T1.zipcode, T1.telephone, T1.addresspk

-- Delete the duplicate records
DELETE T1
FROM tblLocalData T1, tblLocalData T2
WHERE T1.name = T2.name
  AND T1.streetaddress = T2.streetaddress
  AND T1.city = T2.city
  AND T1.state = T2.state
  AND T1.zipcode = T2.zipcode
  AND T1.telephone = T2.telephone
  AND T1.SecondaryPhone = T2.SecondaryPhone
  AND T1.addresspk < T2.addresspk

IF (@@ERROR > 0)
    ROLLBACK TRANSACTION
ELSE
    COMMIT TRANSACTION

I am not displaying any data in a GridView; I am writing it to a CSV file. Here I am checking for duplicate records (up to 60,000 records in total). Known is a drop, unknown is an ocean
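For readers following along, the duplicate-detection idea in the procedure above (a self-join on the matching columns with `T1.addresspk < T2.addresspk`, so every row in a duplicate group except the one with the highest AddressPK is flagged) can be sketched against an in-memory SQLite database. This is an illustration only, with the table reduced to three toy columns; it is not the poster's actual SQL Server schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE tblLocalData (addresspk INTEGER PRIMARY KEY, name TEXT, city TEXT)"
)
cur.executemany("INSERT INTO tblLocalData VALUES (?, ?, ?)", [
    (1, "Acme", "Austin"),   # duplicate group: rows 1, 2, 3
    (2, "Acme", "Austin"),
    (3, "Acme", "Austin"),
    (4, "Bolt", "Boston"),   # unique row
])

# Self-join: a row counts as a duplicate if a matching row with a HIGHER
# key exists, so every row of a group except the max-addresspk row matches.
dupes = cur.execute("""
    SELECT DISTINCT T1.addresspk
    FROM tblLocalData T1
    JOIN tblLocalData T2
      ON T1.name = T2.name
     AND T1.city = T2.city
     AND T1.addresspk < T2.addresspk
    ORDER BY T1.addresspk
""").fetchall()
print([pk for (pk,) in dupes])  # [1, 2] -- row 3, the max of its group, survives
```

Note that the self-join compares every row against every other matching row, which is one reason this pattern slows down badly as the table grows toward 60,000 rows.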
-
Hi Sree, if you want to optimise this and need more help, you should post it in the database section. You could also set up a SQL job that runs periodically to remove the duplicate records; that would be an independent process and would reduce the work your stored procedure is doing here. One more thing: I don't know how you are generating the CSV file, but on one of my projects we used SSRS, which can export data in several formats such as CSV and PDF, so you could use that as well.
Cheers!! Brij. Check my latest article: URL Routing with ASP.NET 4.0
-
Since you have already tried a few options (not sure if you have done this too), it looks like the response time has elapsed. One way to increase it is the httpRuntime element in the Web.config file. You need to add something like:

<httpRuntime
    executionTimeout="number"
    maxRequestLength="number" />

For more details, read here: http://msdn.microsoft.com/en-us/library/e1f13641.aspx
-
Thanks, Brij. Known is a drop, unknown is an ocean
-
To me, the query is not well optimized, and I think you should optimize your stored procedure. Here is a rewritten version that may work for you. Two notes: a common table expression only exists for the single statement that follows it, so the duplicate keys are captured in a temp table first; and a ROW_NUMBER() window numbers each duplicate group from the highest AddressPK down, so every row with RowNum > 1 is a duplicate. This keeps the same "keep the highest AddressPK" behaviour as your procedure and handles groups of three or more in a single pass.

-- Collect the keys of all duplicate rows (everything but the
-- highest AddressPK in each matching group)
SELECT AddressPK
INTO #DuplicateRecords
FROM (
    SELECT AddressPK,
           ROW_NUMBER() OVER (
               PARTITION BY [Name],[StreetAddress],[City],[State],
                            [ZipCode],[Telephone],[SecondaryPhone]
               ORDER BY AddressPK DESC) AS RowNum
    FROM tblLocalData
) T
WHERE T.RowNum > 1

-- Copy the duplicates to the duplicate table
INSERT INTO tblLocalDataDuplicate
    ([AddressPK],[SlNo],[CategoryID],[ProjectID],[WebUrl],[Name],[StreetAddress],
     [City],[State],[ZipCode],[Telephone],[SecondaryPhone],[EmailID],
     [EnteredDate],[UpdatedDate],[UserId])
SELECT T1.[AddressPK],T1.[SlNo],T1.[CategoryID],T1.[ProjectID],T1.[WebUrl],T1.[Name],
       T1.[StreetAddress],T1.[City],T1.[State],T1.[ZipCode],T1.[Telephone],
       T1.[SecondaryPhone],T1.[EmailID],T1.[EnteredDate],T1.[UpdatedDate],T1.[UserId]
FROM tblLocalData T1
INNER JOIN #DuplicateRecords T2 ON T1.AddressPK = T2.AddressPK

-- Delete the duplicate records
DELETE tblLocalData
WHERE AddressPK IN (SELECT AddressPK FROM #DuplicateRecords)

DROP TABLE #DuplicateRecords

Please let me know when you are done. I hope this helps. Regards, Suresh Dayma
Everything Is Possible!
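The window-function approach to duplicate removal discussed above (number the rows of each duplicate group, keep the first, delete the rest) can be checked end-to-end against an in-memory SQLite database, which supports ROW_NUMBER() since version 3.25. The sketch below uses a toy three-column table, so the names are illustrative only, not the poster's SQL Server schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE tblLocalData (addresspk INTEGER PRIMARY KEY, name TEXT, city TEXT)"
)
cur.executemany("INSERT INTO tblLocalData VALUES (?, ?, ?)", [
    (1, "Acme", "Austin"),   # duplicate group: rows 1, 2, 3
    (2, "Acme", "Austin"),
    (3, "Acme", "Austin"),
    (4, "Bolt", "Boston"),   # unique row
])

# Number each group's rows with the highest key first; every row with
# RowNum > 1 is a duplicate, so one DELETE removes them all in a single pass.
cur.execute("""
    DELETE FROM tblLocalData
    WHERE addresspk IN (
        SELECT addresspk FROM (
            SELECT addresspk,
                   ROW_NUMBER() OVER (
                       PARTITION BY name, city
                       ORDER BY addresspk DESC) AS RowNum
            FROM tblLocalData
        )
        WHERE RowNum > 1
    )
""")
survivors = [pk for (pk,) in cur.execute(
    "SELECT addresspk FROM tblLocalData ORDER BY addresspk")]
print(survivors)  # [3, 4] -- only the highest-keyed row of each group remains
```

Unlike the self-join pattern, this makes one numbering pass over the table, which scales much better for tens of thousands of rows.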