Operation has Timed Out!
-
I have to download lots of web pages simultaneously. I'm able to queue all the discovered URLs, and each one is downloaded in a new thread, but I can only download 20 to 25 web pages; for the remaining ones I get an "Operation has timed out" exception. I am using the HttpWebRequest/HttpWebResponse classes to communicate with the web server.
AliAmjad (MCP)
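For reference, each worker thread runs roughly the sketch below (a simplified illustration; the class name, timeout, and lack of error handling are mine, not the actual code):

    using System.IO;
    using System.Net;

    static class PageDownloader
    {
        // Downloads one page; meant to run on its own worker thread.
        public static string DownloadPage(string url)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            request.Timeout = 30000; // milliseconds

            // Dispose the response: an unclosed HttpWebResponse keeps its
            // connection checked out of the pool, and later requests stall
            // until they throw "The operation has timed out".
            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                return reader.ReadToEnd();
            }
        }
    }

One thing worth checking with code like this: ServicePointManager.DefaultConnectionLimit allows only two concurrent connections per host by default, so with many requests to the same server in flight, the queued ones can sit waiting for a pooled connection until they time out.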
-
You cannot download any number of web pages you want at the same time. IE usually allows only four downloads at once, and Microsoft's implementation of TCP/IP on XP SP2 also limits the number of new connection attempts per second; any more than, I think, 10 per second and the connection attempts fail. Why? The limit is there to slow the spread of viruses and to reduce the impact of denial-of-service attacks. Your queue should pop URLs off to your threads at a throttled rate to stay under these limits.
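A throttled hand-off can be as simple as a semaphore plus a short delay between dequeues. The sketch below is illustrative only (the limits are made up, and PageDownloader is the hypothetical helper from the question above):

    using System.Collections.Generic;
    using System.Threading;

    class ThrottledDispatcher
    {
        private readonly Queue<string> _urls;
        private readonly Semaphore _slots;   // caps downloads in flight
        private readonly int _delayMs;       // paces new connection attempts

        public ThrottledDispatcher(IEnumerable<string> urls,
                                   int maxConcurrent, int attemptsPerSecond)
        {
            _urls = new Queue<string>(urls);
            _slots = new Semaphore(maxConcurrent, maxConcurrent);
            _delayMs = 1000 / attemptsPerSecond;
        }

        public void Run()
        {
            while (_urls.Count > 0)
            {
                string url = _urls.Dequeue();
                _slots.WaitOne();            // block while too many are in flight
                Thread.Sleep(_delayMs);      // stay under the attempts-per-second cap
                ThreadPool.QueueUserWorkItem(delegate
                {
                    try { PageDownloader.DownloadPage(url); }
                    finally { _slots.Release(); }
                });
            }
        }
    }

With maxConcurrent around four and attemptsPerSecond kept below ten, you stay inside both of the limits described above.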
A guide to posting questions on CodeProject[^]
Dave Kreskowiak Microsoft MVP Visual Developer - Visual Basic
2006, 2007 -
Is there any other way to extend this limitation, or do we have to live with it? I have to download a huge number of web pages, and four simultaneous downloads will hurt the performance of this web crawler. I want to download at least 100 web pages at once, so I've created 100 worker threads using a custom thread pool, and to prevent denial-of-service attacks I've also implemented a politeness policy. So what should I do? Should I change the operating system, or is there another way on Windows?
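For what it's worth, the politeness policy boils down to a minimum delay per host; something like this sketch (the class is an illustration, not my actual implementation):

    using System;
    using System.Collections.Generic;
    using System.Threading;

    // Tracks the last request time per host and enforces a minimum delay,
    // so that 100 workers never hammer any single server.
    class PolitenessPolicy
    {
        private readonly Dictionary<string, DateTime> _lastHit =
            new Dictionary<string, DateTime>();
        private readonly TimeSpan _minDelay;

        public PolitenessPolicy(TimeSpan minDelayPerHost)
        {
            _minDelay = minDelayPerHost;
        }

        // Blocks the calling worker until the host may be contacted again.
        public void WaitForTurn(Uri url)
        {
            TimeSpan wait = TimeSpan.Zero;
            lock (_lastHit)
            {
                DateTime last;
                DateTime now = DateTime.UtcNow;
                if (_lastHit.TryGetValue(url.Host, out last) && now - last < _minDelay)
                    wait = _minDelay - (now - last);
                // Reserve the next slot so other workers queue behind this one.
                _lastHit[url.Host] = now + wait;
            }
            if (wait > TimeSpan.Zero)
                Thread.Sleep(wait);
        }
    }

Each worker calls WaitForTurn with the target URI before opening the connection, so no host sees more than one request per delay window no matter how many threads are running.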
AliAmjad (MCP)
-
AliAmjad wrote:
Is there any other way to extend this limitation
Not that I'm going to suggest.
AliAmjad wrote:
I have to download a huge number of web pages, and four simultaneous downloads will hurt the performance of this web crawler. I want to download at least 100 web pages at once, so I've created 100 worker threads using a custom thread pool
Run it on either XP SP1 or, preferably, on a Server edition of Windows.
AliAmjad wrote:
and to prevent denial-of-service attacks
This statement tells me you don't understand what a DoS, or DDoS, attack is.
A guide to posting questions on CodeProject[^]
Dave Kreskowiak Microsoft MVP Visual Developer - Visual Basic
2006, 2007 -
Dave Kreskowiak wrote:
This statement tells me you don't understand what a DoS, or DDoS, attack is.
Not exactly, but I was thinking of it in terms of not overloading the web server. Did I say something incorrect? Thanks anyway; I'll run this web crawler under a Server edition of Windows.
AliAmjad (MCP)