Code Project

Operation has Timed Out!

Tags: sysadmin, mcp, data-structures
AliAmjad
#1
I have to download lots of web pages simultaneously. I'm able to queue all the discovered URLs, and each one is downloaded in a new thread, but only 20 to 25 pages actually download; for the remaining ones I get the "Operation has Timed Out" exception. I am using the HttpWebRequest/HttpWebResponse classes to communicate with the web server.

AliAmjad (MCP)
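
For anyone hitting the same exception: a very common cause with HttpWebRequest is .NET's default limit of two connections per host (ServicePointManager.DefaultConnectionLimit) combined with responses that are never closed, which starves the connection pool until queued requests time out. A minimal VB.NET sketch of the usual fix, not the poster's actual code; the URL, timeout, and limit values are illustrative assumptions:

Imports System.IO
Imports System.Net

Module DownloadSketch
    Sub Main()
        ' Default is 2 connections per host; raise it before issuing requests.
        ' The value 20 is an assumed figure, tune it for your crawler.
        ServicePointManager.DefaultConnectionLimit = 20
        Console.WriteLine(DownloadPage("http://example.com/").Length)
    End Sub

    Function DownloadPage(ByVal url As String) As String
        Dim request As HttpWebRequest = CType(WebRequest.Create(url), HttpWebRequest)
        request.Timeout = 30000 ' 30 seconds, in milliseconds

        ' Using blocks guarantee the response and stream are closed, so the
        ' pooled connection is released even if reading throws.
        Using response As HttpWebResponse = CType(request.GetResponse(), HttpWebResponse)
            Using reader As New StreamReader(response.GetResponseStream())
                Return reader.ReadToEnd()
            End Using
        End Using
    End Function
End Module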

Dave Kreskowiak
#2

You cannot download any number of web pages you want at the same time. IE usually only allows 4 downloads to be going at once, and Microsoft's implementation of TCP/IP on XP SP2 also allows only a limited number of connection attempts per second; any more than, I think, 10 per second and the connection attempts fail. Why? It's there to limit the proliferation of viruses and reduce the impact of "denial of service" attacks. Your queue should be popping URLs off to your threads at a throttled rate to stay under these limits.

A guide to posting questions on CodeProject[^]
Dave Kreskowiak, Microsoft MVP Visual Developer - Visual Basic, 2006, 2007
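
A minimal sketch of the throttled dispatch described above, assuming a simple shared queue; the 5-per-second rate and the DownloadWorker name are illustrative assumptions chosen to stay under the limit Dave mentions:

Imports System.Collections.Generic
Imports System.Threading

Module ThrottledDispatcher
    Private ReadOnly UrlQueue As New Queue(Of String)()

    Sub DispatchAll()
        ' Assumed rate: 5 new connection attempts per second, below the
        ' roughly 10/second ceiling mentioned above.
        Const MaxPerSecond As Integer = 5
        Dim delayMs As Integer = 1000 \ MaxPerSecond

        While UrlQueue.Count > 0
            Dim url As String = UrlQueue.Dequeue()
            ' Hand the URL to a worker thread from the pool.
            ThreadPool.QueueUserWorkItem(AddressOf DownloadWorker, url)
            Thread.Sleep(delayMs) ' throttle how fast new attempts start
        End While
    End Sub

    Sub DownloadWorker(ByVal state As Object)
        Dim url As String = CStr(state)
        ' ... fetch the page here, e.g. with the HttpWebRequest sketch above ...
    End Sub
End Module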

AliAmjad
#3

Is there any way to raise this limit, or do we have to live with it? I have to download a huge number of web pages, and 4 simultaneous downloads will hurt the performance of this web crawler. I want to download at least 100 web pages at once, and for this I've created 100 worker threads using a custom thread pool, and to prevent denial-of-service attacks I've also implemented a politeness policy. So what should I do? Should I change the operating system, or is there another way on Windows?

AliAmjad (MCP)
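
A politeness policy of the kind described usually boils down to a per-host delay; a minimal sketch, where the two-second courtesy interval and the WaitForHost name are illustrative assumptions, not from the original post:

Imports System.Collections.Generic
Imports System.Threading

Module PolitenessPolicy
    ' Remembers when each host may next be contacted.
    Private ReadOnly NextAllowed As New Dictionary(Of String, DateTime)()
    Private ReadOnly SyncRoot As New Object()
    Private Const DelayMs As Integer = 2000 ' assumed courtesy delay per host

    ' Call before downloading; blocks until the host's courtesy delay is over.
    Sub WaitForHost(ByVal host As String)
        Dim waitMs As Integer = 0
        SyncLock SyncRoot
            Dim slot As DateTime
            If NextAllowed.TryGetValue(host, slot) AndAlso slot > DateTime.UtcNow Then
                waitMs = CInt(slot.Subtract(DateTime.UtcNow).TotalMilliseconds)
            End If
            ' Reserve the next slot so other threads queue up behind this one.
            NextAllowed(host) = DateTime.UtcNow.AddMilliseconds(waitMs + DelayMs)
        End SyncLock
        If waitMs > 0 Then Thread.Sleep(waitMs)
    End Sub
End Module

Calling WaitForHost(New Uri(url).Host) right before each request keeps hits to the same server spaced out even with 100 workers running.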

Dave Kreskowiak
#4

          AliAmjad wrote:

Is there any way to raise this limit

          Not that I'm going to suggest.

          AliAmjad wrote:

I have to download a huge number of web pages, and 4 simultaneous downloads will hurt the performance of this web crawler. I want to download at least 100 web pages at once, and for this I've created 100 worker threads using a custom thread pool

Run it on either XP SP1 or, preferably, on a Server edition of Windows.

          AliAmjad wrote:

and to prevent denial-of-service attacks

This statement tells me you don't understand what a DoS, or DDoS, attack is.

A guide to posting questions on CodeProject[^]
Dave Kreskowiak, Microsoft MVP Visual Developer - Visual Basic, 2006, 2007

AliAmjad
#5

            Dave Kreskowiak wrote:

This statement tells me you don't understand what a DoS, or DDoS, attack is.

Not exactly, but I was thinking of it in terms of not overloading the web server. Did I say something incorrect? But thanks, I'll run this web crawler under a server edition of Windows.

AliAmjad (MCP)
