Iterating and Modifying a .NET Queue?
-
Then what should I use for this purpose? I want to add items to the queue and manipulate it at the same time. Should I use a Collection object? And what's wrong with the code given above? I know I can't modify it while iterating, but then what should I do to achieve the objective? Thanks for your time. AliAmjad (MCP)
Why are you using a Queue? A queue is a list of items that's organized and manipulated on a First-In, First-Out basis. What's the purpose of this collection? The type of collection you use isn't so much dictated by what it's going to hold as by how the collection is going to be used. If you want to manipulate items, adding them and removing them, then a
List(Of T)
would work, as would an array of strings, an ArrayList, ...
A guide to posting questions on CodeProject[^]
Dave Kreskowiak Microsoft MVP Visual Developer - Visual Basic
2006, 2007
-
Actually, I'm working on a distributed web crawler, a multithreaded application where each URL discovered gets added to a queue to implement the breadth-first search model. Many threads will access this queue: they get URLs from the front, add newly discovered URLs at the end, and remove crawled URLs from the queue. What should I use in such a scenario? When I implemented this logic I got the error "Collection was modified; enumeration operation may not execute". AliAmjad (MCP)
-
Why are you enumerating them? I don't see anything in this description that would justify it. Well, now I see where your code came from: it's a modification of the sample code on MSDN. You don't need a loop to dequeue items. The Dequeue method pops the front item off the queue and returns it as an Object. You have to cast the Object back to a String in order to use it.
Dim url As String = DirectCast(que.Dequeue(), String)
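For completeness, here's a minimal sketch of draining a queue this way, using the queue's Count instead of a For Each loop so the enumeration error can't occur (the URLs are just placeholders):

```vb
' Drain the queue without an enumerator: Dequeue until empty.
Dim que As New Queue()
que.Enqueue("http://example.com/")
que.Enqueue("http://example.com/about")

While que.Count > 0
    ' Dequeue removes and returns the front item as Object;
    ' cast it back to String before use.
    Dim url As String = DirectCast(que.Dequeue(), String)
    Console.WriteLine(url)
End While
```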
A guide to posting questions on CodeProject[^]
Dave Kreskowiak Microsoft MVP Visual Developer - Visual Basic
2006, 2007
-
OK, I get it, but since this is a multithreaded application and lots of threads will access this queue, should I use SyncLock, or something else, to ensure thread safety? AliAmjad (MCP)
Wrap the queue object in a class that implements adding and dequeuing. Use the SyncLock on an object to make it thread safe.
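A minimal sketch of such a wrapper (the class and member names here are made up for illustration, not from the original post):

```vb
' Thread-safe wrapper around a queue: every access goes
' through SyncLock on a private lock object.
Public Class UrlQueue
    Private ReadOnly que As New Queue(Of String)()
    Private ReadOnly lockObj As New Object()

    Public Sub Enqueue(ByVal url As String)
        SyncLock lockObj
            que.Enqueue(url)
        End SyncLock
    End Sub

    ' Returns True and sets url if an item was available,
    ' so callers don't have to check Count separately.
    Public Function TryDequeue(ByRef url As String) As Boolean
        SyncLock lockObj
            If que.Count > 0 Then
                url = que.Dequeue()
                Return True
            End If
            Return False
        End SyncLock
    End Function
End Class
```

Checking Count and dequeuing inside the same SyncLock matters: two separate locked calls would leave a window where another thread empties the queue between them.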
A guide to posting questions on CodeProject[^]
Dave Kreskowiak Microsoft MVP Visual Developer - Visual Basic
2006, 2007
-
Cool, thanks buddy, but I'm going to store the MD5 hashes of the URLs in hexadecimal form in the queue, so which Queue(Of T) should I use so that it consumes as little memory as possible? I'm going to enqueue lots of URLs. Again, thanks for your valuable time, Dave. AliAmjad (MCP)
-
AliAmjad wrote:
but I'm going to save MD5 hash algorithm of URLs in hexadecimal form in the Queue so which Queue(of T) i should use so that It'll consume as little memory as possible because I am gonna Enqueue it with lots of URLs
That's nice. You'd better do some research on this first. MD5 is a one-way hash; you can't reverse the hash and get the original URL back. Forget hashing or compressing the URL. It adds more complexity than you need, and unjustifiably so.
A guide to posting questions on CodeProject[^]
Dave Kreskowiak Microsoft MVP Visual Developer - Visual Basic
2006, 2007
-
Yes, it is a one-way hash, and that's exactly why I'll use the hashes as references to the original URLs, which are saved in a file. That way I'll be able to keep many URLs in memory, since a web crawler needs to maintain a huge list of them. I'll also calculate the MD5 of each page itself to prevent duplicate entries in the index, because the hash is very small and always the same size. So what do you suggest: should I go with MD5, or just manipulate raw URLs in memory, even though that will consume a lot of RAM? AliAmjad (MCP)
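For reference, computing an MD5 digest of a string as a hex string in .NET can be sketched like this (Md5Hex is a hypothetical helper name, not part of the framework):

```vb
Imports System.Security.Cryptography
Imports System.Text

' Hash a string with MD5 and return the digest as
' a lowercase hexadecimal string (always 32 characters).
Public Function Md5Hex(ByVal input As String) As String
    Using md5 As MD5 = MD5.Create()
        Dim bytes As Byte() = md5.ComputeHash(Encoding.UTF8.GetBytes(input))
        Dim sb As New StringBuilder()
        For Each b As Byte In bytes
            sb.Append(b.ToString("x2"))
        Next
        Return sb.ToString()
    End Using
End Function
```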
-
I don't see the need for MD5 anything in your app. On top of that, with the skill level you've shown in the original post, this shouldn't even be a concern to you right now. Getting the basic functionality working should be: queue up URLs, dequeue one, download it, and move on. THEN you can add complex indexing and hashing to your database.
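A sketch of that basic loop, assuming a Queue(Of String) and WebClient (the seed URL is a placeholder; link extraction and error handling are omitted):

```vb
' Basic crawl loop: queue up URLs, dequeue one, download it, move on.
Dim que As New Queue(Of String)()
que.Enqueue("http://example.com/")

Dim client As New System.Net.WebClient()
While que.Count > 0
    Dim url As String = que.Dequeue()
    Dim html As String = client.DownloadString(url)
    ' ... parse html for links and Enqueue the newly found URLs ...
End While
```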
A guide to posting questions on CodeProject[^]
Dave Kreskowiak Microsoft MVP Visual Developer - Visual Basic
2006, 2007
-
Thank you very much for helping me out. I'll implement this logic, analyze the results, and then let you know how it goes.
Dave Kreskowiak wrote:
The skill level you've shown in the original post
I consider myself a beginner because I always try to grasp these important concepts, and I've learned a lot from you. What do you think about my skill level? Am I capable enough to take on the challenge of building a distributed web crawler? I've been able to achieve it to some degree, but I still need a lot of guidance and help from people like you. AliAmjad (MCP)