Comments in Articles
-
Chris Maunder wrote: but I'm sure we can find other means to improve performance without sacrificing usability. Yes, we can; I have a suggestion. Using XML on the server and some smart scripting on the client, the Dynamic view could pull ONLY the headers to form the forum tree. Then, when you click on a message, the script downloads just that message from the server. This would greatly reduce bandwidth: most people don't read all the messages on a page, but they DO want to see all the headers. You could even go to 100~150 messages per page with this scheme, to speed things up even more. I see dumb people
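The headers-only scheme described above can be sketched as a split between two server calls: one lightweight call that returns just the fields the forum tree needs, and one that fetches a single message body on click. This is a minimal illustration only; the `MESSAGES` store and its field names are assumptions, not CodeProject's actual schema.

```python
# Hypothetical in-memory message store; field names are assumptions,
# not the site's actual schema.
MESSAGES = {
    1: {"subject": "Perf ideas", "author": "alice", "body": "Long text ..."},
    2: {"subject": "Re: Perf ideas", "author": "bob", "body": "More long text ..."},
}

def get_headers():
    """Return only what the forum tree needs: id, subject, author.

    The large 'body' field is deliberately excluded, so the payload
    for the whole tree stays small.
    """
    return [
        {"id": mid, "subject": m["subject"], "author": m["author"]}
        for mid, m in sorted(MESSAGES.items())
    ]

def get_body(message_id):
    """Fetch a single message body, only when its header is clicked."""
    return MESSAGES[message_id]["body"]

headers = get_headers()            # small payload: every header in the tree
body = get_body(headers[0]["id"])  # larger payload: exactly one message
```

The point of the split is that the expensive part (full message text) is never sent for messages the reader never opens.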
-
The issue isn't the amount of data returned between the SQL and web servers (though this is important), but rather the work done in scanning tables and indexes (indices?) and sorting the messages into the correct threaded order for the specified forum with the specified constraints. Once the messages to be displayed have been determined, actually returning that info is trivial. Forcing the client to implement some kind of XSLT to convert the XML to HTML would mean some browsers simply wouldn't be able to view the information. The solution is: better database and index management, further query and SP tuning, and more hardware. cheers, Chris Maunder
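The threaded-ordering work Chris describes can be sketched as a depth-first walk over (id, parent_id) pairs. This is a toy version to show why the sort is more than a simple ORDER BY, not the site's actual stored procedure:

```python
def threaded_order(messages):
    """Return message ids in threaded (depth-first) display order.

    `messages` is a list of (id, parent_id) tuples; parent_id is None
    for top-level posts. Each reply is listed directly under its
    parent, siblings in id order.
    """
    children = {}
    for mid, parent in messages:
        children.setdefault(parent, []).append(mid)

    order = []
    def walk(parent):
        for mid in sorted(children.get(parent, [])):
            order.append(mid)
            walk(mid)   # recurse into this message's replies
    walk(None)
    return order

# Thread: message 1 has replies 2 and 4; message 2 has reply 3.
msgs = [(1, None), (2, 1), (3, 2), (4, 1)]
# threaded_order(msgs) -> [1, 2, 3, 4]
```

A flat index on post date cannot produce this order directly, which is why the database has to do real work per page view.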
-
I think Chris would also include a total site rewrite in that list. :) Tim Smith I'm going to patent thought. I have yet to see any prior art.
You got it in one ;) cheers, Chris Maunder
-
Chris Maunder wrote: The issue isn't the amount of data returned between the SQL and WebServer (though this is important), but rather the work done in scanning tables and indexes (indices?) and sorting the messages to return the messages in the correct threaded order for the specified forum with the specified constraints. Once the messages to be displayed has been determined then actually returning that info is trivial. You are not seeing the caching opportunities this allows. You don't need to get the forum data from the SQL Server every time; that way you end up locking tables too much. The volume of data inserted into CP's SQL Server daily is very low (OK, if you are logging IIS to the same SQL Server, forget about it; redirect the logs to another el-cheapo server. If you are logging to text files, you have serious security problems I would not mention in the Lounge). I would bet you have a 10:1 or 100:1 read-to-insert&update ratio. So, if the data you return from the web server has a simpler structure, it can easily be cached on the web server, reducing lock contention on the SQL Server. I can see you probably have an index on the message text to prevent duplicate messages. Believe me, if you did that, it is the worst mistake you can make. A better approach would be to calculate a hash of the message and store it; this would make detecting duplicate messages a no-brainer. Once I started three searches on CP at the same time (it was not my intention), and the whole of CP got slow for several minutes. It seems you have serious problems here, too. Chris Maunder wrote: The solution is: Better database and index management, further query and SP tuning, and more hardware. I would gladly help with this kind of work; I love doing it. I see dumb people
-
Oh, thanks for asking. So far we have found no partner and are doing it ourselves. You know, finding a TRUSTWORTHY partner is really, really difficult. Is YOUR company interested? -- - Free Windows-based Web Content Management System: http://www.zeta-software.de/enu/producer/freeware/download.html - Scanned MSDN Mag ad with YOUR name: www.magerquark.de/misc/CodeProject.html - See me: www.magerquark.de
Uwe Keim wrote: Is YOUR company interested? We would be if ZetaProducer ran on ASP and used MS technologies. We are an MS company, so all our skills tend towards that.
Paul Watson
Bluegrass
Cape Town, South Africa
Christopher Duncan wrote: Which explains why when Santa asked, "And what do you want for Christmas, little boy?" I said, "A life." (Accessories sold separately)