I agree 100% - there are too many drawbacks to using ad hoc queries in the client application, and most of the time they prevent an application from scaling. Suppose you have a table holding customer records and you want to move the address columns to a new table so a customer can have multiple addresses. How are you going to do this when you have a web app, a thick client, analysis cubes, and reporting that will all require changes, and how do you manage deploying those changes? By preventing ad hoc or table-direct queries, all one would have to do is modify a few views and procs. Secondly, how can you ensure that all the ad hoc queries in all the applications consuming this database are using indexes, common business logic, and common formulas? And how would you trace down a poorly written ad hoc query that is table scanning, and how would you go about fixing it and deploying the fix without impacting other user applications? My developers may only select from a view or call a proc, from any application, period.... Look on the bright side: you have an opportunity to catalog all objects and send out impact statements to understand who is using what, and also an opportunity to find synergies between your development teams and come to a common process.
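A minimal T-SQL sketch of the insulation argument above (table, column, and view names are hypothetical, not from the original post): after the address columns move into their own table, only the view definition changes; every application that selects from the view keeps working unmodified.

```sql
-- Before the change, vwCustomer exposed tblCustomer directly.
-- After moving the address columns into tblCustomerAddress, only this
-- view definition is altered; clients still select from vwCustomer.
CREATE VIEW vwCustomer
AS
SELECT  c.CustomerID,
        c.CustomerName,
        a.Street,
        a.City,
        a.PostalCode
FROM    tblCustomer c
JOIN    tblCustomerAddress a
        ON  a.CustomerID = c.CustomerID
        AND a.IsPrimary = 1;  -- preserve the old one-address-per-customer shape
```

Deploying the schema change then becomes an ALTER VIEW plus the data migration, rather than a coordinated release across every consuming application.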
PoweredByOtgc
Posts
-
Stored Procs, Packages, Views...Pah! -
T-SQL LOOP
This will fix your select issue when nothing is returned:

DECLARE @Vehicle_Registration AS VARCHAR(25)

FETCH NEXT FROM ContactsCursor INTO @Contact_ContactID
WHILE (@@FETCH_STATUS = 0)
BEGIN
    SELECT @Vehicle_Registration = V.Registration
    FROM tblVehicle V (NOLOCK)
    WHERE V.ContactID = @Contact_ContactID
    ...
    FETCH NEXT FROM ContactsCursor INTO @Contact_ContactID
END
-
Dataset containing nullable datetime values does not like a null datetime?
When you write values back to the database you can simply reverse the logic:

IIF(ds("DataColumn") = "01/01/1900", Nothing, ds("DataColumn"))
-
Dataset containing nullable datetime values does not like a null datetime?
You can still use the designer to bind your controls to the dataset, but I personally would use an n-tier approach and create a class to fill the dataset. Off the top of my head it will go something like this, but you may want to Google SqlConnection and SqlCommand for the proper implementation:
Public Class MyDataLayer
    Public Function FillDataset() As MyDataset
        Dim results As New MyDataset
        ' SqlConnection takes a single connection string, not separate arguments
        Dim con As New SqlConnection("Server=myServer;Database=myDatabase;User Id=username;Password=password")
        Dim qry As New SqlCommand("select * from mytable", con)
        con.Open()
        Dim rdr As SqlDataReader = qry.ExecuteReader()
        While rdr.Read()
            Dim rw As DataRow = results.Tables(0).NewRow()
            ' substitute a default date for NULLs so bound controls do not choke
            rw("DateColumn") = IIf(IsDBNull(rdr(0)), "01/01/1900", rdr(0))
            results.Tables(0).Rows.Add(rw)
        End While
        rdr.Close()
        con.Close()
        Return results
    End Function
End Class

From your form's OnLoad event:

Private Sub MyForm_Load(ByVal sender As Object, ByVal e As EventArgs) Handles MyBase.Load
    Dim ldr As New MyDataLayer
    Me.DataSet1 = ldr.FillDataset()
End Sub
-
Dataset containing nullable datetime values does not like a null datetime?
Personally, I am not a fan of leaving values null; most controls bound to a dataset will throw a null exception. That said, what I prefer to do is fill the dataset with defaults on null:

Dim ds As DataRow = MyDataset.MyDataTable.NewRow()
ds("MyDateColumn") = IIf(IsDBNull(MyDbColumn), "01/01/1900", MyDbColumn)
MyDataset.MyDataTable.Rows.Add(ds)
-
Help spawning more than 25 threads per processor in a thread pool
This is good to hear. Even though the collection will hold 165 objects, it's unlikely I will actually spawn all 165 threads at once. Each object will represent a POP3 client that will pop a mailbox, return all its mail, perform some formatting and auto-alerting, store the results in a DB, and then die. The reason for spawning as many as possible at once is to reduce the time between pops; the reason for passing messages back is for the management interface to watch activity through ipclient....
-
Copy File From One Location To Another + Urgent + Sourabh Das

Private Sub btnTransfer_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles btnTransfer.Click
    If Not IO.Directory.Exists("D:\TransferTest\") Then
        IO.Directory.CreateDirectory("D:\TransferTest\")
    End If
    ' File.Copy needs a full destination file name, not just the folder
    IO.File.Copy(filename, IO.Path.Combine("D:\TransferTest\", IO.Path.GetFileName(filename)))
End Sub
-
Help spawning more than 25 threads per processor in a thread pool
I need some guidance. I have a collection of objects, 166 to be specific; each object in the collection will spawn a thread to perform a long-running task, and I also want to pump messages back to the calling thread. ThreadPool would provide the solution, but unfortunately MS has limited the number of threads to 25 per processor, and simple math tells me that I would need a huge box. So another approach would be to loop from 0 to the thread pool maximum and spawn an instance of a thread, pulling the object ID from the collection based on the thread ID. That's fine for the first 50 objects, but how would I then pull the remaining objects from the collection? Example:

For i As Integer = 0 To ThreadPool.MaxThreads - 1
    Dim node As BaseNode = BaseCollection(i)
    Dim worker As New WorkerNode(node)
    AddHandler worker.MessageBack, New MessageHandler(AddressOf OnMessagesReceived)
    worker.DoWork()
Next

Assuming a dual CPU box, the max thread value will be 50, but my BaseCollection holds 166 objects, so I am stuck spawning objects 51-165.
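One way around the cap described above is not to map one thread per object at all, but to queue every work item and let the pool drain them at its own pace: ThreadPool.QueueUserWorkItem holds requests beyond the 25-per-processor limit in a queue rather than rejecting them, so objects 51-165 simply wait their turn. A minimal sketch, reusing the hypothetical WorkerNode/BaseNode names from the post above:

```vb
Imports System.Threading
Imports System.Collections.Generic

Module Spawner
    Sub SpawnAll(ByVal baseCollection As List(Of BaseNode))
        ' Queue all 166 items up front; the pool runs at most its
        ' configured maximum concurrently and dequeues the rest as
        ' pool threads become free.
        For Each node As BaseNode In baseCollection
            Dim worker As New WorkerNode(node)
            AddHandler worker.MessageBack, AddressOf OnMessagesReceived
            ThreadPool.QueueUserWorkItem(AddressOf RunWorker, worker)
        Next
    End Sub

    Private Sub RunWorker(ByVal state As Object)
        ' Runs on a pool thread; DoWork is the long-running task.
        DirectCast(state, WorkerNode).DoWork()
    End Sub
End Module
```

Note that messages raised from pool threads arrive on those threads, so the management interface would still need Control.Invoke (or similar marshalling) before touching any UI controls.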
-
Protect yourself - download the source code
I will only adapt and/or purchase a third-party control if the source is available and it is clearly stated that I can do anything I want with the code as long as it does not end up competing with the original product, much like DevExpress. Who would have thought that Nortel, GM, and Chrysler would be in this kind of financial mess? As a component shop, there will be no one there to bail you out. A company that offers the source code along with the product illustrates that it stands behind its product and is ensuring that its customers' interests come first. Microsoft is a prime example: its best interest is its bank account, nothing more, nothing less.
-
If I actually wanted to become a Sql Server DBA, what would I read?
If you want to get down and dirty writing code and designing tables, because at the end of the day that is what matters most, then I would pick up a book on Sybase. The methodology behind Sybase applies to MS SQL; you're just not filling your brain cells with marketing garbage.