Posts
-
Implicit conversion varchar to money
It seems to me that you would do something like this instead:
Case When IsNull(LP.RefinanceAmount, '') = '' Then cast(L.LoanAmount as varchar(20)) Else 'N.F' + cast(LP.RefinanceAmount as varchar(20)) End as [Amount],
Web services and SoapExceptions
If you modify the auto-generated class that wraps the web service you can get the raw XML. I was more or less just curious about whether the detail I added to my SoapException was being sent to the consumer correctly. See OmegaMan's article: http://www.codeproject.com/cs/webservices/SerializationFun.asp
-
Web services and SoapExceptions
I have a web service method that throws a SoapException if it hits any errors, but when I test the service via the services.asmx?op=webserviceMethod page and an exception is thrown, the page shows a 500 error. I expected it to show some XML with the error message I created, but maybe I'm missing something. Question: is there a way to view the SOAP error message (XML) that's part of the SoapException instead of the 500 error?
-
how to get column wise data in a row in sql?
If you are using SQL Server 2005 you can use the PIVOT and UNPIVOT commands in your SQL code to go from rows to columns or vice versa. It's pretty easy to use; a sketch is below.
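A minimal PIVOT sketch, assuming a hypothetical Sales table with Product, Quarter, and Amount columns (the names are illustrative, not from the original post):

SELECT Product, [Q1], [Q2], [Q3], [Q4]
FROM (SELECT Product, Quarter, Amount FROM Sales) AS src
PIVOT (SUM(Amount) FOR Quarter IN ([Q1], [Q2], [Q3], [Q4])) AS pvt
-- each result row shows one product with its quarterly totals spread across columns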
-
Doubt in Stored procedure
Correct me if I am wrong, but won't the IN operator have problems with a varchar string that happens to have commas? http://www.projectdmx.com/tsql/sqlarrays.aspx You might also want to consider dynamic SQL (a sketch follows). I have a feeling that on a lot of records you could get better performance.
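A minimal dynamic SQL sketch, assuming a hypothetical Orders table and a comma-separated @IdList of integer keys (both names are illustrative):

DECLARE @IdList varchar(100)
SET @IdList = '1,2,3'                       -- would come from the caller in practice
DECLARE @sql nvarchar(500)
-- the list is spliced into the statement as text, so IN sees real values rather than one string
SET @sql = N'SELECT * FROM Orders WHERE OrderId IN (' + @IdList + N')'
EXEC sp_executesql @sql

The usual caveat applies: only build SQL like this from values you control, since concatenating user input opens the door to injection.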
-
Temporary Tables - SQL Stored Procedures
If you call the 10 procedures from the main procedure you don't need the ##temp; a local #temp will work and will fix your problem. This assumes that the connection/session is new on each page view or call to the main procedure.
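A sketch of that scoping behavior, with hypothetical procedure names: a local #temp created in the outer procedure is visible to the procedures it calls, so the global ##temp isn't needed.

CREATE PROC usp_Main
AS
    CREATE TABLE #Work (Id int, Val varchar(50))   -- local temp table, scoped to this call
    EXEC usp_Child1                                -- nested procedures can read and write #Work
    EXEC usp_Child2
    SELECT * FROM #Work                            -- dropped automatically when usp_Main ends
GO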
-
Delete Insert VS. Update Insert
Thanks for the information. Do inserts always cause fragmentation, or is this more of a problem with bulk inserts?
-
Delete Insert VS. Update Insert
I have a nightly process (SSIS package) that does a data dump from several tables used for order entry to tables used for reporting. My question is whether there is a general rule, from a performance standpoint, that says whether it is faster to update the existing rows in the reporting table and then insert the new ones, or to just delete for a range and then do a bulk insert. The number of records being updated/inserted would be anywhere from 2 to 150,000. Assume that appropriate indexes will be used for the deletes and updates.
-
grouping data in terms of time (weeks, months, year etc)
If you have it grouping correctly, is the problem what is getting displayed? I ran a query like the one below on an example table and both the grouping and the display seem to work:

select count(1), max(datename(month, invcDate) + ', ' + datename(year, invcDate))
from invc
group by datename(month, invcDate) + ', ' + datename(year, invcDate)

I had to add the max around the month string since it wasn't an aggregate, but it displays and groups.
-
grouping data in terms of time (weeks, months, year etc) -
Filling a typed dataset
If you haven't already solved your problem, I think it would be helpful to clarify what part you are having a problem with. Are you trying to change the stored procedure, or add a customer row to the datagrid after you bind the data?
-
Query Help -
Using parameter on Query condition not working
The easiest way to make the column in the where clause dynamic is to build the query in the procedure and then execute the SQL code. Example from the article on dynamic SQL:

use Northwind
go
declare @RECCNT int
declare @ORDID varchar(10)
declare @CMD nvarchar(100)
set @ORDID = 10436
set @CMD = 'SELECT @RECORDCNT = count(*) from [Orders]' + ' where OrderId < @ORDERID'
print @CMD
exec sp_executesql @CMD, N'@RECORDCNT int out, @ORDERID int', @RECCNT out, @ORDID
print 'The number of records that have an OrderId' + ' less than ' + @ORDID + ' is ' + cast(@RECCNT as char(5))

Another idea, if you only have a couple of possible columns, is something like this:

Select *
From table
where column1 = isnull(@Column1Value, column1)
  and column2 = isnull(@Column2Value, column2)

This might not perform well on a large table.
-
Querying Distinct Users
select userid, ipaddress, count(1) as [number of unique ipaddress]
from AU_commontable
where result = 1 and actiontaken = 'firsttimesignon'
group by userid, ipaddress
having count(1) > 1
-
Variable with multiple values?
http://www.projectdmx.com/tsql/sqlarrays.aspx The link above gives you three examples of how you could do this.
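One of the simpler patterns from that general family, as a sketch: treating the variable as a delimited string and matching against it with charindex (the @List and Orders names are illustrative, not from the post).

DECLARE @List varchar(100)
SET @List = '10,20,30'                  -- the "multiple values" packed into one variable
SELECT *
FROM Orders
-- wrap both sides in commas so that 2 does not accidentally match 20, 12, etc.
WHERE charindex(',' + cast(OrderId as varchar(10)) + ',', ',' + @List + ',') > 0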
-
Dynamic Select query issue
I'm not sure I totally understand your requirements, but would the following work?

Select firstname, lastname, dob, ssn,
    case when (firstname = 'abc' and lastname = 'xyz')
           or (firstname = 'ypx' and lastname = 'por')
           or (firstname = 'stup' and lastname = 'rob')
         then 'Exact Match'
         else 'Potential Match'
    end as [MatchType]
from person
where (firstname like 'abc%' and lastname like 'xyz%')
   or (firstname like 'ypx%' and lastname like 'por%')
   or (firstname like 'stup%' and lastname like 'rob%')
-
Help required in stored procedures
As far as I know you'd have to alter the proc to change its parameters. You could do this programmatically, but at that point you might as well just do dynamic SQL. Could you use a comma-separated list of values as parameters? Do you have some example code for the stored procedure and what it's supposed to do?
-
Update query [modified] -
hi
1. Stored procedures allow default values for parameters; user-defined functions do not.
2. Stored procedures let you use temp tables; user-defined functions only allow table variables.
3. You can use table-valued user-defined functions for normal table functionality such as joins, select *, etc., which can be useful sometimes (see the sketch after this list).
4. Both have cached execution plans.
In general I would suggest using procedures when you can and functions only when a procedure won't do the job, because functions can kill your performance when used in where clauses on queries with a lot of rows, or even in the select list on queries with a lot of rows. Also, functions calling functions makes sense to an application developer, and doing something like that in your database functions may be tempting, but IMHO it is a bad idea.
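A minimal sketch contrasting the two, with hypothetical object names: a procedure using a default parameter value (point 1) and an inline table-valued function used in a join-style query (point 3).

CREATE PROC usp_TopOrders
    @Top int = 10                           -- default value: allowed on a procedure
AS
    SELECT TOP (@Top) OrderId, OrderDate FROM Orders
GO

CREATE FUNCTION dbo.fn_OrdersByCustomer (@CustomerId int)
RETURNS TABLE
AS
RETURN (SELECT OrderId, OrderDate FROM Orders WHERE CustomerId = @CustomerId)
GO

-- the table-valued function behaves like a table, so it can sit in the FROM clause
SELECT c.CustomerId, o.OrderId
FROM Customers c
CROSS APPLY dbo.fn_OrdersByCustomer(c.CustomerId) o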
-
Procedure with multiple parameters and multiple Select Statements
You could use something like this: default the parameters to null, then coalesce or isnull the parameter. The example proc below returns one result set, but if you wanted to you could just as easily return one result set per parameter passed in.

Create Proc usp_SearchItems
    @pName nvarchar(75) = null,
    @pNSN nvarchar(16) = null,
    @pTPart nvarchar(20) = null,
    @pVPart nvarchar(50) = null
AS
Select *
From tblItems
Where (fldInactive = 0)
  and fldItem Like '%' + coalesce(@pName, fldItem) + '%'
  and fldNSN Like '%' + coalesce(@pNSN, fldNSN) + '%'
  ....