to all your posts, and somehow feel how much time has passed, and how stupid you were? I was reading a long thread with John Simmons; at that time I had just two years of business experience. Now I have just a little bit more, and I absolutely do not agree with myself at that time... Here is the thread in question, from 2006... http://www.codeproject.com/Lounge.aspx?msg=1740465#xx1740465xx[^]
_Zorro_
Posts
-
Do you ever go back... -
Embarrassing code admission of the day (or why C.S. is good for you)Pete O'Hanlon wrote:
what happens when you remove at 0? How is this handled in terms of resizing when you remove from the start of the list
I see it now, thanks!
-
Embarrassing code admission of the day (or why C.S. is good for you)Oh, what would be a better approach? ElementAt? I thought it would be the same...
-
Embarrassing code admission of the day (or why C.S. is good for you)What's wrong with that?
-
Embarrassing code admission of the day (or why C.S. is good for you)I guess you were expecting a queue or a stack maybe?
-
Making multiple teams coexistThank you for your answers
-
Making multiple teams coexistThank you jschell. What you're describing about the BL is more or less what we started. We made a project where we allow other developers to override the implementation of our functions: if we find that DLL (or those DLLs), we use theirs; otherwise we use ours. It works great, actually. But the main concern is the UI. We do not want to have to maintain a lot of duplicated (replaced) pages. What we were thinking was to expose some events and let them write their code dynamically, to be interpreted at runtime. What do you think about that solution? Do you see any drawbacks I may not be anticipating? Thanks again.
-
Making multiple teams coexistHi, I'll make it short and clear (I hope). We have two teams developing our product: R&D and the 'Integrators'. I'm not sure what the Integrators are called in English, maybe TPAM (Third Party Application Maintenance); anyway, let's say that Integrators = people working on the base product, adding or changing functionality for a specific customer, with the R&D team being in charge of what's common to all of them (the customers). The R&D team develops functionality available to all our customers, and the other team only does specific developments for specific customers. The problem we're dealing with is that both teams may develop on the same web pages, controls, business logic, etc. What happens is that our Release Manager deploys what the R&D team has been doing into some sort of common workspace, and he can't tell what's from one team or the other, so a lot of the time the Integrators' work is lost and our release manager spends hours finding the 'lost changesets' and re-applying those modifications. We need to solve this problem, and what we have come up with so far is to develop some sort of API exposing all the common ASP.NET events (page_init, load, render, etc.); we would also fire events when entering/exiting BL and DAL functions, allowing the Integrators to develop 'manually', maybe injecting some ASP.NET code into the existing pages, plus some code-behind, JavaScript, etc. I'm not really sure if this is a reasonable idea or not; we've only been thinking about it for about ten minutes, and I wondered what you would recommend in this kind of situation. This is just an idea, because they may develop everywhere... Web, BL, DAL and database (we have a project for that).
Just to be clear, we're developing a product (I'm not sure what it's technically called in English), and our application will keep growing over time, so we really need to do this right; otherwise it will cost us a lot of time and money to correct later, once we have people using our 'API' (I'm calling this an API but I may be wrong about that). Anyway, we're stuck here. I hope I was clear enough; if not, tell me and I'll try again! Thanks in advance. PS: I posted this in both the ASP.NET and Design and Architecture forums; I apologize if this is not OK, and you're free to delete one of them.
-
Data on one server, structure on another [modified]Mycroft Holmes wrote:
possibly we have simpler deployments
Heh :) We have quite complex deployment issues right now indeed. We're maintaining a lot of branches for our customers, and it's not always as easy as it's supposed to be when the time comes to merge everything back together after a long period of time.
-
Data on one server, structure on another [modified]smcnulty2000 wrote:
I'm curious enough to continue researching this but let us know if you decide on a solution.
I don't think there's one... but I will let you know if I find one, indeed.
-
Data on one server, structure on another [modified]Here's why:
CREATE SYNONYM [ schema_name_1. ] synonym_name FOR <object>

<object> ::=
{
    [ server_name. [ database_name ] . [ schema_name_2 ] .
    | database_name . [ schema_name_2 ] .
    | schema_name_2.
    ] object_name
}

USE tempdb;
GO
-- Create a synonym for the Product table in AdventureWorks2008R2.
CREATE SYNONYM MyProduct FOR AdventureWorks2008R2.Production.Product;
GO
-- Query the Product table by using the synonym.
USE tempdb;
GO
SELECT ProductID, Name FROM MyProduct WHERE ProductID < 5;
GO
Maybe I'm not seeing it, but how would you manage to use the same procedure and call different databases depending on who called the procedure? Let's say we have db1 and db2 and we're doing:

SELECT A, B, C FROM dbo.Table

What I need is to read the data from db1 if the application made the call, but read it from db2 if the call was made by the build server (the unit/integration tests). -
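One way the synonym suggestion above could address the "who called?" question, sketched rather than taken from the thread: point a synonym at the right database at deployment time, so the stored procedures themselves never change. All names here (db1, db2, dbo.MyTable, the 'build_agent' login) are hypothetical, and note a caveat: a synonym is a database-level object shared by every session, so this only isolates callers if each caller uses its own database (e.g. the build server gets its own copy of the schema) or the redirect is performed once per environment, not per call.

```sql
-- Sketch: redirect dbo.MyTable without touching the stored procedures.
-- All object and login names are illustrative.
IF EXISTS (SELECT 1 FROM sys.synonyms
           WHERE name = 'MyTable' AND schema_id = SCHEMA_ID('dbo'))
    DROP SYNONYM dbo.MyTable;

IF ORIGINAL_LOGIN() = 'build_agent'                    -- the build server's login
    CREATE SYNONYM dbo.MyTable FOR db2.dbo.[Table];    -- tests read db2
ELSE
    CREATE SYNONYM dbo.MyTable FOR db1.dbo.[Table];    -- the application reads db1
GO

-- Existing procedures then reference the synonym instead of the table:
SELECT A, B, C FROM dbo.MyTable;
```

Run as a deployment step rather than per call; if both callers hit the same database concurrently this won't isolate them, and deploying the same procedures to a dedicated test database whose synonyms point at the test data is the safer variant.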
Data on one server, structure on another [modified]It works fine for us, but we still have to do some things manually. We start having huge problems when humans begin to interact. That's actually why we wanted to validate each deployment step: there's always something wrong. A file not merged correctly, something forgotten on the database, etc., and it means a big loss of time to go and repair everything... every time... I suppose your humans are better than ours! :)
-
Single row for multiple results -
Single row for multiple resultsSTUFF is what you're looking for indeed.
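For readers landing on this exchange, the usual SQL Server pattern the reply alludes to is STUFF combined with FOR XML PATH to collapse multiple rows into one delimited string. The table and column names below are made up for illustration:

```sql
-- Hypothetical data: several products per customer.
CREATE TABLE #Orders (CustomerID int, Product varchar(50));
INSERT INTO #Orders VALUES (1, 'Apples'), (1, 'Pears'), (2, 'Plums');

-- Concatenate each customer's products into a single row.
-- STUFF(..., 1, 2, '') strips the leading ', ' from the XML-built string.
SELECT o.CustomerID,
       STUFF((SELECT ', ' + Product
              FROM #Orders
              WHERE CustomerID = o.CustomerID
              ORDER BY Product
              FOR XML PATH('')), 1, 2, '') AS Products
FROM #Orders AS o
GROUP BY o.CustomerID;

DROP TABLE #Orders;
```

On SQL Server 2017 and later, STRING_AGG(Product, ', ') does the same thing more directly.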
-
Data on one server, structure on another [modified]Mycroft Holmes wrote:
Still does not make sense to me. Why not take a copy of the procedures and run them on the test server.
Because we're using continuous integration with TFS 2010, and we want to launch the test sets while the build server is... building. If we have to copy our stored procedures to the test database, that will require a manual action every time an SP is updated, which, I think, is a bad idea. Fewer humans around, fewer problems. Apart from catching bugs, we'd also like to guarantee that the deployment succeeds regardless of the environment (dev, test, post-test, production). I know this seems a bit strange, since we should only test on one of those, but we have numerous errors due to deployment issues (a large number of files to merge, a lot of manual actions) and we'd like to prevent that, or at least be warned as soon as possible. Thanks
-
Data on one server, structure on another [modified]That's not bad, but it still requires me to update all my stored procedures... not cool! :sigh:
-
Data on one server, structure on another [modified]djj55 wrote:
This rings a bell but I cannot remember.
Ok, thanks for your time! :)
-
Data on one server, structure on another [modified]djj55 wrote:
If you try running a stored procedure that stores to the c:\ drive, see where the file ends up.
Are you talking about SQL Server, or Windows Server? Checking the location of a file written by a stored procedure makes me believe you mean the latter. If you have a SQL Server instance with two databases, db1 and db2, and you create a stored procedure only on db2, trying to run that stored procedure from db1 won't work. Check the details of
EXEC sp_who
and you'll see that the dbname is specified. We too are referencing the schema, and we still use dbo too, by the way. I think the only way would be prefixing all our table/view/etc. calls with the database name, where the name would be a parameter (if we're running tests, the data should be retrieved from the test database; otherwise it should come from the context's database). Using a parameter would mean converting all our procedures to dynamic SQL. Considering that we have almost 3,000 stored procedures, this is not an option, as said before. I was wondering if SQL Server supported some sort of parameter indicating that it should read the data from another database while executing a procedure from the current one (without having to edit the stored procedures; this is a key factor). -
Data on one server, structure on another [modified]djj55 wrote:
The stored procedure is executed on a server not a database. You can have multiple databases on one server.
Not so sure about that; I may be wrong, though. We do have multiple databases on one single server, but some databases contain some procedures and other databases contain others, so I'd say the database is the procedure's owner. Anyway, linked servers may indeed be a solution, but that would require prefixing the table names with the server/database name. We simply have too many stored procedures, so this is not an option right now. Thanks
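The ownership point being debated above can be demonstrated in a few lines; the database and procedure names here are invented:

```sql
-- A stored procedure is owned by a database, not by the server as a whole.
USE db2;
GO
CREATE PROCEDURE dbo.GetStuff AS SELECT 1 AS X;
GO
USE db1;
GO
EXEC dbo.GetStuff;        -- fails: the procedure does not exist in db1
EXEC db2.dbo.GetStuff;    -- works: the three-part name crosses databases
GO
```

So both posters are partly right: procedures live on the server, but each is scoped to one database and must be qualified (or the context switched) to call it from another.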