Data on one server, structure on another [modified]

Forum: Database
Tags: database, sql-server, sysadmin, question
22 posts, 5 posters
_Zorro_ (#7), in reply to Corporal Agarn:

Corporal Agarn (djj55) wrote:

What I was talking about with the C: drive was that if you bcp (export) data, it saves to the server you are running on. You could try to use an alias for your database, but I have only created those while studying (not to be confused with a table alias in a query). If that does not work, I do not know of a way to change where the stored procedure looks. This rings a bell, but I cannot remember.

Ok, thanks for your time! :)

smcnulty2000 (#8), in reply to _Zorro_:

_Zorro_ wrote:

Hi, I wondered whether SQL Server somehow supports doing that. By that, I mean having the structure in one database and the data in another. The purpose is to be able to launch a set of integration tests that would get the data from the test database but would execute the stored procedures from the original database. I didn't find much on Google, so I suppose this isn't possible. Edit: I didn't say that I'd like to leave the stored procedures as they are; prefixing table names with a database would be too much work. Ideally it should be a config parameter or something like that. (modified on Friday, July 22, 2011 8:06 AM)

Try CREATE SYNONYM. I think that's your best bet.

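A minimal sketch of the synonym idea under assumed names (none of them come from the thread): the stored procedures live in a database called StructureDb that holds no tables, and the tables live in TestDataDb. A synonym that carries the table's name lets an unqualified dbo.Orders reference inside the procedures resolve to the other database's table.

    -- All object names here are hypothetical, for illustration only.
    USE StructureDb;
    GO
    -- The synonym stands in for a local table, so procedures in StructureDb
    -- that reference dbo.Orders without a database prefix read TestDataDb's data.
    CREATE SYNONYM dbo.Orders FOR TestDataDb.dbo.Orders;
    GO
    -- Any query or procedure in StructureDb now resolves dbo.Orders
    -- through the synonym.
    SELECT TOP (5) * FROM dbo.Orders;
    GO

Repointing an environment would then mean dropping and recreating the synonyms rather than editing the procedures, which is presumably what smcnulty2000 has in mind.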
Mycroft Holmes (#9), in reply to _Zorro_:

Still doesn't make sense to me. Why not take a copy of the procedures and run them on the test server? We use SQL Compare to do this, except we do it the other way around: develop on the dev server, test on UAT, and deploy to production; SQL Compare does the syncing of the code in both steps. When we require another environment, we take a backup of production onto yet another server. Then the only change is the server name, and we don't use that in the qualification in the code. I admit this may become complicated if you are using distributed databases, but we use Oracle for any really big data (multi-TB).

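A minimal sketch of the "backup of production onto yet another server" step Mycroft describes, with hypothetical database name, file paths, and logical file names, since none are given in the thread:

    -- On the test server: restore the production backup under the same database
    -- name, so code that qualifies objects as ProdDb.dbo.X keeps working and only
    -- the server name differs (ProdDb, the paths, and the logical names are assumed).
    RESTORE DATABASE ProdDb
    FROM DISK = N'D:\Backups\ProdDb.bak'
    WITH MOVE N'ProdDb_Data' TO N'D:\Data\ProdDb.mdf',
         MOVE N'ProdDb_Log'  TO N'D:\Data\ProdDb_log.ldf',
         REPLACE;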
smcnulty2000 (#10), in reply to Mycroft Holmes:

Mycroft Holmes wrote:

but we use Oracle for any really big data (multi TB)

I'm jealous. I've only gotten to big gig levels.

Mycroft Holmes (#11), in reply to smcnulty2000:

We recently sized an Oracle system at 6 TB; we can identify 3 TB as the initial data coming from part of the trading system. Really big data is only useful if your structure is good and your data is clean. Unfortunately, our legacy systems were designed by incompetent, moronic, lazy, uneducated idiots who only had a vague idea about the business and absolutely no idea about data structures.

David Skelly (#12), in reply to Mycroft Holmes:

Mycroft Holmes wrote:

our legacy systems were designed by incompetent, moronic, lazy, uneducated idiots who only had a vague idea about the business and absolutely no idea about data structures

This is in fact the standard methodology for database development in 99% of companies worldwide.

_Zorro_ (#13), in reply to smcnulty2000:

smcnulty2000 wrote:

Try CREATE SYNONYM. I think that's your best bet.

That's not bad, but it still requires me to update all my stored procedures... not cool! :sigh:

_Zorro_ (#14), in reply to Mycroft Holmes:

Mycroft Holmes wrote:

Still doesn't make sense to me. Why not take a copy of the procedures and run them on the test server?

Because we're using continuous integration with TFS 2010, and we want to launch the test sets while the build server is... building. If we have to copy our stored procedures to the test database, that will require a manual action each time an SP is updated, which I think is a bad idea. Fewer humans around, fewer problems. Apart from checking for bugs, we'd also like to guarantee that the deploy succeeds regardless of the environment (dev, tst, post test, production). I know this seems a bit strange since we should only test on one of those, but we have numerous errors due to deployment issues (a large number of files to merge, a lot of manual actions) and we'd like to prevent that, or at least be warned as soon as possible. Thanks

smcnulty2000 (#15), in reply to _Zorro_:

_Zorro_ wrote:

That's not bad, but it still requires me to update all my stored procedures... not cool!

I'm not clear as to why. If the synonym is on the test server and points to the code on the production server, then any change to the prod server should immediately be available to the test server. You run the code on the test server, but it is the code from production. Or am I not seeing the problem?

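A minimal sketch of the arrangement smcnulty2000 is describing, assuming a hypothetical linked server PRODSRV and a procedure dbo.usp_GetOrders in a ProdDb database. Synonyms accept four-part names, but executing a remote procedure through one requires the linked server's "RPC Out" option to be enabled, and (as the next post notes) results through a linked server were mixed in practice, so treat this as an experiment rather than a recipe.

    -- On the test server (all names are hypothetical): point a local synonym
    -- at the production procedure through a linked server.
    CREATE SYNONYM dbo.usp_GetOrders
        FOR PRODSRV.ProdDb.dbo.usp_GetOrders;
    GO
    -- Calls on the test server now run the procedure as deployed in production;
    -- changes to the production copy are picked up immediately, with no copy step.
    EXEC dbo.usp_GetOrders;
    GO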
smcnulty2000 (#16), in reply to _Zorro_:

Never mind. I'm trying it on my servers and can't seem to see anything but views and tables through my linked server. So maybe the other way around would work. I'm curious enough to continue researching this, but let us know if you decide on a solution.

Mycroft Holmes (#17), in reply to _Zorro_:

_Zorro_ wrote:

continuous integration with TFS 2010

I've never been able to get the hang of automated testing/deployment. I guess I'm old school: I want a human driving the deployment.

_Zorro_ (#18), in reply to Mycroft Holmes:

It works fine for us, but we still have to do some stuff manually. We start having huge problems when humans begin to interact. That's actually why we wanted to validate each deployment step, because there's always something wrong: a file not merged correctly, something forgotten on the database, etc., and it implies a big loss of time to go and repair everything... each time... I suppose your humans are better than ours! :)

_Zorro_ (#19), in reply to smcnulty2000:

Here's why:

    CREATE SYNONYM [ schema_name_1. ] synonym_name FOR <object>

    <object> :: =
    {
        [ server_name. [ database_name ] . [ schema_name_2 ].
        | database_name . [ schema_name_2 ].
        | schema_name_2. ] object_name
    }

    USE tempdb;
    GO
    -- Create a synonym for the Product table in AdventureWorks2008R2.
    CREATE SYNONYM MyProduct FOR AdventureWorks2008R2.Production.Product;
    GO
    -- Query the Product table by using the synonym.
    USE tempdb;
    GO
    SELECT ProductID, Name
    FROM MyProduct
    WHERE ProductID < 5;
    GO

Maybe I'm not seeing it, but how would you manage to use the same procedure and call different databases depending on who called the procedure? Let's say we have db1 and db2 and we're doing:

    SELECT A, B, C FROM dbo.Table

What I need is to read the data from db1 if the application made the call, but read the data from db2 if the call was made by the build server (the unit/integration tests).

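An aside on the question _Zorro_ raises: inside a stored procedure, an unqualified table name resolves in the database that owns the procedure, not in the caller's current database, so "who called it" never changes which data it reads. A small illustration under assumed names (Db1, Db2, dbo.SomeTable, and usp_GetRows are all hypothetical):

    USE Db1;
    GO
    CREATE PROCEDURE dbo.usp_GetRows
    AS
        -- The unqualified name resolves inside Db1, the database that owns
        -- this procedure, regardless of the caller's current database.
        SELECT A, B, C FROM dbo.SomeTable;
    GO
    USE Db2;
    GO
    -- Still reads Db1.dbo.SomeTable, not Db2.dbo.SomeTable.
    EXEC Db1.dbo.usp_GetRows;
    GO

Switching the data source without touching the procedure text therefore means changing what dbo.SomeTable resolves to inside Db1 (for example, via a synonym), which is exactly the sticking point of the thread.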
_Zorro_ (#20), in reply to smcnulty2000:

smcnulty2000 wrote:

I'm curious enough to continue researching this but let us know if you decide on a solution.

I don't think there's one... but I will let you know if I find one, indeed.

Mycroft Holmes (#21), in reply to _Zorro_:

_Zorro_ wrote:

I suppose your humans are better than ours

Ow, crap, I just had to pick myself up off the floor. Possibly we have simpler deployments, and I'll grant that we have a very good team of developers, but the main systems deployment? Not a chance!

_Zorro_ (#22), in reply to Mycroft Holmes:

Mycroft Holmes wrote:

possibly we have simpler deployments

Heh :) We have quite complex deployment issues right now, indeed. We're maintaining a lot of branches for our customers, and it's not always as easy as it's supposed to be when the time comes to merge everything back together after a long period of time.
