Copy data In table [modified]
-
Hi all, I have two tables: one is a live, real-time table where data is updated every 30 minutes. I created a duplicate table for backup purposes, and I want this backup to receive the data from the live table every 2 hours. In short, I have two tables: one live table and another that keeps a backup of the live table's data every two hours. Basically, I want the two tables to have the same data before the 30-minute update runs on the live table. How could I go about this problem? Thank you
modified on Tuesday, February 19, 2008 12:29 PM
Are you actually using the backup table? By differentiating between live and backup I'm assuming you're not. In that case, why not place the table in its own filegroup and back up that filegroup on a schedule that meets your needs? If you need to roll back, you can restore the filegroup by itself.
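A minimal sketch of the filegroup approach, assuming SQL Server 2005+; the database, file, filegroup, and table names here are placeholders, not anything from this thread:

```sql
-- Sketch only: MyDb, LiveDataFG, LiveDataFile, livedata are placeholder names.
-- 1) Create a dedicated filegroup and add a file to it.
ALTER DATABASE MyDb ADD FILEGROUP LiveDataFG;
ALTER DATABASE MyDb ADD FILE
    (NAME = LiveDataFile, FILENAME = 'C:\Data\MyDb_LiveData.ndf')
    TO FILEGROUP LiveDataFG;

-- 2) Move the table onto the new filegroup by rebuilding its
--    clustered index there (assumes a clustered index on ID).
CREATE CLUSTERED INDEX PK_livedata ON livedata (ID)
    WITH (DROP_EXISTING = ON) ON LiveDataFG;

-- 3) Back up just that filegroup on whatever schedule you need.
BACKUP DATABASE MyDb
    FILEGROUP = 'LiveDataFG'
    TO DISK = 'C:\Backup\MyDb_LiveDataFG.bak';
```

You would put step 3 in a SQL Server Agent job to run it on a schedule.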
Mark's blog: developMENTALmadness.blogspot.com
Thank you for your response, it's much appreciated. The back_up table is used by an application for reporting purposes. Instead of saying back_up, I should probably say another copy of the live table, refreshed every 2 hours. In other words, the two tables will have the same data once the two-hourly update runs; within those two hours the live table will have more data until the update is done. Thank you
Does the data in the production table change at all (UPDATE/DELETE)? Or is it always new data (INSERT)? One answer is to use Red Gate's SQL Data Compare. It's pretty cheap and you can write a script to sync the tables every two hours. Alternatively, you could dump/refresh the backup table yourself.
Method 1: This would work much like using the Red Gate tool.
1) Add a timestamp column to the livedata table and a binary column to the backupdata table.
2) Compare and update existing data: UPDATE b SET column = live.column FROM backupdata b INNER JOIN livedata live ON live.ID = b.ID WHERE live.stamp <> b.stampcopy
3) Insert all data from livedata that doesn't exist in backupdata.
Method 2:
1) Create a logging table that records which records in livedata have changed and the change that was made (INSERT, UPDATE, DELETE).
2) Create a trigger on livedata that adds a record to the logging table including ID, Action, ActionTime. (ActionTime could be a timestamp or datetime.)
3) Write a script that reads the log table and syncs the backupdata table on a schedule, applying each ID and action starting with the oldest ActionTime.
Method 3:
TRUNCATE TABLE backupdata
INSERT INTO backupdata (columns) SELECT columns FROM livedata
Method 4:
CREATE TABLE backupdata_temp ( columns..... )
INSERT INTO backupdata_temp (columns) SELECT columns FROM livedata
DROP TABLE backupdata
EXEC sp_rename ......
Either way, make sure you wrap the whole thing in a transaction, set the recovery model to BULK_LOGGED before you do it, and change the recovery model back when you're done. Also, if you use method 4, make sure you recreate any indexes that were on the backup table. The choice of methods will depend on the availability requirements for the backup table and the amount of data you're working with.
Mark's blog: developMENTALmadness.blogspot.com
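Putting those pieces together, a sketch of method 4 with the transaction and recovery-model wrapping described above; the database name, column list, and index name are placeholders you would replace with your own:

```sql
-- Sketch only: MyDb and the table/index names are placeholders.
ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;

BEGIN TRANSACTION;
    -- Build the fresh copy (SELECT INTO is minimally logged
    -- under the BULK_LOGGED recovery model).
    SELECT * INTO backupdata_temp FROM livedata;

    -- Swap the new copy in for the old one.
    DROP TABLE backupdata;
    EXEC sp_rename 'backupdata_temp', 'backupdata';

    -- Recreate any indexes the old backup table had, e.g.:
    CREATE CLUSTERED INDEX IX_backupdata_ID ON backupdata (ID);
COMMIT TRANSACTION;

ALTER DATABASE MyDb SET RECOVERY FULL;
```

Reporting queries against backupdata will block during the swap, so the transaction should be kept as short as possible.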
-
Hi Mark, Thank you once again, it has been very helpful. I am only INSERTing new records into the live table, no UPDATEs or DELETEs. Which method will best fit this case? Thank you.
-
Hi Mark, I would like to use the first method. Does it mean that I have to create a relationship between the live table and the backup table, since you have mentioned an INNER JOIN? Thank you.
An INNER JOIN does not require an explicit relationship, so no. You have an implicit relationship due to the fact that backuptable will have the same primary key as livetable, so you will be able to join the tables using the primary key from each. But if you are only doing inserts, then there are two solutions:
1)
INSERT INTO backuptable (columns)
SELECT columns FROM livetable L
WHERE NOT EXISTS (SELECT * FROM backuptable WHERE ID = L.ID)
2) If you are using an IDENTITY (aka AutoIncrement) column as your primary key, you could also do this:
INSERT INTO backuptable (columns)
SELECT columns FROM livetable
WHERE ID > (SELECT MAX(ID) FROM backuptable)
The second will be faster as long as there is an index on ID in both tables (which, if you have created them correctly and assigned them as primary keys, there will be).
Mark's blog: developMENTALmadness.blogspot.com
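One way to put the second solution on a two-hour schedule is to wrap it in a stored procedure and call that from a SQL Server Agent job. A sketch, assuming an IDENTITY primary key named ID; the procedure name and the `columns` list are placeholders:

```sql
-- Sketch only: dbo.SyncBackupTable and the column list are placeholders.
CREATE PROCEDURE dbo.SyncBackupTable
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO backuptable (ID, columns)
    SELECT ID, columns
    FROM livetable
    -- COALESCE covers the very first run, when backuptable is
    -- still empty and MAX(ID) returns NULL.
    WHERE ID > COALESCE((SELECT MAX(ID) FROM backuptable), 0);
END
```

The Agent job would then just run `EXEC dbo.SyncBackupTable;` every two hours.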