Not a good practice. Yet where I am now, we're doing something potentially worse. Because (as in your case) deployment of code changes takes a while, and the report is "Very Important! It goes all the way up to the CIO!", the report developers argued that, in order to provide the most accurate numbers, it has to be run against the most up-to-date code -- and that means running the production report on the DEV database* :sigh:

So while I'm trying to improve the quality of the data I provide to the reporting team, they keep complaining that the data changes while they're trying to run the report -- of course it's changing! It's DEV! They want to run in DEV so they can pick up changes quickly, but they don't want the data to change. I'm reminded of the line from "The Twelve Chairs": "Hurry home, but don't gallop." :^)

To make matters worse, I suspect the reporting team is now referring to DEV as PROD! :omg:

* This is basically just a warehouse of data that has been ETLed from other databases within the enterprise.

<kvetch>
Oh, and I forgot the latest wrinkle. My ETLs fill staging, then someone else reads staging to populate a Data Vault, then someone else reads that to populate a cube -- and lastly a report is run. Now the reporting team wants me to tell them about every change I make that will impact their report. I have no idea what their report entails, no idea what parts of the cube are involved, no idea what parts of the Data Vault the cube uses, and no idea what parts of staging go where in the Data Vault -- yet I'm supposed to know exactly how this data is used three levels downstream of me?! I don't even know what most of the data in staging means; I just copy it from other places.

And, get this, they want me to tell them all of this so they don't have to waste their time investigating fluctuations they spot in the report. :wtf:

I'm so glad I'm on vacation this week.
</kvetch>