Feeding parameters from a C# app into an SSIS package that converts query results to CSV and FTPs them [modified]
Hi guys. This post builds on the solution from my previous post here on Jun 29th.

Synopsis: I'm building a scheduling/routing application on top of a data-driven app living in SQL Server that identifies customers who have dropped our service, for "Winback" call center campaigns (and the like). The logic for each campaign is encapsulated in a query run by a stored procedure that lives on our Reporting database (it runs there because Reporting has the $**T indexed out of it). That stored procedure is executed from our local production database across a server link.

My application knows absolutely nothing about the underlying stored proc. It knows only the command it wishes to execute for a particular campaign, and I have a mapping table that stores a pointer to the remote stored proc. I expose a brokerage entry point to my application as a stored proc that accepts parameters, dips into the local mapping table, pulls the name of the remote stored proc, executes it across the database link, receives the result set, generates a GUID for the batch, inserts the results into a local table called "Solicitations" with the GUID as a correlator, and returns the GUID to the calling client.

...I was proud that I was able to accomplish all of that. I am nowhere *NEAR* what one would call a "DBA", and all of my "education" is in Oracle9i; I'm currently "working" in SQL Server 2008 R2.

But now: all of the previous work has been the "sourcing" side of our process. Getting the data out of Reporting and into our local table is just the data pump that allows our campaigns to run.
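The brokerage proc described above could look something like the following. This is a minimal sketch only; every object name (CampaignMapping, Solicitations, the REPORTING linked-server path, the CustomerId/Phone columns) is an illustrative assumption, not the real schema.

```sql
-- Hedged sketch of the brokerage entry point; names are assumptions.
CREATE PROCEDURE dbo.usp_SourceCampaign
    @CampaignCode varchar(50)
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @RemoteProc nvarchar(400),
            @BatchGuid  uniqueidentifier = NEWID(),
            @Sql        nvarchar(max);

    -- Resolve the campaign to its remote stored proc via the mapping table
    SELECT @RemoteProc = RemoteProcName
    FROM   dbo.CampaignMapping
    WHERE  CampaignCode = @CampaignCode;

    -- Stage the remote result set; INSERT...EXEC works across a linked server
    -- when @RemoteProc holds a four-part name, e.g. REPORTING.RptDb.dbo.usp_Winback
    CREATE TABLE #Stage (CustomerId int, Phone varchar(20));
    SET @Sql = N'INSERT INTO #Stage EXEC ' + @RemoteProc;
    EXEC (@Sql);

    -- Correlate the batch with a GUID and persist it locally
    INSERT INTO dbo.Solicitations (BatchGuid, CustomerId, Phone)
    SELECT @BatchGuid, CustomerId, Phone
    FROM   #Stage;

    -- Return the GUID to the calling client
    SELECT @BatchGuid AS BatchGuid;
END
```

Because the proc name comes out of the mapping table at run time, the INSERT...EXEC has to go through dynamic SQL; the application itself never sees the remote proc's name.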
After the calling client receives the GUID as its return value, I need to pass that GUID into an SSIS package that can receive the GUID, identify the campaign being run, dip into the mapping table to pick up the filename associated with that campaign, pull the relevant rows out of the Solicitations table, map them to a canonical format, and FTP them to a server that acts as an integration point between our Customer/Account system and the system that manages call routing for our call center.

I have created an SSIS package with a Data Flow task that does a flat-file export of the Solicitations table to CSV as "testy.csv" on my local file system, and then an FTP task that picks the resulting file up and pushes it (port 21) to the integration point. I went to the server, picked the file up, and it was the file I was expecting. All of the guts of the "happy path" flow are there.
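Pulling "the relevant rows" for one batch amounts to filtering Solicitations by the GUID. A sketch of the kind of parameterized source query the Data Flow's source component could use, with the caveat that the column names here are illustrative assumptions:

```sql
-- Hedged sketch: parameterized source query for the Data Flow task.
-- The ? placeholder would be mapped to a package variable holding the
-- batch GUID (e.g. a hypothetical User::BatchGuid).
SELECT s.CustomerId,
       s.Phone
FROM   dbo.Solicitations AS s
WHERE  s.BatchGuid = ?;
```

With the GUID in a package variable, the output filename from the mapping table can likewise be fed to the Flat File connection manager through an expression on its ConnectionString property, rather than hard-coding "testy.csv".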
My actual question was: how do I pass the GUID from the calling client into the SSIS package?

Update: I found an answer to my question. I created a stored procedure that accepts the GUID as a parameter; within the proc, I create, add, and kick off a SQL Agent job that executes the SSIS package as its only step, passing the GUID into the command-line args of the command property of sp_add_jobstep. Right now, the only thing in my way is that the server was restarted last night and the SQL Agent wasn't part of the restart... so I'm waiting for server support to get back from lunch so that I can test out my implementation. If anyone has any questions about my approach, I'd be more than happy to share.
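The "wrap the package in a one-shot Agent job" approach above could be sketched roughly like this. The package path and variable name are illustrative assumptions, and this omits error handling:

```sql
-- Hedged sketch: create, run, and auto-delete a one-shot Agent job
-- that passes the batch GUID to the SSIS package. Names are assumptions.
CREATE PROCEDURE dbo.usp_ExportBatch
    @BatchGuid uniqueidentifier
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @JobName sysname = N'ExportBatch_' + CAST(@BatchGuid AS nvarchar(36)),
            @Cmd     nvarchar(max);

    -- dtexec-style arguments: /SET pushes the GUID into a package variable
    SET @Cmd = N'/FILE "C:\SSIS\ExportSolicitations.dtsx" '
             + N'/SET "\Package.Variables[User::BatchGuid].Properties[Value]";"'
             + CAST(@BatchGuid AS nvarchar(36)) + N'"';

    -- @delete_level = 3: Agent drops the job after it runs,
    -- so these one-shot jobs don't pile up in msdb
    EXEC msdb.dbo.sp_add_job @job_name = @JobName, @delete_level = 3;

    EXEC msdb.dbo.sp_add_jobstep
         @job_name  = @JobName,
         @step_name = N'Run export package',
         @subsystem = N'SSIS',
         @command   = @Cmd;

    -- No @server_name: defaults to the local server
    EXEC msdb.dbo.sp_add_jobserver @job_name = @JobName;

    -- Fire-and-forget; requires the SQL Agent service to be running
    EXEC msdb.dbo.sp_start_job @job_name = @JobName;
END
```

One design note: sp_start_job returns as soon as the job is queued, so the client gets control back immediately; the trade-off is that the export fails silently if the Agent service is down, which is exactly the situation described above.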
"I need build Skynet. Plz send code"