DTS

kibromg 0 wrote (#1):

Hi all, I am trying to fetch a CSV file from an FTP site into a SQL table every three hours. The CSV has the same fields as the table in SQL. How do I go about this problem? Your ideas are most appreciated. Please advise. Thank you


Mark J Miller wrote (#2):

If you're using 2005, search Google for "SSIS FTP Task". If you're using an earlier version of SQL Server, search Google for "DTS FTP Task". If you run into any problems, let us know what your specific problem is and what the specific symptoms are, and we'll give you a hand.

Mark's blog: developMENTALmadness.blogspot.com
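One common way to get the three-hour cadence asked about above (the thread itself doesn't name a scheduler) is to run the package from a SQL Server Agent job. A minimal sketch, assuming the package is already wrapped in an Agent job; the job name 'LoadCsvFromFtp' is a hypothetical placeholder:

    -- Schedule an existing Agent job (hypothetical name) to run every 3 hours.
    USE msdb;
    GO
    EXEC dbo.sp_add_jobschedule
        @job_name             = N'LoadCsvFromFtp', -- hypothetical job name
        @name                 = N'Every3Hours',
        @freq_type            = 4,  -- daily
        @freq_interval        = 1,  -- every day
        @freq_subday_type     = 8,  -- repeat on an hourly interval
        @freq_subday_interval = 3,  -- every 3 hours
        @active_start_time    = 0;  -- starting at midnight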


kibromg 0 wrote (#3):

Thank you Mark, I have managed to copy the file with the FTP task to the local PC where SQL Server is running. Now I would like to load the data into my table with the Bulk Insert task. How will I make sure that every 30 minutes it picks up a new file from the FTP site and inserts it into my table in SQL 2005, with no duplicate data entered into the table more than once? As I am dumping the files into one folder, how will the Bulk Insert task remember that this file has been used and the other one is still to be used? Thank you


Mark J Miller wrote (#4):

Create a data flow task. Use the flat file transform as your source and either the SQL Server destination or the OLE DB destination transform. I personally don't like the SQL Server destination because it seems buggy to me. If you want to prevent duplicate data you have a couple of options:

1) Create an empty loading table you can fill using the data flow task. Then use a SQL command to copy the data from the loading table into your target for all records which don't exist in the target. I recommend that you include a step to truncate your loading table before loading it with data. This option is probably the easiest if you're not familiar with SSIS, and it will probably perform best if you are working with a large dataset (a sketch of the SQL follows this post).

2) Add a lookup transform to your data flow task to match existing records from your target table with records in your flat file source. Then redirect any errors (i.e. records which don't match, and thus don't already exist in your target) to your destination transform.

3) Use two source transforms, one for your flat file source and one for your target table. Then use a merge transform to filter out any matches and return only the records which don't have a match. Then pass the results to your destination transform.

Options 2 and 3 may not be viable if your target table is large, unless you have a way to filter the data for the lookup/merge, for example by a date range or by some foreign key, based on your knowledge of the data in your flat file source. Hope this helps.

Mark's blog: developMENTALmadness.blogspot.com
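A minimal sketch of option 1 in T-SQL, assuming the CSV is bulk loaded into a loading table first; dbo.Target, dbo.Target_Load, the Id key column, the other column names, and the file path are all hypothetical placeholders:

    -- Option 1 sketch: truncate the loading table, bulk load the CSV,
    -- then copy across only the rows that are not already in the target.
    TRUNCATE TABLE dbo.Target_Load;

    BULK INSERT dbo.Target_Load
    FROM 'C:\ftp\incoming\20080430_11.csv'  -- path the FTP task downloaded to
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

    INSERT INTO dbo.Target (Id, Col1, Col2)
    SELECT s.Id, s.Col1, s.Col2
    FROM dbo.Target_Load AS s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Target AS t WHERE t.Id = s.Id);

In an SSIS package, the truncate and the final INSERT would typically sit in Execute SQL tasks on either side of the data flow (or Bulk Insert) task.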


kibromg 0 wrote (#5):

Your help is much appreciated. What I want now, Mark, is how to select only the files that have not yet been picked up from the FTP site and load just those. That is to say, for instance, you have a file named 20080430_11.csv generated on the FTP site at 11 o'clock today, and 20080430_12.csv for the 12 o'clock run. How will the FTP task remember that it has copied one file but not the other? Thank you once again. Regards

modified on Wednesday, April 30, 2008 11:53 AM


Mark J Miller wrote (#6):

You could use another FTP task to delete the file after you're done. Or you could try to rename it. I don't know if that is directly supported, but if you have permissions you could possibly rename your local copy when you're done, then upload it back to the FTP server and delete the original copy from the server. If you don't have permissions on the FTP server, you could keep a local list of files you've already processed and use that to determine which files need to be downloaded and processed (see the sketch below).

Mark's blog: developMENTALmadness.blogspot.com
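A minimal sketch of that local list kept as a tracking table; dbo.ProcessedFiles and its columns are hypothetical placeholders. The package would check the table before downloading a file and record the file after a successful load:

    -- One-time setup: a log of files that have already been loaded.
    CREATE TABLE dbo.ProcessedFiles (
        FileName    nvarchar(260) NOT NULL PRIMARY KEY,
        ProcessedAt datetime      NOT NULL DEFAULT (GETDATE())
    );

    -- Before downloading: returns 0 if the file still needs processing.
    SELECT COUNT(*) FROM dbo.ProcessedFiles WHERE FileName = N'20080430_12.csv';

    -- After a successful load: remember the file so it is skipped next time.
    INSERT INTO dbo.ProcessedFiles (FileName) VALUES (N'20080430_12.csv');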
