#1: adamreha (Registered User, Devshed Newbie; joined Aug 2013; 10 posts)
SSIS foreach loop with ADO enumerator (from one table to another, 10 records at a time)


I have a source table and a destination table in a database, and I need to load the data from the first table into the second using a foreach loop. Normally rows are processed one at a time, is that right? But I need the loop to repeat after every ten records are loaded. The reason for the loop: I want an overview of the progress of the data load.

What I have done so far:
- An Execute SQL Task that selects all of the records from the source table and saves the full result set into a variable of type Object.

- A Foreach Loop Container, dragged and dropped, set to the Foreach ADO enumerator with the object variable assigned.

I don't know what to do next.
Last edited by adamreha; August 20th, 2013 at 02:49 AM. Reason: title was too long
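One way to make this plan loop per batch instead of per row is to have the Execute SQL Task return batch keys rather than raw rows. A minimal T-SQL sketch, assuming a hypothetical source table dbo.SourceTable with an Id key column (none of these names come from the thread):

```sql
-- Query for the Execute SQL Task (ResultSet = Full result set,
-- stored in the Object variable). Each returned row is one batch
-- number, so the Foreach ADO enumerator iterates once per batch
-- of 10 rows instead of once per source row.
WITH numbered AS (
    SELECT Id,
           (ROW_NUMBER() OVER (ORDER BY Id) - 1) / 10 AS BatchNo
    FROM dbo.SourceTable
)
SELECT DISTINCT BatchNo
FROM numbered
ORDER BY BatchNo;
```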
#2: adamreha (Registered User, Devshed Newbie; joined Aug 2013; 10 posts)
Could someone please help me?
#3: gk53 (Contributing User, Devshed Newbie; joined Oct 2012; 101 posts)
May I ask why you want to use a Foreach Loop and not just a Data Flow Task? If all you need is to load data from one table to another, just set up a Data Flow Task with a data source and a data destination, configure them, and if you need any data transformation, add it between the source and the destination.
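For comparison, the plain Data Flow Task described here amounts to a single set-based copy. A rough T-SQL analogue, again using the hypothetical dbo.SourceTable and dbo.DestTable names:

```sql
-- Set-based equivalent of a simple Data Flow Task
-- (OLE DB Source -> OLE DB Destination), no batching involved.
INSERT INTO dbo.DestTable (Id, Col1, Col2)
SELECT Id, Col1, Col2
FROM dbo.SourceTable;
```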
#4: adamreha (Registered User, Devshed Newbie; joined Aug 2013; 10 posts)
Originally Posted by gk53
May I ask why you want to use a Foreach Loop and not just a Data Flow Task? If all you need is to load data from one table to another, just set up a Data Flow Task with a data source and a data destination, configure them, and if you need any data transformation, add it between the source and the destination.
That is not exactly what I need. I want oversight of the transferred data. The previous solution was based on MERGE, but that loads all the data without any control or oversight. In case a huge amount of data is transferred, I want to control the loading process (using output variables). I know that may be a little confusing. One idea that came to mind is to split the data from the database (which the SQL Task saves into a variable) into parts of 10 records each, and then run the loop once for each part, as in the sketch below.

Simply put: I need the loop to run after every 10 rows (not after every single one) and to record the execution of the process.

Thanks in advance.
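Following up on the batch-key sketch under post #1: inside the Foreach Loop, a second Execute SQL Task or Data Flow source could fetch only the rows of the current part, with the loop's current BatchNo value mapped in as a parameter. Still a sketch on the same hypothetical names:

```sql
-- Runs once per loop iteration. The ? placeholder is the package's
-- BatchNo variable, mapped as an OLE DB parameter on the task.
WITH numbered AS (
    SELECT Id, Col1, Col2,
           (ROW_NUMBER() OVER (ORDER BY Id) - 1) / 10 AS BatchNo
    FROM dbo.SourceTable
)
SELECT Id, Col1, Col2
FROM numbered
WHERE BatchNo = ?;
```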
#5: gk53 (Contributing User, Devshed Newbie; joined Oct 2012; 101 posts)
You do not need that. You can put all your validation logic between the data source and the data destination. I have written a number of packages that load 500,000,000 records, and all the validation runs in the data flow; the system will manage it. You just need to add steps to the loading process (something like a Derived Column, where you can change a column size or remove some characters; a script transformation task; a Data Conversion component; counters; a Conditional Split to route rows with data errors to an error output; etc.).
Look into that. Everything is already there. The Foreach Loop Container was designed for a different purpose: it is not for each record, it is for each value in a variable.
    Last edited by gk53; August 21st, 2013 at 01:50 PM.
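As an illustration of the Conditional Split idea, here is a hedged set-based analogue: valid rows go to the destination, invalid rows to an error table. The validation rule, the 50-character width, and the dbo.LoadErrors table are all made up for the example:

```sql
-- Rows passing validation go to the destination
-- (LEFT() plays the role of a Derived Column resizing the value)...
INSERT INTO dbo.DestTable (Id, Col1)
SELECT Id, LEFT(Col1, 50)
FROM dbo.SourceTable
WHERE Col1 IS NOT NULL;

-- ...while rows failing validation are routed to an error table,
-- much as a Conditional Split routes them to an error output.
INSERT INTO dbo.LoadErrors (Id, Reason)
SELECT Id, 'Col1 is NULL'
FROM dbo.SourceTable
WHERE Col1 IS NULL;
```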
#6: adamreha (Registered User, Devshed Newbie; joined Aug 2013; 10 posts)
Originally Posted by gk53
You do not need that. You can put all your validation logic between the data source and the data destination. I have written a number of packages that load 500,000,000 records, and all the validation runs in the data flow; the system will manage it. You just need to add steps to the loading process (something like a Derived Column, where you can change a column size or remove some characters; a script transformation task; a Data Conversion component; counters; a Conditional Split to route rows with data errors to an error output; etc.).
Look into that. Everything is already there. The Foreach Loop Container was designed for a different purpose: it is not for each record, it is for each value in a variable.
Yes, I know. It is better suited to, for example, several data sources with the same structure and one destination, where the loop handles the transfer from each source to the destination, one iteration per source. I had already solved this the same way you describe, but the team leader gave his opinion and his idea, and I am only the junior here.
#7: gk53 (Contributing User, Devshed Newbie; joined Oct 2012; 101 posts)
But the way you are trying to set it up, reading all the data into a variable and doing all the manipulation afterwards, may work only for small tables; otherwise it will overwhelm the server (the computer where the SSIS package runs), and even if it does run, it will take a long time and a lot of resources. (With a direct load, validation, and transformation, I loaded 300,000 records from a flat file into a SQL table in 40 seconds; done your way, it would take hours.)
In this case you should read, as you said, 10 records, do whatever you need with them, save them into the destination, and then read the next 10 records that were not in the first batch. I do not know whether you can join the source and destination tables to build the query that reads the data; if not, you need an additional table on your data source where you can track which records have already been processed.
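A sketch of the "next 10 unprocessed rows" read described here, using the join-to-destination variant (assuming the two tables share the hypothetical Id key; with no shared key, the same NOT EXISTS test would run against the tracking table instead):

```sql
-- Fetch the next batch of 10 source rows that are not yet
-- in the destination; run again until no rows come back.
SELECT TOP (10) s.Id, s.Col1, s.Col2
FROM dbo.SourceTable AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.DestTable AS d WHERE d.Id = s.Id)
ORDER BY s.Id;
```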
#8: adamreha (Registered User, Devshed Newbie; joined Aug 2013; 10 posts)
Originally Posted by gk53
But the way you are trying to set it up, reading all the data into a variable and doing all the manipulation afterwards, may work only for small tables; otherwise it will overwhelm the server (the computer where the SSIS package runs), and even if it does run, it will take a long time and a lot of resources. (With a direct load, validation, and transformation, I loaded 300,000 records from a flat file into a SQL table in 40 seconds; done your way, it would take hours.)
In this case you should read, as you said, 10 records, do whatever you need with them, save them into the destination, and then read the next 10 records that were not in the first batch. I do not know whether you can join the source and destination tables to build the query that reads the data; if not, you need an additional table on your data source where you can track which records have already been processed.
Yes, exactly! I convinced my team leader that the loop solution is useless, gave him my MERGE solution, and it was accepted.
So, while we are at it, maybe it could also be done with a table holding all the data and a task that always takes the top 10 rows, deletes them from the source table after loading them, and then takes the next top 10, and so on until the end.
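That "take the top 10, then delete them from the source" idea maps naturally onto T-SQL's DELETE ... OUTPUT, which removes rows and hands them to the destination in one atomic statement. A sketch with the same hypothetical names:

```sql
-- Atomically move the next 10 rows: DELETE removes them from the
-- source and OUTPUT inserts the removed rows into the destination
-- in the same statement. Repeat until @@ROWCOUNT returns 0.
DELETE TOP (10)
FROM dbo.SourceTable
OUTPUT DELETED.Id, DELETED.Col1, DELETED.Col2
INTO dbo.DestTable (Id, Col1, Col2);
```

Note that TOP without ORDER BY removes an arbitrary 10 rows, and OUTPUT ... INTO requires the target table to have no enabled triggers and to sit on neither side of a foreign-key relationship.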
