Hi - I am trying to extract data from some extremely large tables (tens of millions of rows) that are too large for EM to handle in one pass, so I am trying to process the tables in chunks. My issue is that I don't have an element that works efficiently enough. My current plan is something like this:
- Get the row count of the table.
- Calculate the number of iterations needed for efficient processing (currently about 1M rows per iteration): round((row_count/1000000) + 1, 0). For example, a table with just over 1M rows would come out to 2 iterations.
- Iterate against the table that number of times. Each iteration's SQL WHERE clause uses a modulus calculation to determine which batch of rows to select (the first iteration would grab 1M rows, the next would grab the rest).
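To make the batching idea concrete, here is a minimal Python sketch of the same logic against an in-memory SQLite table. The table name `big_table`, the key column `id`, and the small chunk size are all stand-ins for demonstration (the real job would use the actual table and ~1M rows per chunk). Note that with a modulus predicate each batch comes out roughly equal in size (row_count / iterations) rather than one full chunk followed by the remainder:

```python
import math
import sqlite3

CHUNK_SIZE = 1_000  # rows per iteration; the real job would use ~1_000_000

# Small in-memory stand-in for the real multi-million-row table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big_table (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO big_table VALUES (?, ?)",
                 [(i, f"row {i}") for i in range(1, 2_501)])

# Step 1: get the row count
row_count = conn.execute("SELECT COUNT(*) FROM big_table").fetchone()[0]

# Step 2: number of iterations; ceiling division avoids the off-by-one that
# round(row_count/chunk + 1, 0) produces when row_count is an exact multiple
iterations = math.ceil(row_count / CHUNK_SIZE)

# Step 3: iterate, selecting each batch with a modulus predicate on the key
total = 0
for i in range(iterations):
    batch = conn.execute(
        "SELECT id, payload FROM big_table WHERE id % ? = ?",
        (iterations, i),
    ).fetchall()
    total += len(batch)  # process the batch here

assert total == row_count  # every row lands in exactly one batch
```

One caveat with this scheme: `id % iterations` only distributes rows evenly if the key is a dense integer. If the key has gaps or is non-numeric, a range predicate (`WHERE id > last_seen ORDER BY id LIMIT chunk`) may be a safer batching strategy.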
What I am struggling with is how to define the loop: do I use REPEAT or something else?