Very Large Datasets - How do you do it without Iterations?

Hi all,

I am facing a situation where I have to process tables in excess of 500 million rows. I have managed to do this with iterations for smaller tables, but at this scale it is tough to impossible.

  1. Is it far-fetched to ask for functionality that keeps the data in the database (e.g. Snowflake) and runs EasyMorph transformations on the Snowflake engine (ELT), even if only for a few core actions?

  2. Does anyone know of a tool with transformation capabilities similar to EasyMorph, but for large data?
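For context on what I mean by ELT in point 1: instead of pulling 500M rows into the client, the transformation SQL would run inside Snowflake itself. A minimal sketch of that idea in Python is below; the table names, deduplication rule, and helper function are made up for illustration, not an existing EasyMorph feature.

```python
# Hypothetical ELT push-down sketch: build SQL that Snowflake executes
# server-side, so the 500M rows never leave the warehouse.
# SOURCE/TARGET table names and the dedup logic are assumptions.

def build_elt_sql(source: str, target: str, key: str, order_col: str) -> str:
    """Build a CREATE TABLE AS SELECT that deduplicates `source`
    inside Snowflake, keeping the latest row per `key`
    (QUALIFY is Snowflake-specific syntax)."""
    return (
        f"CREATE OR REPLACE TABLE {target} AS "
        f"SELECT * FROM {source} "
        f"QUALIFY ROW_NUMBER() OVER "
        f"(PARTITION BY {key} ORDER BY {order_col} DESC) = 1"
    )

sql = build_elt_sql("RAW.EVENTS", "CLEAN.EVENTS", "event_id", "loaded_at")

# With snowflake-connector-python this statement would be executed
# remotely, e.g.:
#   import snowflake.connector
#   with snowflake.connector.connect(...) as conn:
#       conn.cursor().execute(sql)
```

The point is that only the short SQL string crosses the wire; all row-level work happens on the Snowflake engine.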

Thanks in advance for any pointers or tips!