On the EasyMorph website it is stated that:
- It is advised to have >16GB of RAM if your typical datasets exceed 1 billion data points (i.e. rows x columns).
I am working on a dataset of 1.7 million records and about 80 columns in a rather complex ETL (many transformations, possibly more than 100).
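For context, a quick back-of-envelope calculation (using the "rows x columns" definition of data points from the quote above) suggests my dataset is well below the stated 1 billion threshold:

```python
# Rough data-point count for my dataset, per EasyMorph's definition
rows = 1_700_000   # 1.7 million records
cols = 80          # approximate column count
data_points = rows * cols

print(data_points)                 # 136,000,000
print(data_points < 1_000_000_000) # True -- well under the 1 billion guideline
```

So at roughly 136 million data points, the dataset itself should not be near the limit, which makes the memory usage and run time surprising to me.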
I have the impression that it is using about 65% of RAM (on a 16 GB machine), and a full run takes more than 5 minutes in EasyMorph Desktop. This can be annoying when a recalculation is necessary because I inserted a transformation near the beginning of the flow.
Do I need more RAM to speed this up? How can I make it more efficient?