Best Practice for loading a huge file

Wonderful!

BTW, starting from version 4.0 it will be possible to save loaded data right in EasyMorph projects, so that when you open a project, its starting transformations already contain the last loaded data.

Does this happen automatically, or does it need to be configured somewhere? I'm asking because I have a project that loads a CSV containing 6.7 million rows by 330 columns. When I close the project and re-open it, it runs for several minutes before displaying the data. This makes me think it's re-loading the data from the CSV each time instead of using the data stored in the project.

We decided not to store data in EasyMorph projects. Instead, we provide two options:

  1. Store data in the native EasyMorph format as .DSET files. EasyMorph reads and writes files in the .DSET format very quickly, even when they contain millions of rows. A .DSET file can be created using the “Export to dataset” action, or simply by right-clicking an action and choosing “Send to file”.

    Importing a .DSET file can be done using the “Import dataset” action, so the slow CSV only needs to be parsed once (a rough code analogy of this export/import caching pattern is sketched after this list).

  2. Store datasets in EasyMorph Server and hot-link them in EasyMorph projects so they import datasets from the Server on opening. This is a more secure option, but it requires setting up EasyMorph Server. More details here: Server Link explained
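
To illustrate the idea behind option 1 in general terms, here is a rough analogy in Python (not EasyMorph functionality; the file names are hypothetical and Parquet stands in for the .DSET format): parse the slow text file once, save it in a fast binary format, and load the binary copy on later opens.

```python
# Analogy only, not EasyMorph code: cache an expensive CSV parse into a fast
# binary format, then reuse the cached copy on subsequent opens.
import os
import pandas as pd  # Parquet support requires pyarrow or fastparquet

CSV_PATH = "big_export.csv"        # hypothetical source file (slow to parse)
CACHE_PATH = "big_export.parquet"  # hypothetical cached copy (fast to load)

def load_data() -> pd.DataFrame:
    # Reuse the cached binary file when it exists and is newer than the CSV,
    # so the multi-minute CSV parse happens only once.
    if os.path.exists(CACHE_PATH) and \
            os.path.getmtime(CACHE_PATH) >= os.path.getmtime(CSV_PATH):
        return pd.read_parquet(CACHE_PATH)
    df = pd.read_csv(CSV_PATH)
    df.to_parquet(CACHE_PATH)
    return df

if __name__ == "__main__":
    df = load_data()
    print(f"Loaded {len(df)} rows x {len(df.columns)} columns")
```

In EasyMorph the same effect is achieved without any scripting: export the loaded data once with “Export to dataset” (or “Send to file”), then have the project load the .DSET file with “Import dataset” instead of re-reading the CSV.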