EasyMorph Memory Capacity

Hello, good morning everyone.
Could you confirm what capacity EasyMorph has when working with its internal memory?
I'm referring to uploading a CSV file: how many megabytes can it hold?


Dmitry/team may chime in to fine-tune this response, but my belief is that EasyMorph, working in real-time in RAM, is limited only by the capacity of the system you’re running it on.

If it’s a massive CSV file and it’s eating up a ton of memory, try converting it to the more efficient .dset file (EasyMorph’s native file format) and reading that in instead. If the CSV is too large or cumbersome to read all at once, try splitting it, reading each piece in and converting it to .dset, then loading all of the .dsets and merging them into a single, large .dset.
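To make the split-before-converting step concrete outside of EasyMorph, here's a rough Python sketch of splitting a big CSV into smaller pieces with the header repeated in each (the chunk size and output file names are just assumptions; inside EasyMorph you'd use its own actions for this, and the .dset conversion itself would still happen in EasyMorph):

```python
import csv
import itertools

def split_csv(path, rows_per_chunk=1_000_000):
    """Split a large CSV into smaller files, repeating the header in each piece.

    Illustrative sketch only -- chunk size and output naming are assumptions.
    Yields the path of each piece as it is written.
    """
    with open(path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)
        for i in itertools.count():
            # Read at most rows_per_chunk rows without loading the whole file
            chunk = list(itertools.islice(reader, rows_per_chunk))
            if not chunk:
                break
            out_path = f"{path}.part{i:03d}.csv"
            with open(out_path, "w", newline="", encoding="utf-8") as dst:
                writer = csv.writer(dst)
                writer.writerow(header)
                writer.writerows(chunk)
            yield out_path
```

Each piece can then be loaded and saved as a .dset in EasyMorph, one at a time.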

Totally my opinion, as I understand how EasyMorph works with RAM.

Could you confirm whether I can load 100 million records from a CSV file?

And if I query it, will the records be visible in the preview view?

Hi, Serch1,

While I’m not 100% sure what you’re asking, I’ll do my best to answer.

The resources required to load would also depend on the number of columns/fields in addition to rows (100M rows x 10 columns requires much less than 100M rows x 1,000 columns).

Personally, I would try to load it as-is. It might take a little while. While it’s loading, open Task Manager (CTRL+ALT+DEL, “Task Manager”, and make sure you’re on the Processes tab). Find the EasyMorph process in the first column and check the “Memory” column to see how much RAM it’s using. I keep an eye on the large Memory percentage at the top. If that creeps up over 80-90% and is still climbing, the file is probably too large to load at one time, and you should split it. Otherwise, your system will perform extremely poorly (if at all).

In that case, convert the single file (or the split files) into the .dset format and load it that way. Compare how much RAM is being used and whether you’re still able to work with it. If it’s still quite large and sluggish to work with, try cleaning out as many records as you can (any “junk” data you wouldn’t need or use), resave it as a new, smaller .dset file, and continue working with that.
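The "clean out junk records before resaving" step can be sketched as a streaming CSV filter. This is just an analogy for what you'd do with EasyMorph's filtering actions; the particular "junk" rule shown (dropping rows with an empty key column) is a made-up example:

```python
import csv

def filter_csv(src_path, dst_path, keep_row):
    """Stream a CSV, writing only rows that pass keep_row(row_dict).

    Sketch of "remove junk records, resave smaller" -- the keep_row
    predicate is whatever junk rule applies to your data.
    Returns the number of rows kept.
    """
    with open(src_path, newline="", encoding="utf-8") as src, \
         open(dst_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        kept = 0
        for row in reader:
            if keep_row(row):
                writer.writerow(row)
                kept += 1
        return kept
```

For example, `filter_csv("in.csv", "out.csv", lambda r: r["id"].strip() != "")` would drop rows with a blank `id` (a hypothetical column name).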

As for the “preview view”: the results of changes/transformations to the data happen, and are visible, in real time within the app.

Does that answer your questions?

EasyMorph compresses CSV data on the fly, so it can typically load CSV files that are bigger than the available RAM. However, compression effectiveness depends heavily on the loaded data: its cardinality, data types, etc. A very rough rule of thumb is that 1 GB of RAM is required per 1 million rows of CSV data; that gives you a ballpark idea. Memory consumption can be estimated more precisely by loading a subset of the data (or all of it) and using the Windows Task Manager, as Craig suggested.
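Turning that rule of thumb into quick arithmetic: the 100 million rows asked about earlier would come out at roughly 100 GB of RAM by this estimate (remember it's only a ballpark, and actual usage depends on column count and data as described above):

```python
def estimate_ram_gb(rows, gb_per_million=1.0):
    """Ballpark RAM estimate using the rough ~1 GB per 1 million CSV rows rule
    of thumb from this thread. Not a measured figure."""
    return rows / 1_000_000 * gb_per_million

print(estimate_ram_gb(100_000_000))  # 100.0 -> ~100 GB for 100M rows
print(estimate_ram_gb(5_000_000))    # 5.0   -> ~5 GB for 5M rows
```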

Very large CSV datasets that don’t fit in RAM even compressed can still be processed conveniently in EasyMorph: split big CSV files into smaller files using the “Split delimited file” action, and use iterations to process each of them in a loop.
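As a rough outside-of-EasyMorph analogy for that split-then-iterate pattern, here's a loop that processes split files one at a time, so only one piece plus the running totals ever sit in memory (the file-name pattern and the row-count aggregation are illustrative assumptions, not anything EasyMorph-specific):

```python
import csv
import glob

def process_in_pieces(pattern):
    """Process split CSV files one at a time, keeping only running totals
    in memory -- an analogy for EasyMorph's iterate-over-files pattern.

    The glob pattern and the row-count aggregation are illustrative
    assumptions; substitute whatever per-piece processing you need.
    """
    total_rows = 0
    for part in sorted(glob.glob(pattern)):
        with open(part, newline="", encoding="utf-8") as f:
            reader = csv.reader(f)
            next(reader, None)           # skip the repeated header row
            total_rows += sum(1 for _ in reader)
    return total_rows
```

In EasyMorph itself this corresponds to a module that handles one file, called in a loop via iterations over the list of split files.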