About EasyMorph Tutorials & Examples Web-help


We’re experimenting with additional in-memory data compression. The first test results look promising. On average, we see RAM footprint reduced by a factor of 3 at the cost of slowing down calculations by only 10-15%. And we’re not done with optimizations yet.

If things go well, we will be adding an option for enabling super-compression in v4.2.1.


Here is an example of the super-compression in action. The chart shows RAM consumption with and without the super-compression when loading a relatively large CSV file (445K rows, 84 columns). The total load time hasn’t changed.
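The post doesn't describe the algorithm, but the trade-off (several-fold smaller RAM footprint for a small CPU cost) is typical of columnar techniques such as dictionary encoding. The sketch below is purely illustrative, not EasyMorph's actual implementation: it stores a repetitive column once as raw Python strings, and once as a small list of unique values plus one integer code per cell, then compares the container sizes.

```python
# Illustrative sketch (NOT EasyMorph's actual algorithm): dictionary
# encoding, a common in-memory compression technique for columnar data.
# A column with many repeated values is stored as a list of unique values
# plus an array of integer codes, which can cut RAM several-fold.
import sys
import array

# A column of 100,000 cells drawn from only 50 distinct strings.
values = [f"category_{i % 50}" for i in range(100_000)]

# Plain storage: one Python object reference per cell (list container only).
plain_bytes = sys.getsizeof(values)

# Dictionary-encoded storage: unique values + one byte-sized code per cell.
uniques = sorted(set(values))
code_of = {v: i for i, v in enumerate(uniques)}
codes = array.array("B", (code_of[v] for v in values))  # 1 byte per cell
encoded_bytes = sys.getsizeof(codes) + sys.getsizeof(uniques)

print(f"plain:   {plain_bytes:,} bytes")
print(f"encoded: {encoded_bytes:,} bytes")
print(f"ratio:   {plain_bytes / encoded_bytes:.1f}x")
```

Decoding is a plain lookup (`uniques[code]`), which is why the extra CPU cost of such schemes tends to be small relative to the memory saved.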



Here is what super-compression looks like two weeks before release:

A test project that generates 10 million random values, converts them to text, saves them to a file, and re-reads it.

Version 4.2.1 (current release), no super-compression, run time 59 s, RAM 4.3GB.

Version 4.2.2 (beta), super-compression is on, run time 37 s, RAM 1.4GB.
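The original test is an EasyMorph project, but the same workload can be sketched in Python for readers who want a feel for what it does. This is a rough analogue, scaled down to 1 million values for illustration; the timings and RAM figures above come from EasyMorph itself, not from this script.

```python
# Rough Python analogue of the test project described above: generate
# random values, convert them to text, save them to a file, and re-read it,
# timing the whole run. Scaled down to 1M values (vs 10M in the original).
import os
import random
import tempfile
import time

N = 1_000_000  # the original EasyMorph test used 10 million values

start = time.perf_counter()

values = [random.random() for _ in range(N)]   # generate random values
lines = [str(v) for v in values]               # convert them to text

path = os.path.join(tempfile.gettempdir(), "supercompression_test.txt")
with open(path, "w") as f:                     # save them to a file
    f.write("\n".join(lines))

with open(path) as f:                          # re-read the file
    reread = f.read().splitlines()

elapsed = time.perf_counter() - start
print(f"{len(reread):,} values round-tripped in {elapsed:.1f} s")
os.remove(path)
```

Workflows like this are I/O-heavy, which is the scenario where the post reports super-compression actually improving run time rather than just reducing RAM.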

Remarks and observations:

  • The bigger the dataset, the better the gains from super-compression
  • In projects with relatively small datasets and many actions, super-compression has only a moderate effect
  • Super-compression is generally stable, but we want to keep it in experimental mode for a couple of releases, just in case. In v4.2.2 it will be ON by default in Launcher, and OFF by default in Desktop and Server.
  • Super-compression has practically no performance overhead. In I/O-heavy workflows (as in the test above) it even improves performance.
  • We intend to make super-compression always on after a couple of releases. However, the final decision will be made after examining field reports.
To learn more about EasyMorph visit easymorph.com.