Limiting data transfers with EasyMorph Server

I am trying to set up a process that lets my users configure and run EasyMorph Server workflows while using as few local resources as possible. The input data to these workflows will always be different, but the final step produces a standard output file. The steps are:

  1. Download raw data files from a portal or receive them via email
  2. Merge the files and run cleanup to produce an intermediate dataset to be mapped
  3. Map field names to the target state and run final validation

Some of these raw files are quite large (10 GB+), which causes EasyMorph Desktop to perform slowly or crash. We have an EasyMorph Server instance set up with 128 GB of RAM, which should allow us to ingest these large datasets. We’re getting stuck on how to give users the flexibility to create these custom workflows server-side without hitting local limitations. Any insights on how to set this up would be great. Thanks!

Hi Will, and welcome to the Community!

You can use the “Split delimited text file” action to split large text files into multiple smaller files. Then process these smaller files using iterations.
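For anyone who wants to see the idea outside EasyMorph, here's a rough Python sketch of what the split step does conceptually: one big delimited file is broken into smaller chunk files, each repeating the header, so no downstream step ever has to load the whole file at once. The file names, chunk size, and helper names are just assumptions for illustration, not EasyMorph settings.

```python
import csv
from pathlib import Path


def split_delimited_file(source: Path, out_dir: Path, rows_per_chunk: int = 500_000) -> list[Path]:
    """Split one large delimited file into smaller chunk files, repeating the header in each."""
    out_dir.mkdir(parents=True, exist_ok=True)
    chunk_paths: list[Path] = []
    with source.open(newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)          # keep the header so every chunk is a valid file on its own
        rows, chunk_no = [], 0
        for row in reader:
            rows.append(row)
            if len(rows) >= rows_per_chunk:
                chunk_no += 1
                chunk_paths.append(_write_chunk(out_dir, source.stem, chunk_no, header, rows))
                rows = []
        if rows:                       # write the final, partially filled chunk
            chunk_no += 1
            chunk_paths.append(_write_chunk(out_dir, source.stem, chunk_no, header, rows))
    return chunk_paths


def _write_chunk(out_dir: Path, stem: str, n: int, header: list[str], rows: list[list[str]]) -> Path:
    path = out_dir / f"{stem}_part{n:04d}.csv"
    with path.open("w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(header)
        writer.writerows(rows)
    return path
```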

To design the workflow on Desktop, just use one or two of the smaller files, then publish the workflow to the Server. It might be a good idea to keep splitting big files and iterating on the Server too, to reduce total memory consumption.
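And here is a similarly hedged sketch of the iteration side of the pattern: each chunk file is transformed on its own and appended to a single output, so peak memory stays roughly the size of one chunk rather than the whole dataset. The `transform_row` function and its field names are made-up placeholders for whatever cleanup and field-name mapping your module actually does.

```python
import csv
from pathlib import Path


def transform_row(row: dict) -> dict:
    """Placeholder for the cleanup / field-name-mapping step (an assumption, not EasyMorph logic)."""
    return {
        "customer_id": row.get("CustID", "").strip(),
        "amount": row.get("Amount", "0"),
    }


def process_chunks(chunk_paths: list[Path], output: Path) -> None:
    """Process each chunk file in turn and append the results to one output file.

    Only one chunk is open at a time, which is what keeps memory bounded.
    """
    fieldnames = ["customer_id", "amount"]
    with output.open("w", newline="", encoding="utf-8") as out:
        writer = csv.DictWriter(out, fieldnames=fieldnames)
        writer.writeheader()
        for chunk in chunk_paths:
            with chunk.open(newline="", encoding="utf-8") as src:
                for row in csv.DictReader(src):
                    writer.writerow(transform_row(row))
```

Calling `split_delimited_file(...)` and then `process_chunks(...)` on its result mimics the split-and-iterate approach described above; in EasyMorph itself the same shape is achieved with the "Split delimited text file" action plus an iteration over the resulting files.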