I am trying to set up a process that lets my users configure and run EasyMorph Server workflows while using as few local resources as possible. The input data to these workflows will always be different, but the final step produces a standard output file. The steps are:
- Download raw data files from a portal or receive via email
- Merge the files and run cleanup to produce an intermediate dataset to be mapped
- Map field names to target state and run final validation
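One way to move a pipeline like this server-side is to keep the EasyMorph projects on the server and trigger them as Server tasks over the REST API, passing each run's input location as a task parameter so the varying raw files never touch a desktop machine. Below is a minimal Python sketch of that idea; the endpoint path, task ID, server URL, and parameter name (`input_folder`) are all assumptions for illustration, so check your EasyMorph Server API documentation for the actual routes and authentication:

```python
import json
from urllib import request

SERVER = "http://ems.example.com:6330"  # hypothetical server URL
TASK_ID = "your-task-id"                # hypothetical task ID

def build_start_request(server, task_id, params):
    """Build the URL and JSON body for starting a server task.

    The endpoint path here is an assumption -- consult the EasyMorph
    Server API reference for the real route on your version.
    """
    url = f"{server}/api/v1/tasks/{task_id}/start"
    body = json.dumps({"parameters": params}).encode("utf-8")
    return url, body

def start_task(server, task_id, params):
    """POST the start request; the workflow then runs on the server."""
    url, body = build_start_request(server, task_id, params)
    req = request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:  # executes server-side, not locally
        return resp.status

# Example: point this run at a new batch of raw files via a parameter
# start_task(SERVER, TASK_ID, {"input_folder": r"\\share\raw\2024-06"})
```

The key design point is that the heavy merge/cleanup/mapping happens on the 128GB server; the client only sends a small JSON request per run, with the changing inputs expressed as workflow parameters rather than new projects.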
Some of these raw files are quite large, 10GB+, which causes EasyMorph Desktop to perform slowly or crash. We have an EasyMorph Server instance stood up with 128GB of RAM, which should be enough to ingest these large datasets. We're stuck on how to give users the flexibility to create these custom workflows server-side without hitting local resource limits. Any insights on how to set this up would be great. Thanks!