Try saving into a dataset file (.dset) instead of Excel. Then have another project that just reads the dataset and writes it into Excel.
The reason – the original project keeps some intermediate tables in memory. The resulting table exported into Excel is probably also large, so the XML generated for Excel takes a lot of RAM too and breaks the memory limit. By saving into a dataset file first, the first project will complete, release the intermediate tables, and free up RAM. Writing/reading datasets doesn’t take much RAM as it’s a native format for EasyMorph.
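To make the staging idea concrete, here’s a plain-Python sketch (not EasyMorph syntax – the file names, data, and `pickle` stand-in for the .dset format are all made up for illustration). The point is that the heavy first stage finishes and frees its memory before the expensive export stage begins:

```python
import os
import pickle
import tempfile

def stage_one(path):
    # Stand-in for the heavy project that holds intermediate
    # tables in memory while building the final table.
    table = [{"id": i, "value": i * 2} for i in range(1000)]
    with open(path, "wb") as f:
        pickle.dump(table, f)  # cheap to write: no Excel XML generation
    # When this stage (project) finishes, all its memory is released.

def stage_two(path):
    # Stand-in for the second project: it only loads the saved
    # result and produces the expensive export (Excel in the real case).
    with open(path, "rb") as f:
        table = pickle.load(f)
    return len(table)

path = os.path.join(tempfile.gettempdir(), "staged.pkl")
stage_one(path)
rows = stage_two(path)
```

Only one stage’s working set is ever in memory at a time, which is exactly why splitting into two projects gets you under the limit.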
Another suggestion – process big DB tables in chunks using iterations. For instance, if the original table covers multiple daily periods, pull the list of unique dates first, then iterate and pull only the data for one date at a time. Save each daily result into a dataset file. Once you’ve processed the entire DB table, load the datasets from all the dataset files and export the combined result into Excel.
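The same chunking pattern, sketched in plain Python with an in-memory SQLite table (again, just an illustration of the idea, not EasyMorph syntax – the table name, columns, and per-day `pickle` files are invented for the example):

```python
import os
import pickle
import sqlite3
import tempfile

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("2024-01-01", 10), ("2024-01-01", 5),
     ("2024-01-02", 7), ("2024-01-03", 3)],
)

# 1) Pull the list of unique dates first.
days = [r[0] for r in
        conn.execute("SELECT DISTINCT day FROM sales ORDER BY day")]

# 2) Iterate: query only one day's worth of data at a time and
#    save each daily result to its own file (a .dset in EasyMorph).
tmp = tempfile.mkdtemp()
for day in days:
    rows = conn.execute(
        "SELECT day, SUM(amount) FROM sales WHERE day = ? GROUP BY day",
        (day,),
    ).fetchall()
    with open(os.path.join(tmp, f"{day}.pkl"), "wb") as f:
        pickle.dump(rows, f)

# 3) Once the whole table is processed, load all per-day files
#    and combine them into the final result for export.
result = []
for day in days:
    with open(os.path.join(tmp, f"{day}.pkl"), "rb") as f:
        result.extend(pickle.load(f))
```

At no point does the full source table sit in memory – only one day’s chunk plus the (much smaller) aggregated results.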
Alternatively, get more RAM. It seems like you’ve recently started experiencing more problems with insufficient RAM on your server. Upgrading RAM can be more cost-effective than spending time optimizing projects for a lower memory footprint.