Hello EM community,
I am hoping to get some pointers on dealing with a scenario I have using the Iterate Web Request action.
I have a dataset with poor, if any, change tracking in its tables. I am pulling the records, performing transformations on them, and then writing them to a project management tool called Monday.com via its GraphQL API.
The dataset is about 5,000-6,000 records, and I am finding that on average the API takes approximately 7 seconds to process each request. If I let the process iterate through sequentially, I am looking at essentially an entire business day to write these to the destination.
I have tried splitting the table into smaller tables to get them to write in parallel; I would really love to get this down to about 2 hours altogether. However, I am noticing that if I break the table out into smaller tables like this, EasyMorph will not necessarily execute them in parallel in the way I would expect. I am attaching a screenshot below of what I see while it is processing.
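For context, here is the back-of-the-envelope math behind the 2-hour target (a sketch using assumed round numbers, since my exact record count and per-request latency vary a bit):

```python
import math

# Assumed round numbers: ~5,500 records at ~7 seconds per API call.
records = 5500
seconds_per_call = 7

# Sequential runtime in hours.
sequential_hours = records * seconds_per_call / 3600
print(f"Sequential: {sequential_hours:.1f} hours")  # ~10.7 hours

# Minimum number of parallel streams to finish within a 2-hour window,
# assuming the API can actually sustain that many concurrent requests.
target_hours = 2
streams = math.ceil(records * seconds_per_call / (target_hours * 3600))
print(f"Parallel streams needed: {streams}")  # 6
```

So roughly 6 truly concurrent streams would be needed to hit the 2-hour goal, which is why the lack of real parallelism is the sticking point.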
Can anyone think of any ideas on how I might be able to make this more performant?