Working with large JSON files

Hi,

Here is some background info: I am relatively new to working with JSON, but I have been using Add Data > Import File > Import JSON from file. This normally takes quite a while because the file sizes are so large (some are 500 MB). JSON is the only format in which we can export this data from where it resides.

Here are my questions:
I have a few smaller sample files to work with (20 MB), and even these take a while to load. I saw the community post where you import from a delimited text file and, using the Concatenate text and Parse JSON actions, get a great end result. Should I try to convert my JSON files to .txt? Is that even easy to do?
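For what it's worth, converting JSON to delimited text can also be done outside EasyMorph with a short script. A minimal Python sketch, assuming the file is a top-level JSON array of flat objects (your real files may be nested and would need flattening first):

```python
import csv
import json

def json_to_csv(json_path, csv_path):
    """Convert a JSON array of flat objects into a delimited text file."""
    with open(json_path, encoding="utf-8") as f:
        records = json.load(f)  # loads the whole file into memory
    if not records:
        return
    # Collect every key that appears in any record, in a stable order.
    fieldnames = sorted({key for rec in records for key in rec})
    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)
```

Note that `json.load` reads the entire file into memory, so for the 500 MB files a streaming parser would be a safer bet.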

I then have to perform a number of actions, such as removing unnecessary columns, converting from Unix timestamps, etc. These actions will be the same for every file. Right now we have about 200-300 of these files. Is there a way to create a template in EasyMorph that runs the same actions on different files?
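To illustrate the kind of per-record cleanup described above, here is a rough Python sketch. The column names (`debug_info`, `internal_id`, `timestamp`) are purely hypothetical placeholders, not taken from the actual files:

```python
import datetime

# Hypothetical column names for illustration only.
DROP_COLUMNS = {"debug_info", "internal_id"}
TIME_COLUMN = "timestamp"

def clean_record(rec):
    """Drop unwanted keys and convert a Unix epoch field to ISO 8601 UTC."""
    out = {k: v for k, v in rec.items() if k not in DROP_COLUMNS}
    if TIME_COLUMN in out:
        out[TIME_COLUMN] = datetime.datetime.fromtimestamp(
            out[TIME_COLUMN], tz=datetime.timezone.utc
        ).isoformat()
    return out
```

Because the cleanup is identical for every file, the same function (or the same EasyMorph workflow) can simply be applied to each file in turn.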

Again, I am relatively new to working with JSON files, and I have only a few months of experience with EasyMorph. I might be thinking about this the wrong way, but any suggestions would be helpful. I can provide screenshots or examples as needed, to the best of my ability (without revealing sensitive data).

Thank you in advance.

Hi Jaden,

What exactly is your concern here - is it the long load time for a JSON file? If so, how long does it take to load a big JSON file? Note that if the large files are located on a slow network drive, it can be the network latency that slows down loading. Try moving the files to a local folder first.

You can specify a file name using a parameter. See the tutorial on parameters here: EasyMorph | Parameters.

Also, to process many files using the same workflow, you can arrange a loop using iterations. Iterations use parameters. Read more about iterations here: EasyMorph | Loops and iterations.
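In plain-code terms, the iteration pattern looks roughly like this Python sketch (not EasyMorph itself; `process_one` stands in for whatever your workflow does, and the file path plays the role of the parameter):

```python
import glob
import os

def process_all(folder, process_one):
    """Run the same workflow once per JSON file in a folder.

    Each pass through the loop is one "iteration"; the file path is the
    parameter that changes between iterations.
    """
    for path in sorted(glob.glob(os.path.join(folder, "*.json"))):
        process_one(path)
```

The key point is that the workflow itself is written once; only the file-name parameter changes on each iteration.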

Yes, it takes about 10 minutes for one of our files to load. I am using a VM to log on to where the data source is, so I am not pulling it across a network drive.

I was trying to use parameters, but each JSON file has a different name, so I end up creating a new parameter for every file anyway. That is helpful if something needs to be changed later, but I still have to set everything up each time.

I will try iterations in combination with parameters to see if that helps.

Thanks, Dmitry.