Large JSON Parse

Good afternoon all -

My use case: I need to effectively "sift" through a large amount of Zendesk ticket data. What I've come up with is using their incremental export process to extract the full data set.

In EM we iterate through, passing the next cursor as we go:
https://COMPANYSUBDOMAIN.zendesk.com/api/v2/incremental/tickets/cursor.json
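For illustration, a minimal Python sketch of that loop, assuming the standard cursor-export response fields (`after_cursor`, `end_of_stream`) and API-token auth; the subdomain, credentials, start time, and output directory are placeholders:

```python
import json
import os
import requests

BASE = "https://COMPANYSUBDOMAIN.zendesk.com/api/v2/incremental/tickets/cursor.json"
AUTH = ("agent@example.com/token", "API_TOKEN")  # placeholder credentials
START_TIME = 0  # Unix time to start the export from

os.makedirs("tickets", exist_ok=True)
params = {"start_time": START_TIME}  # first request uses start_time
page = 0
while True:
    resp = requests.get(BASE, params=params, auth=AUTH, timeout=60)
    resp.raise_for_status()  # rate-limit handling/retries omitted for brevity
    body = resp.json()
    # Save each page to its own .json file, as in the workflow above
    with open(f"tickets/page_{page:05d}.json", "w", encoding="utf-8") as f:
        json.dump(body, f)
    if body.get("end_of_stream"):
        break
    params = {"cursor": body["after_cursor"]}  # pass the next cursor
    page += 1
```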

Problem Statement:
Each response file contains up to 1,000 tickets and is ~36 MB as .json. The full data set is at about 1M records at this point.

Typically I save each response as a file and continue the iteration (pretty quick!). Then I attempt to load all of those JSON files into a single .dset file so that I can analyze it (slow).

That being said, fully parsing JSON is very slow when the files are that large. Any tips/tricks for this use case to make it a bit more speedy?

An article I was reading suggested pushing those files into a SQLite database with the json1 extension, but I've not attempted this yet.
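For anyone curious, here is a minimal sketch of that SQLite/json1 idea, assuming the saved pages have the shape `{"tickets": [...]}` and that the bundled SQLite includes the json1 functions (true for most modern Python builds); the table and column names are made up for illustration:

```python
import glob
import sqlite3

con = sqlite3.connect("tickets.db")
con.execute("CREATE TABLE IF NOT EXISTS pages (doc TEXT)")

# Load each saved page as raw text into a one-column table
for path in glob.glob("tickets/page_*.json"):
    with open(path, encoding="utf-8") as f:
        con.execute("INSERT INTO pages VALUES (?)", (f.read(),))
con.commit()

# Flatten the per-page ticket arrays and pull out a couple of fields;
# SQLite does the parsing, so Python never materializes the whole set
rows = con.execute("""
    SELECT json_extract(t.value, '$.id')     AS ticket_id,
           json_extract(t.value, '$.status') AS status
    FROM pages, json_each(pages.doc, '$.tickets') AS t
""").fetchall()
print(len(rows), "tickets extracted")
```

The appeal of this approach is that the JSON is parsed inside the database engine on demand, so the analysis step queries only the fields it needs instead of fully deserializing ~36 MB files.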

There are two ways to parse JSON in EasyMorph:

  • The "Parse JSON" action
  • The "Extract JSON properties" action

The latter sometimes works together with the "Parse JSON array" action.

If "Parse JSON" turns out to be slow, try "Extract JSON properties" instead.