This is more of a best practice and/or EasyMorph approach topic.
Business Workflow
- Read GPS breadcrumb data from a web API source for many vehicles, with each vehicle having many updates
- Transform the data into tables that make sense to the business
a. Header table for last known location
i. One row per asset
b. Detail table for all breadcrumbs consumed
i. One asset, many rows
ii. The breadcrumbs may have been read previously, so ignore any that are the same
iii. The database table has a unique index to prevent duplicates
- The workflow is executed every 15 minutes
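The header/detail pair above can be sketched as a minimal schema. This is only an illustration, shown with SQLite; the table and column names are hypothetical, not the actual schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Header: last known location, one row per asset
    CREATE TABLE asset_location (
        asset_id  TEXT PRIMARY KEY,
        lat       REAL,
        lon       REAL,
        seen_at   TEXT
    );

    -- Detail: every breadcrumb consumed, many rows per asset
    CREATE TABLE gps_breadcrumb (
        asset_id   TEXT NOT NULL,
        event_time TEXT NOT NULL,
        lat        REAL,
        lon        REAL
    );

    -- The unique index that prevents duplicate breadcrumbs
    CREATE UNIQUE INDEX ux_breadcrumb ON gps_breadcrumb (asset_id, event_time);
""")

# The index makes the engine reject a repeated (asset, time) pair outright.
con.execute("INSERT INTO gps_breadcrumb VALUES ('TRK1', '2024-01-01T08:00', 45.0, -75.0)")
try:
    con.execute("INSERT INTO gps_breadcrumb VALUES ('TRK1', '2024-01-01T08:00', 45.0, -75.0)")
    duplicate_blocked = False
except sqlite3.IntegrityError:
    duplicate_blocked = True
print(duplicate_blocked)  # True: the unique index stopped the duplicate
```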
The good news is that I already have this working on the EasyMorph platform. I learned a great deal about how the platform works, but I still have more to learn. Hence the question.
Current Approach
- Read from the API
- Keep/remove columns
- Rename columns
- Rule logic to control the country name
- Calculate columns
a. Mostly to deal with combining fields and adding some constants
- Fix dates to match the database columns
a. Did I mention I hate date logic, for all databases? Laughing
- Update the asset header table with the last known location from the ingest
a. Works perfectly
- For some reason I have to adjust the dates again after the update
- Then iterate through a module (Merge_F_GPS_YARD)
a. This is the very large one; it updates duplicates with just the updated date
b. Then it inserts the new rows
c. To achieve this I used a custom SQL statement, a MERGE INTO statement that goes row by row
It works and it is fast, so great news. Again, I am learning the platform and looking to learn more.
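The MERGE-style behavior in the Merge_F_GPS_YARD module (update only the modified date on a duplicate, insert otherwise) can be sketched as an upsert. This is just an illustration against SQLite's `ON CONFLICT ... DO UPDATE`; the real statement would be `MERGE INTO` in the target database's dialect, and the table and column names here are hypothetical:

```python
import sqlite3

# In-memory stand-in for the breadcrumb detail table (hypothetical schema).
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE gps_breadcrumb (
        asset_id   TEXT NOT NULL,
        event_time TEXT NOT NULL,
        lat        REAL,
        lon        REAL,
        modified   TEXT,
        UNIQUE (asset_id, event_time)   -- blocks duplicate breadcrumbs
    )
""")

def merge_rows(rows, batch_stamp):
    """MERGE-style upsert: insert new breadcrumbs; on a duplicate key,
    refresh only the modified stamp, mirroring the custom SQL approach."""
    con.executemany("""
        INSERT INTO gps_breadcrumb (asset_id, event_time, lat, lon, modified)
        VALUES (?, ?, ?, ?, ?)
        ON CONFLICT (asset_id, event_time)
        DO UPDATE SET modified = excluded.modified
    """, [(a, t, la, lo, batch_stamp) for (a, t, la, lo) in rows])
    con.commit()

merge_rows([("TRK1", "2024-01-01T08:00", 45.0, -75.0)], "batch-1")
merge_rows([("TRK1", "2024-01-01T08:00", 45.0, -75.0),   # duplicate of batch-1
            ("TRK1", "2024-01-01T08:15", 45.1, -75.1)], "batch-2")

rows = con.execute(
    "SELECT asset_id, event_time, modified FROM gps_breadcrumb ORDER BY event_time"
).fetchall()
print(rows)  # duplicate kept once, its modified stamp refreshed to batch-2
```

A set-based `MERGE INTO` like this lets the engine handle match/no-match in one statement instead of row-by-row logic in the workflow.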
The Ask
I created a unique index on the larger table and just wanted to "Export to database" en masse. When the entire input table was NEW, with no rows already in the database, it worked fine. But as soon as one row was a duplicate, I wanted EasyMorph to ignore the error and keep going with the other rows. However, regardless of which flags I flipped, I could not get EasyMorph to insert just the new rows into the table. The whole batch was rejected.
I wanted to use the database engine to just deal with the inserts and have the duplicate rows ignored.
So does the EasyMorph community have any suggestions for the newbie? Remember, the export table has tens of millions of rows, so building a temp table of potential matches is a costly SQL call. The "MERGE INTO" SQL statement achieves the goal: on a match it just updates the modified-row date, and the insert works fine.
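For what it's worth, the whole-batch failure is standard transactional behavior: one constraint violation rolls back the entire INSERT batch. Having the engine skip duplicates needs a conflict clause on the insert itself (`INSERT OR IGNORE` in SQLite, `INSERT IGNORE` in MySQL, `ON CONFLICT DO NOTHING` in PostgreSQL, or the `IGNORE_DUP_KEY` index option in SQL Server). A minimal sketch of the difference, with hypothetical names, using SQLite:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE gps_breadcrumb (
        asset_id   TEXT NOT NULL,
        event_time TEXT NOT NULL,
        UNIQUE (asset_id, event_time)
    )
""")
con.execute("INSERT INTO gps_breadcrumb VALUES ('TRK1', '2024-01-01T08:00')")
con.commit()

batch = [("TRK1", "2024-01-01T08:00"),  # already in the table
         ("TRK1", "2024-01-01T08:15"),
         ("TRK2", "2024-01-01T08:00")]

# Plain INSERT: one duplicate poisons the whole batch and it rolls back,
# which matches the behavior described above for "Export to database".
try:
    with con:
        con.executemany("INSERT INTO gps_breadcrumb VALUES (?, ?)", batch)
except sqlite3.IntegrityError:
    pass  # transaction rolled back; none of the three rows landed

# INSERT OR IGNORE: the engine skips the duplicate and keeps the new rows.
with con:
    con.executemany("INSERT OR IGNORE INTO gps_breadcrumb VALUES (?, ?)", batch)

count = con.execute("SELECT COUNT(*) FROM gps_breadcrumb").fetchone()[0]
print(count)  # 3: the original row plus the two genuinely new ones
```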
The net/net is that I have it working, but I would love to learn a cleaner, more effective approach that leverages the tool more.
So for now, a fool with a tool is still a fool. That fool is currently me.
Thank you in advance.