Filter then unfilter

The issue I’m trying to fix is having multiple duplicate lines in a database table. I want to export this table to 3 different tables in my SQL database. The problem is that I need to filter the table the first time and remove duplicates for table 1, then unfilter the table, move on to table 2, and filter on a different criterion, then unfilter again and write to table 3 without removing duplicates. Any suggestions on how to apply a filter and then unfilter it to get the table back?

I would suggest using derived tables in EasyMorph. In this case:

  1. Load the database table into EasyMorph.
  2. Derive 3 new tables in EasyMorph from the loaded table. Each derived table will automatically replicate data from the source table without extra queries or taking extra memory.
  3. In each derived table, filter and deduplicate as necessary, then export to the respective table in the SQL database (see the sketch below).
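
If it helps to see the pattern outside of EasyMorph, here is a minimal pandas sketch of the same data flow. The connection string, table names, and filter columns (`status`, `region`) are hypothetical placeholders, not something from the original question:

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection and table/column names, for illustration only.
engine = create_engine("postgresql://user:pass@localhost/mydb")

# 1. Load the source table once.
source_df = pd.read_sql_table("raw_data", engine)

# 2. "Derive" three working copies; each starts from the same unfiltered source.
table1 = source_df[source_df["status"] == "active"].drop_duplicates()  # filter + remove duplicates
table2 = source_df[source_df["region"] == "EU"]                        # a different filter criterion
table3 = source_df.copy()                                              # unfiltered, duplicates kept

# 3. Export each result to its own SQL table.
table1.to_sql("sql_table_1", engine, if_exists="append", index=False)
table2.to_sql("sql_table_2", engine, if_exists="append", index=False)
table3.to_sql("sql_table_3", engine, if_exists="append", index=False)
```

The key point is that each derived table starts from the same unfiltered source, so a filter applied in one of them never has to be undone for the next.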

Thanks for your quick reply!!! One more question on this: does each table run in parallel? As I’m working out the logic in EasyMorph, the third table will need to see the results of the 2nd table before it can finish.

So the logic would go as follows (see the sketch after this list)…

  1. derive the original table
  2. merge the table to get an ID from another SQL table
  3. delete the ID found from SQL table A
  4. delete the ID2 and date found from SQL table B
  5. filter the table and remove duplicates
  6. export to SQL table B
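
Translated into a rough pandas sketch, assuming steps 3 and 4 mean dropping rows that are already present in those SQL tables, and with hypothetical table and column names (`key`, `id`, `id2`, `date`):

```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@localhost/mydb")  # hypothetical connection

# 1. Derive the original table (loaded once from the source).
source_df = pd.read_sql_table("raw_data", engine)
derived = source_df.copy()

# 2. Merge to pick up an ID from another SQL table (join column is hypothetical).
lookup = pd.read_sql_table("sql_lookup_table", engine)
derived = derived.merge(lookup[["key", "id"]], on="key", how="left")

# 3. Drop rows whose ID is already in SQL table A.
existing_a = pd.read_sql_table("sql_table_a", engine)
derived = derived[~derived["id"].isin(existing_a["id"])]

# 4. Drop rows whose (id2, date) pair is already in SQL table B.
existing_b = pd.read_sql_table("sql_table_b", engine)
derived = derived.merge(existing_b[["id2", "date"]].drop_duplicates(),
                        on=["id2", "date"], how="left", indicator=True)
derived = derived[derived["_merge"] == "left_only"].drop(columns="_merge")

# 5. Remove duplicates, then 6. export to SQL table B.
derived = derived.drop_duplicates()
derived.to_sql("sql_table_b", engine, if_exists="append", index=False)
```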

But in the other table…

  1. derive the table
  2. look up the ID from the newly exported SQL table B
  3. then export the new data
    *The issue is that the derived table has to be unfiltered at this point… :confused: It’s like a catch-22; it still comes back to unfiltering the table. If these run in sequence, then I should be fine doing the query in the third table, but if I’m reading it correctly, it looks like they run in parallel.

Thanks again!

Yes, see a visual explanation here: Derived tables synchronisation

To synchronize parallel calculations between tables, use the "Synchronize" action. See also this topic: Workflow for loading tables in a database


Thank you so much!!! That’s exactly what I needed. I’m going to test this flow and hope that it runs. I added a Synchronize action to wait for the table to complete before querying that table to get the new ID (primary key), which I’ll then use for the export in the last table.

You’re welcome! :slight_smile:

Final question! :slight_smile: Everything ran great, except that after the synchronization I do a merge with the table (once it completes), hoping to get the latest ID number generated by table 2. Unfortunately, it’s as if the import took a snapshot at first load, so the data never shows the newer information. What’s the best way to invoke an import from a table so that I get the latest data after table 2 runs?

An import action doesn’t always have to be the 1st in a table. You can insert import actions anywhere, after any action. So if you need to re-import data at some point, just press “Add data” on the toolbar, open the “Import” category, and drag the import action into the position where you need to insert it.

Or even simpler, copy the initial import action and paste it where data needs to be re-imported.
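
In code terms, the point is simply that a second read of the same SQL table returns whatever has been written by then, rather than the snapshot taken at first load. A minimal pandas sketch with hypothetical table and column names:

```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@localhost/mydb")  # hypothetical

# First load: a snapshot of SQL Table B before the other branch has exported anything.
snapshot = pd.read_sql_table("sql_table_b", engine)

# The other branch writes new rows in the meantime (with database-generated primary keys).
new_rows = pd.DataFrame({"business_key": [201], "value": ["x"]})   # hypothetical data
new_rows.to_sql("sql_table_b", engine, if_exists="append", index=False)

# Re-import after the write: this read now includes the newly generated rows.
fresh = pd.read_sql_table("sql_table_b", engine)
```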

Hmm, how can I see what order they run in? I have those 3 tables. The first table completes, the second table runs in parallel, and the third runs and waits until the second table is done; at that point the third table needs to refresh the SQL table that was uploaded by the export from table 2. I tried copying the import into table 3, but the derived table disappeared and was replaced with the newly imported table, so it couldn’t continue grabbing data from it. Is there a place somewhere in the design phase where I can ensure that it’s placed in the right loading order? Thank you again!

No derived table should disappear when an action is copied.

It’s hard to understand from a verbal description what’s going on there. Can you post your project, or at least its screenshot?

Thanks so much for responding! Sorry, I’ve been out ill, but I’m back now!

Let’s see if I can write it out; otherwise I’ll try to add a picture of the logic.

Table A contains all the raw data from an XLS file
Derived Tables B, C, and D
Table B edits the data, then writes to SQL Table A
Table C edits the data, then writes to SQL Table B
Table D waits for Table C to complete (it needs a newly created primary key to update its SQL Table C).

So what I did was, in Table D (derived table):
Derive the table
Perform calculations on the derived table
Synchronize (wait for Table C)
Import SQL Table B again
*At this point the derived table disappears from the preview and I see the updated SQL Table B in the preview window.
I have to do a lookup from a derived table column against the re-imported SQL Table B to get the primary key and add it to the derived table as a new column
Then export the new table (the derived table with the looked-up primary key added) to SQL Table C

What’s happening is that as soon as I import that table, the preview disappears and is replaced with the newly imported table in place of derived Table D. So when the lookup happens, it can’t find the original columns from the derived table to match against the imported table.

With that said, I did do a workaround where I merged the original table back into the imported table, then filtered the rows down to what I needed, then exported the table to SQL Table C. This is working for me, but it seems pasted together.

Let me know what I did wrong! :slight_smile:
Thanks
Flowy

What if you derive Table E from Table C and re-import SQL Table B in Table E? In this case you will keep Table D with all the calculations, and you will be able to use it with the re-imported SQL Table B.

Also, in this case you won’t need to do the synchronization explicitly, as Table E will be derived from Table C, which by definition makes it synchronized.
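
To make the suggested layout concrete, here is a hedged pandas sketch of the final data flow; `table_c_df`, `table_d_df`, the join column `business_key`, and the generated `primary_key` column are all hypothetical names standing in for whatever the real project uses:

```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@localhost/mydb")  # hypothetical

# Hypothetical in-memory stand-ins for the EasyMorph tables.
table_c_df = pd.DataFrame({"business_key": [101, 102], "value": ["a", "b"]})
table_d_df = pd.DataFrame({"business_key": [101, 102], "calc": [1.5, 2.5]})

# "Table C": export its edited data; the database assigns the primary keys.
table_c_df.to_sql("sql_table_b", engine, if_exists="append", index=False)

# "Table E": runs only after the export above, so it sees the fresh keys.
# This is the equivalent of deriving Table E from Table C in EasyMorph.
table_e_df = pd.read_sql_table("sql_table_b", engine)

# "Table D" keeps all of its calculations; it only looks up the new primary key.
table_d_df = table_d_df.merge(table_e_df[["business_key", "primary_key"]],
                              on="business_key", how="left")

# Finally, export the enriched Table D to SQL Table C.
table_d_df.to_sql("sql_table_c", engine, if_exists="append", index=False)
```

Because the re-import lives in its own derived table (E), Table D is never replaced by the import, and the ordering is enforced by the derivation itself rather than an explicit Synchronize action.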

Nice! Let me give that a try and see if it works!

Thank you! That worked amazingly. I didn’t think about how deriving the new table would make it wait.

Great solution!