Best Workflow Practices for Importing/Exporting Huge Files Oct 2021

Hi, I'm using Power BI Premium as my main BI/dashboard tool. EasyMorph (EM) is much better than Power Query because it's way faster (even with Premium capacity).

I'm using an ODBC connector with a whitelisted IP to a remote Redshift DB and downloading huge CSVs (100M+ rows). I do all transforms locally in EM, and that works great. Then I need to assemble the tables into a complex schema in Power BI Desktop and publish the data model and dashboards to the Power BI Service on Premium capacity. So what is best practice?
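For context, here's roughly what the extraction step amounts to, written as a standalone Python/pyodbc sketch (the DSN, credentials, and table names are placeholders; in practice EM's ODBC connector handles this part):

```python
# Rough equivalent of the extraction step: pull from Redshift over ODBC
# and write a local CSV. DSN, credentials, and table names are placeholders.
import csv
import pyodbc

conn = pyodbc.connect("DSN=redshift_dsn;UID=user;PWD=password")  # whitelisted-IP ODBC DSN
cur = conn.cursor()
cur.execute("SELECT * FROM analytics.big_fact_table")  # 100M+ rows

with open("big_fact_table.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cur.description])  # header row
    while True:
        rows = cur.fetchmany(100_000)  # stream in chunks to keep memory flat
        if not rows:
            break
        writer.writerows(rows)

conn.close()
```

The options I've tried so far: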

  1. Exporting CSVs for subsequent import to PBI is super inefficient. Takes forever to load, save, refresh, etc. But it works …

  2. Exporting to a PBI dataset works well (fast load, no errors), but if I take this path I can only load one table into a dataset and then only load one dataset into a PBI model, unless I have read the latest posts on this incorrectly. For some weird reason, if I load an EM-created dataset into a PBI model, that's all I can load. I can't even open the Get Data or transform data window. Frustrating.

  3. Exporting to a local SQL Server database. This works much better once the data is finally loaded in PBI, but it is very cumbersome (e.g., I need to recreate each column in each table in SQL Server, correct? See the sketch after this list.) and seems like a long walk for a short drink.
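Regarding option 3: I assume the table creation can at least be scripted rather than recreated column by column by hand. A rough sketch of what I mean, where the connection string, table, and column definitions are placeholders:

```python
# Sketch: script the SQL Server staging tables instead of recreating the
# columns manually. Connection string, table, and column types are
# placeholders; adjust them to match the EasyMorph output.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=Staging;Trusted_Connection=yes;"
)
cur = conn.cursor()

# One CREATE TABLE per exported EasyMorph table
cur.execute("""
    IF OBJECT_ID('dbo.big_fact_table', 'U') IS NULL
    CREATE TABLE dbo.big_fact_table (
        order_id    BIGINT,
        order_date  DATE,
        customer_id INT,
        amount      DECIMAL(18, 2)
    );
""")
conn.commit()

cur.fast_executemany = True  # speeds up bulk inserts noticeably
rows = [(1, "2021-10-01", 42, 19.99)]  # in practice: the rows exported from EasyMorph
cur.executemany(
    "INSERT INTO dbo.big_fact_table (order_id, order_date, customer_id, amount) "
    "VALUES (?, ?, ?, ?)",
    rows,
)
conn.commit()
conn.close()
```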

Am I missing something? Are there other options? Any insights greatly appreciated.

Hi Rick,

If I'm reading it correctly, the problem with point 2) is that it's not possible to create a Power BI dataset with multiple tables in it from EasyMorph. Is that correct? Would it be possible to create an "empty" dataset with multiple (empty) tables in the Power BI Service and then populate those tables with EasyMorph? Note that the "Export to Power BI" action in EasyMorph has a setting to amend column definitions. If it's checked, the action will basically replace the dataset's column names and types with the ones being exported from EasyMorph.
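In case it helps, a multi-table "empty" dataset can be pre-created with the Power BI REST API (push datasets) and populated afterwards. A minimal sketch, assuming an Azure AD token with the Dataset.ReadWrite.All scope; the dataset, table, and column names are placeholders, and whether the "Export to Power BI" action works against a dataset created this way would need to be verified:

```python
# Sketch: create an empty Push dataset with multiple tables via the
# Power BI REST API, to be populated later (e.g. from EasyMorph).
# Dataset name, table/column definitions, and ACCESS_TOKEN are placeholders.
import requests

ACCESS_TOKEN = "<Azure AD token with Dataset.ReadWrite.All scope>"

dataset_def = {
    "name": "EasyMorphStaging",
    "defaultMode": "Push",
    "tables": [
        {
            "name": "Orders",
            "columns": [
                {"name": "OrderId", "dataType": "Int64"},
                {"name": "OrderDate", "dataType": "Datetime"},
                {"name": "Amount", "dataType": "Double"},
            ],
        },
        {
            "name": "Customers",
            "columns": [
                {"name": "CustomerId", "dataType": "Int64"},
                {"name": "Name", "dataType": "String"},
            ],
        },
    ],
}

resp = requests.post(
    "https://api.powerbi.com/v1.0/myorg/datasets",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=dataset_def,
)
resp.raise_for_status()
print(resp.json()["id"])  # id of the newly created (empty) dataset
```

Each table in a push dataset can then be populated independently (the API also has a per-table rows endpoint), which is the part an export tool would need to handle.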