As a full-fledged programming language, Python has its own runtime and environment for coding, so you can use Python to prep data before you ever consider importing it into your data model. Python can also be used during data import, because entire queries, or individual steps within queries, can be written in Python. After importing the data into your model, Python can be used in Python visuals.

To use Python as the basis for an entire query, in Power BI choose Get Data | More | Other | Python Script. In the Power BI Query Editor, on the Transform tab, “Run Python Script” is the last icon on the right and inserts a Python script into your query as a step. On the Report tab within Power BI, one of the icons in the next-to-last row is the “Python script visual”; perhaps unsurprisingly, its icon is “Py”.
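To make those two query-side options concrete, here is a minimal sketch (the file path and column names are hypothetical). It assumes two Power BI behaviours: a Get Data | Python Script source exposes every pandas DataFrame defined in the script as a loadable table, and a “Run Python Script” step hands the current query to the script as a DataFrame named `dataset`.

```python
# Variant 1: Python as an entire query (Get Data | More | Other | Python Script).
# Every DataFrame defined here appears in the Navigator as a table you can load.
import pandas as pd

sales = pd.read_csv(r"C:\data\sales.csv", parse_dates=["OrderDate"])  # hypothetical file
sales["Margin"] = sales["Revenue"] - sales["Cost"]

# Variant 2: Python as a step in an existing query (Transform | Run Python Script).
# Power BI passes the current query in as a DataFrame called `dataset`; any
# DataFrame left at the end of the script becomes a selectable output of the step.
# result = dataset[dataset["Revenue"] > 0].copy()
# result["Margin"] = result["Revenue"] - result["Cost"]
```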
As a non-programmer, I can imagine a Python script that uncompresses a dset file.
Excel has Power BI built in behind the scenes (called Power Pivot).
You use “Get Data” to import data into the data model (not into a sheet).
There is almost no limit to the number of rows you can import; I have loaded 102 million (though you shouldn’t have too many columns).
Once you have tables in the data model, you can write “measures” (calculations) on them using the DAX language. It is very efficient and fast.
I then create pivot tables from those measures and share the workbook with accounting folks who don’t know EM. They often copy and paste parts of these for various purposes, and often add formulas alongside the pivot table.
In this case I reconcile data from external parties to our general ledger. We need to keep the Excel recons for audit purposes, and the auditors want to see how I got to the recon.
I am trying to figure out how to use EM pivots to get to the same end result. I am sure it could be done, but the learning curve… In Excel I already have the knowledge.
I totally understand that. We're considering different ways of integrating EasyMorph with Excel. Ideally, it should make it possible to pull datasets from EasyMorph Server right into Excel and maybe even refresh these datasets right from Excel. But we haven't decided yet on how exactly it's going to work.
As of now, I suppose the only way to do it is to export data from EasyMorph into a database table in MS SQL Server, and then pull it into Power Pivot. Starting from v4.5, EasyMorph has the "Bulk export to database" action for fast export into SQL Server (the regular "Export to database" action can be slow). So I would suggest trying that.
If you don't have an instance of SQL Server available, I believe the free SQL Server Express should be relatively easy to install.
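If it helps to see what that hand-off looks like, here is a rough sketch of the same pattern done with pandas rather than EasyMorph's own export actions: land the table in SQL Server, then pull it into Power Pivot. The instance name, database, table name and ODBC driver below are assumptions; adjust them to your setup.

```python
# A rough sketch of the hand-off: write a table to SQL Server with pandas,
# then pull it into Power Pivot from Excel. Instance, database, table name
# and ODBC driver are assumptions, not a prescribed setup.
import urllib.parse

import pandas as pd
from sqlalchemy import create_engine

# Local SQL Server Express instance with Windows authentication (assumed).
conn_str = urllib.parse.quote_plus(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost\\SQLEXPRESS;"
    "DATABASE=ReconStaging;"
    "Trusted_Connection=yes;"
)
engine = create_engine(f"mssql+pyodbc:///?odbc_connect={conn_str}")

df = pd.read_csv(r"C:\exports\gl_recon.csv")  # data to hand over (hypothetical file)
df.to_sql(
    "gl_recon",            # target table, overwritten on every run
    engine,
    if_exists="replace",
    index=False,
    chunksize=10_000,      # batch the inserts so large tables don't stall
)

# In Excel: Data | Get Data | From Database | From SQL Server Database,
# point it at localhost\SQLEXPRESS / ReconStaging and add "gl_recon" to the data model.
```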
It is good to hear you are thinking in this direction.
A connector to an EM dset will solve this problem. It would allow users to pull data from EM instead of having EM push it to Power BI. I mentioned that Excel and Power BI share the “Get Data” functionality, so one connector would allow users to pull data into either.
Pushing data into SQL is probably the more elegant approach for advanced scenarios, but it adds a layer of complexity that prevents 99% of non-coders from attempting it.
Most of my work just calls for a simple place to store data in one or two tables for import into Excel, with the tables overwritten each time (no complexity of upserting or maintaining a database).
Writing to a CSV file is such a simple solution, except that it is slow and the files are large. Hence my suggestion to use dset files, knowing they would be smaller and hoping they would be faster.
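For anyone who wants to put rough numbers on the “slow and large” point for their own tables, here is a small, self-contained Python sketch. The sample data is made up; it simply times a plain CSV write against a gzip-compressed one and reports the file sizes, since there is currently no way to read or write dset files outside EasyMorph (which is the point of this request).

```python
# A small sketch that times a plain CSV write against a gzip-compressed one and
# reports the resulting file sizes. The sample table is made up (1 million rows,
# roughly ledger-shaped); swap in a real extract to see your own numbers.
import os
import time

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1_000_000
df = pd.DataFrame({
    "Account": rng.integers(1000, 9999, size=n),
    "Amount": rng.normal(0, 1000, size=n).round(2),
    "Text": ["external party line item"] * n,
})

for path, kwargs in [("out.csv", {}), ("out.csv.gz", {"compression": "gzip"})]:
    start = time.perf_counter()
    df.to_csv(path, index=False, **kwargs)
    elapsed = time.perf_counter() - start
    print(f"{path}: {elapsed:.1f} s, {os.path.getsize(path) / 1e6:.1f} MB")
```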
Power BI really makes it simple to create a connector. It would not take you more than a few hours to figure out. Google “create custom connector Power BI”.