I use BigQuery as a data warehouse. I know loading to BigQuery is in the pipeline, but in my use case I prefer to first load the data to Google Cloud Storage and then batch-load it from Google Cloud Storage into BigQuery.
Currently I use a bit of a hack: for data preparation I use Power BI Desktop, which generates a CSV that I then load to Google Cloud Storage using Python.
If EasyMorph could natively load to Google Cloud Storage without Python, that would add a competitive advantage compared to other tools.
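For reference, the hack described above can be sketched roughly like this, assuming the `google-cloud-storage` and `google-cloud-bigquery` client libraries; the bucket, blob, and table names are placeholders:

```python
# Sketch of the workflow above: upload a Power BI-generated CSV to
# Google Cloud Storage, then batch-load it into BigQuery.
# Assumes google-cloud-storage and google-cloud-bigquery are installed;
# all bucket/dataset/table names below are placeholders.

def gcs_uri(bucket: str, blob_name: str) -> str:
    """Build the gs:// URI that a BigQuery load job expects."""
    return f"gs://{bucket}/{blob_name}"

def upload_and_load(csv_path: str, bucket: str, blob_name: str, table_id: str) -> None:
    # Imports kept local so the pure helper above can be used without
    # the cloud libraries installed.
    from google.cloud import storage, bigquery

    # 1) Upload the CSV to Google Cloud Storage.
    storage.Client().bucket(bucket).blob(blob_name).upload_from_filename(csv_path)

    # 2) Batch-load from Cloud Storage into BigQuery.
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the header row
        autodetect=True,       # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    load_job = client.load_table_from_uri(
        gcs_uri(bucket, blob_name), table_id, job_config=job_config
    )
    load_job.result()  # wait for the batch load to finish
```

A call like `upload_and_load("export.csv", "my-bucket", "exports/export.csv", "my-project.my_dataset.my_table")` would then replace the manual two-step process.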
EasyMorph can easily import data from BigQuery and export it in different formats. That’s great, but a direct Google BigQuery export would be even nicer and cleaner!
A native BigQuery integration (without an ODBC driver) would be a great way to achieve this.
That way, a data export to BigQuery with different methods like appending or updating rows would be awesome.
Hi there, an export into BigQuery would be a great piece of functionality.
At the moment EasyMorph can handle direct export into AWS Redshift and also Azure Synapse DWH, and it works amazingly. So a direct BigQuery export would be a nice feature to cover all three cloud connectors.
We started to work on the “Export to BigQuery” action.
Now we are considering creating a separate “Export to BigQuery” connector type that will require creating a service account in the Google Cloud console and then providing that account’s credentials to the connector.
Will this approach work for you? Maybe you have some use cases when a service account can’t be used and the OAuth authentication workflow is required?
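For context, service-account authentication usually means pointing the client at a JSON key file downloaded from the Google Cloud console. A minimal sketch of what such a connector might do under the hood, assuming the `google-auth` and `google-cloud-bigquery` libraries (the key-file path and field checks are illustrative, not EasyMorph's actual implementation):

```python
# Hypothetical sketch of service-account authentication for a BigQuery
# connector. The key-file path is a placeholder; the validation helper
# checks only the fields every service-account JSON key contains.

REQUIRED_KEY_FIELDS = ("type", "project_id", "private_key", "client_email")

def looks_like_service_account_key(info: dict) -> bool:
    """Minimal sanity check on a parsed service-account JSON key."""
    return info.get("type") == "service_account" and all(
        field in info for field in REQUIRED_KEY_FIELDS
    )

def make_bigquery_client(key_path: str):
    # Local imports: only needed when actually connecting.
    from google.oauth2 import service_account
    from google.cloud import bigquery

    creds = service_account.Credentials.from_service_account_file(key_path)
    # The key file carries its own project id, so no extra config is needed.
    return bigquery.Client(credentials=creds, project=creds.project_id)
```

This is the flow that needs no interactive OAuth consent screen, which is why it suits scheduled/server-side exports.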
Yes, that would be great for us as well. I double-checked with IT, and currently we don’t require the OAuth authentication.
However, it would of course be great to have that option in the future.
But our main issues would be solved with your approach. So thanks a million already.