I’ve been trying out the Salesforce Connector, and it’s looking good!
Here are some comments:
Authorization URL: It works with production orgs (the authorization URL begins with login.salesforce.com, which is what the “Authorize” button generates), but not with sandbox orgs (where the authorization URL begins with test.salesforce.com). Sandbox orgs are used for testing features before moving them to production, so being able to connect to a sandbox org is important, in my opinion. I tried manually changing “login” to “test” in the URL that EasyMorph generates: I can log in, but the connector doesn’t get authorized. Could there be two buttons, “Authorize (Production)” and “Authorize (Sandbox)”, so that the URL starts with either login.salesforce.com or test.salesforce.com accordingly?
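As a sketch of what the two buttons could generate (a hypothetical helper; the endpoints are Salesforce’s standard OAuth 2.0 authorization URLs, but the function name and parameters are made up for illustration):

```python
from urllib.parse import urlencode

def authorize_url(client_id: str, redirect_uri: str, sandbox: bool = False) -> str:
    """Build the Salesforce OAuth 2.0 authorization URL.

    Production orgs authorize via login.salesforce.com;
    sandbox orgs via test.salesforce.com.
    """
    host = "test.salesforce.com" if sandbox else "login.salesforce.com"
    params = urlencode({
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
    })
    return f"https://{host}/services/oauth2/authorize?{params}"

# "Authorize (Production)" vs. "Authorize (Sandbox)":
print(authorize_url("my_client_id", "https://localhost/callback"))
print(authorize_url("my_client_id", "https://localhost/callback", sandbox=True))
```

The only difference between the two flows is the host; everything else in the request stays the same.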
Custom OAuth app: when trying to connect with a custom OAuth app, I get the following error in the browser:
Refresh token: does this connector automatically manage refresh tokens?
Connector selection in Import/Export actions: The connector dropdown in the Import/Export actions only allows selecting “Salesforce”-type connectors. If we want to keep using a “Web Location” connector to connect to Salesforce (for example, to keep a custom callback URL), why can’t we select it from this dropdown? Even though it’s a “Web Location” connector, it’s being used for Salesforce, so it should still work correctly, right?
OAuth scopes: Here are the OAuth scopes that the default EasyMorph OAuth app has:
If we have a custom OAuth App and we want to use the Salesforce actions in EasyMorph, we have to add all these OAuth scopes to our custom OAuth app for it to work correctly, right?
Record limit: Is there a limit to the number of records that can be imported/exported? Are you always using the REST API, or do you use the Bulk API for operations with a significant number of records (for example, 10,000 or more)?
API usage: each import/export, and each load of the object/field selector, makes API calls to Salesforce. A Salesforce org has a limit on the number of calls it can make per rolling 24-hour period, so an API usage counter showing the current state would be very useful. This could be, for example, a button in the Salesforce Connector that shows the current API usage. Salesforce includes this information in every API response.
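For reference, the usage information mentioned above arrives in the Sforce-Limit-Info response header of REST API calls (e.g. api-usage=18/15000). A usage counter could parse it like this (hypothetical helper):

```python
import re

def parse_api_usage(header: str):
    """Parse Salesforce's Sforce-Limit-Info header, e.g. 'api-usage=18/15000'.

    Returns (calls_used, call_limit) as integers.
    """
    match = re.search(r"api-usage=(\d+)/(\d+)", header)
    if not match:
        raise ValueError(f"unexpected header format: {header!r}")
    return int(match.group(1)), int(match.group(2))

used, limit = parse_api_usage("api-usage=18/15000")
print(f"{used}/{limit} API calls used")  # 18/15000 API calls used
```

Since the header comes back on every REST call the connector already makes, displaying it would not cost any extra API calls.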
Export - only insert for now? I was able to insert new records into Salesforce, but when I attempted to update records (including the record Id to do the match, followed by the fields that have to be updated), I get the following error:
I understand that only “insert” for new records is available at the moment? It would be great to have “update” for existing records as well. There’s also another mode called “upsert”: it inserts records without an Id and updates fields where the imported Id matches an existing Id, which lets you “insert” and “update” at the same time.
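For reference, the REST API exposes upsert as a PATCH against an external-ID field of the object: matching records are updated, non-matching ones are inserted. A sketch of the request path (the object and field names are hypothetical examples):

```python
def upsert_path(api_version: str, sobject: str, ext_id_field: str, ext_id_value: str) -> str:
    """Build the REST upsert path. PATCH this URL with the field values as JSON.

    If a record with this external ID exists, it is updated;
    otherwise a new record is inserted.
    """
    return f"/services/data/v{api_version}/sobjects/{sobject}/{ext_id_field}/{ext_id_value}"

print(upsert_path("52.0", "Account", "External_Id__c", "ACC-001"))
# /services/data/v52.0/sobjects/Account/External_Id__c/ACC-001
```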
I think the description of the “Import” action should specify which ingest operations (insert / update / upsert) are allowed.
We are handling this transparently. The outbound batch size is monitored, and when it approaches ~90 MB, the current bulk job is closed and a new one is created, starting with the rest of the dataset. However, we were not able to test this thoroughly (due to the Salesforce dev account quotas), so it would be awesome if you could check whether this logic actually works with real data.
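A minimal sketch of that splitting logic (assuming rows are already serialized to text lines and a job is closed when the next row would push the batch past the limit; names and the tiny example limit are illustrative only):

```python
def split_into_jobs(rows, max_bytes=90 * 1024 * 1024):
    """Group serialized rows into batches, each staying under max_bytes.

    Mirrors the described behavior: when a batch would exceed the limit,
    close the current bulk job and start a new one with the rest.
    """
    batch, batch_size = [], 0
    for row in rows:
        row_bytes = len(row.encode("utf-8")) + 1  # +1 for the newline
        if batch and batch_size + row_bytes > max_bytes:
            yield batch
            batch, batch_size = [], 0
        batch.append(row)
        batch_size += row_bytes
    if batch:
        yield batch

# Example with an artificially small 10-byte limit:
jobs = list(split_into_jobs(["aaaa", "bbbb", "cccc"], max_bytes=10))
print(jobs)  # [['aaaa', 'bbbb'], ['cccc']]
```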
PS: Be sure to use the latest beta build, because yesterday (8.10.2021) there were many Salesforce-related bugfixes.
Hi - usually the smallest is 50, but we do have some cases where we do batches as small as 1 record at a time (using Mass Action Scheduler for some of these). Let me know if I can provide any further info. Thanks!
Why only 1 row? Are your column values so big that 1 row of data can exceed 100 MB (the export limit in Salesforce)?
Also, if you could take the latest beta build and try to export a big table (at least 1 million rows by 30-50 columns) into Salesforce with it, that would be really helpful. We can’t test this ourselves because of the limits of our dev account, so you would really help us by testing a large data export.
Hi - we have some automations that run when records are inserted and we get a lot of limit errors if we try to process more than 1 record in a batch, so this is the primary reason we have to impose a limit. Will try and run the export and circle back as well!