Announcement: Salesforce connector and actions

This topic was moved from the Beta #lounge.

Salesforce connector and actions are ready for testing and use.

Connector

The connector is self-explanatory and has no settings to configure. Just press Authorize to go through the Salesforce authorization before use. Custom OAuth clients are supported.


Import

The “Import from Salesforce” action allows:

  • Importing selected fields of the selected object, as well as fields of its child and parent objects, in a single query
  • Specifying a filtering condition using the SOQL syntax
  • Fetching only the top N rows
  • Executing a custom SOQL query (see the example below)
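
To make the first and last points concrete, a query of the kind the action can build, or accept in custom-query mode, might look like this. The Contact, Account, and Cases objects, the filter, and the row limit are purely illustrative:

    # Illustrative SOQL: selected fields of the queried object (Contact),
    # a parent field (Account.Name), a child subquery (Cases),
    # a filter condition, and a row limit.
    soql = (
        "SELECT Id, FirstName, LastName, Account.Name, "
        "(SELECT Subject FROM Cases) "
        "FROM Contact "
        "WHERE Account.Industry = 'Banking' "
        "LIMIT 100"
    )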

Export

Features of the “Export to Salesforce” action:

  • Export a dataset from EasyMorph to a Salesforce object table
  • Map columns in EasyMorph to fields in Salesforce automatically (when they have the same names) or explicitly (when they have different names)
  • Retrieve a list of rows that failed to export, if any
  • Export into child/parent fields of the target object (names of such fields must be entered manually; a picker isn’t available)

The connector and actions are already available in the latest beta build.

Please try it and leave your comments, bug reports, and feature suggestions in this thread.

Update / delete

The “Update/delete records in Salesforce” action has two commands:

  • Update records
  • Delete records

Both commands require specifying a column in EasyMorph that contains the IDs of the Salesforce records to be updated or deleted.
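
For illustration, here is roughly what updating or deleting a record by Id looks like at the Salesforce REST API level. This is a minimal sketch only: the org URL, API version, object, and field values are placeholders, and the action may batch the calls differently.

    # Sketch of per-record update/delete via the Salesforce REST API.
    # The org URL, API version, object, and values are placeholders.
    import requests

    INSTANCE = "https://yourorg.my.salesforce.com"
    HEADERS = {"Authorization": "Bearer <access token>",
               "Content-Type": "application/json"}

    record_id = "001xx000003DGb2AAG"  # supplied by the ID column, one per row

    # Update: PATCH the record identified by its Id.
    requests.patch(f"{INSTANCE}/services/data/v52.0/sobjects/Account/{record_id}",
                   json={"Name": "New name"}, headers=HEADERS)

    # Delete: DELETE the same resource.
    requests.delete(f"{INSTANCE}/services/data/v52.0/sobjects/Account/{record_id}",
                    headers=HEADERS)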

Hello!

I’ve been trying out the Salesforce connector, and it’s looking good! :slight_smile:

Here are some comments:

Connector

  1. Authorization URL: It works with production orgs (the URL for authorizing begins with login.salesforce.com, which is the case when pressing the “Authorize” button), but not with sandbox orgs (in that case, the URL for authorizing begins with test.salesforce.com). Sandbox orgs are used for testing features before moving them to production orgs, so being able to connect to a sandbox org is important, in my opinion. I’ve tried changing “login” to “test” manually in the URL that EasyMorph generates; I get to log in, but the connector doesn’t get authorized. Could there be two buttons for authorizing, “Authorize (Production)” and “Authorize (Sandbox)”? Depending on that, the URL would start with either login.salesforce.com or test.salesforce.com.

  2. Custom OAuth app: when trying to connect with a custom OAuth app, I get the following error in the browser:

[screenshot of the browser error]

I guess that’s because the callback URL in my custom OAuth app is different from the default one in the EasyMorph connector. Does the default callback URL have to be “https://login.salesforce.com/services/oauth2/authorize”? Could this be customized?

  3. Refresh token: does this connector automatically manage refresh tokens?

  4. Connector selection in Import/Export actions: The connector dropdown in the Import/Export actions only allows selecting “Salesforce”-type connectors. If we want to keep using a “Web Location” connector to connect to Salesforce (for example, to keep a custom callback URL), why can’t we select it from this dropdown? Even though it’s a “Web Location” connector, it’s being used for Salesforce, so it should still work correctly, right?

  5. OAuth scopes: Here are the OAuth scopes that the default EasyMorph OAuth app has:

[screenshot of the OAuth scopes]

If we have a custom OAuth app and we want to use the Salesforce actions in EasyMorph, we have to add all these OAuth scopes to our custom OAuth app for it to work correctly, right?

Import/Export Actions

  1. Record limit: Is there a limit to the number of records that can be imported/exported? Are you always using the REST API, or do you use the Bulk API for operations with a significant number of records (for example, 10,000 or more)?

  2. API usage: With each import/export, and when loading the object/field selector, API calls are made to Salesforce to obtain this information. There is a monthly limit on the number of calls that can be made in a Salesforce org, so it would be very useful to have an API usage counter showing the current state. This could be, for example, a button in the Salesforce connector that lets you see the current API usage. Salesforce always includes this information in the response when a call is made (see the sketch at the end of this post).

  3. Export: only insert for now? I was able to insert new records into Salesforce, but when I attempted to update records (including the record Id to do the match, followed by the fields that have to be updated), I got an error.

I understand that only “insert” for new records is available at the moment? It would be great to have “update” for existing records as well. There’s also another mode called “upsert”: it inserts records without an Id and updates fields where an imported Id matches an existing Id, which allows you to “insert” and “update” at the same time.

I think the description of the “Export” action should specify which ingest operations (insert / update / upsert) are allowed.
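
For reference, the counter I mean is already attached to every REST response. Here is a minimal sketch of reading it; the org URL, API version, and token are placeholders:

    # Salesforce reports API usage in the Sforce-Limit-Info header of each
    # REST response, e.g. "api-usage=18/15000" (calls used / monthly limit).
    import requests

    resp = requests.get(
        "https://yourorg.my.salesforce.com/services/data/v52.0/limits",
        headers={"Authorization": "Bearer <access token>"},
    )
    usage = resp.headers["Sforce-Limit-Info"]   # e.g. "api-usage=18/15000"
    used, limit = usage.split("=")[1].split("/")
    print(f"API calls used: {used} of {limit}")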

Thanks for the feedback, Roberto! Answering your questions:

Refresh tokens: The connector uses the "offline" connection mode, so technically tokens don't expire. But even if they did, the connector can still handle it.

Connector selection: For now, that's not supported. Use separate connectors for the Salesforce actions and web requests.

OAuth scopes: That's correct.

Record limit: Records are inserted using the Bulk API, so the export action can insert large amounts of data rather quickly.
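
For the curious, the overall flow is roughly the following. This is a minimal sketch assuming the Bulk API 2.0 ingest sequence; the org URL, API version, token, and sample data are placeholders:

    # Sketch of a Bulk API 2.0 insert (assumed flavor; all values below
    # are placeholders).
    import requests

    BASE = "https://yourorg.my.salesforce.com/services/data/v52.0"
    AUTH = {"Authorization": "Bearer <access token>"}

    # 1. Create an ingest job for the target object.
    job = requests.post(f"{BASE}/jobs/ingest", headers=AUTH,
                        json={"object": "Account", "operation": "insert"}).json()

    # 2. Upload the dataset as CSV; column headers map to field names.
    csv_rows = "Name,Industry\nAcme,Banking\nGlobex,Energy\n"
    requests.put(f"{BASE}/jobs/ingest/{job['id']}/batches",
                 headers={**AUTH, "Content-Type": "text/csv"}, data=csv_rows)

    # 3. Mark the upload complete so Salesforce starts processing.
    requests.patch(f"{BASE}/jobs/ingest/{job['id']}",
                   headers=AUTH, json={"state": "UploadComplete"})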

API usage counter: Interesting suggestion. We'll check it out.

Update/upsert: Updating/deleting rows is in the works and will be available later.

Hello, @roberto!

When creating a custom Salesforce Connected App, you should set the following callback URL:

http://localhost:1234

@roberto,

We’ve added the ability to choose the sandbox environment in the connector. The beta version has been updated. Can you please check that it works as intended if you select the sandbox environment?

Hello,

Yes, it works perfectly; I was able to authorize in my sandbox! Thanks :slight_smile:


Working great so far in the sandbox. I love the ability to map column names (even without SQL).

Quick question: is there a way to control the batch size? We have some custom flows/processes that fail if the batch size is too large.

Thank you for the feedback!

What’s the smallest batch size that you use?

We handle this transparently. The outbound batch size is monitored, and when it approaches ~90 MB, the current bulk job is closed and a new one is created, starting with the rest of the dataset. But we were not able to test this thoroughly (due to the Salesforce dev account quotas), and it would be awesome if you could check whether this logic actually works with real data :slight_smile:
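
In simplified form, the logic is roughly this. It is a sketch only: submit_job is a hypothetical stand-in for the create/upload/close job sequence, and sizes are approximated in characters:

    # Simplified sketch of the size-based job splitting described above.
    SIZE_CAP = 90 * 1024 * 1024  # ~90 MB per bulk job

    def export_in_jobs(csv_header, csv_rows, submit_job):
        batch, size = [csv_header], len(csv_header)
        for row in csv_rows:  # rows are pre-serialized CSV lines
            if size + len(row) + 1 > SIZE_CAP:
                submit_job("\n".join(batch))           # close the current job
                batch, size = [csv_header], len(csv_header)  # start a new one
            batch.append(row)
            size += len(row) + 1                       # +1 for the newline
        if len(batch) > 1:
            submit_job("\n".join(batch))               # flush the remainder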

PS: Be sure to use the latest beta build, because yesterday (8.10.2021) there were many Salesforce-related bug fixes.

Hi - usually the smallest is 50, but we do have some cases where we are doing batches as small as 1 record at a time (using mass action scheduler for some of these). Let me know if I can provide any further info. Thanks!

@shippy140,

Why only 1 row? Are your column values so big that 1 row of data can be more than 100 MB (the export limit in Salesforce)?

Also, if you could take the latest beta build and try to export a big table (at least 1 million rows by 30-50 columns) into Salesforce with it, that would be really helpful. We can’t test it ourselves because of the limits of our dev account. You would really help us with testing a large data export.

Thanks!

Hi - we have some automations that run when records are inserted, and we get a lot of limit errors if we try to process more than 1 record in a batch, so this is the primary reason we have to impose a limit. I'll try to run the export and circle back as well!