Export to database error - value too long

Hi there,

I urgently need help with the "Export to database" action for PostgreSQL. I am getting the following error message.

Error: Export of row #2 failed with the following error: 22001: value too long for type character varying(32)
Source: action “Export to database”, module “Insurance Company”, table “Data not in Portal Table”

Here is a screenshot of the data I am trying to insert into the PostgreSQL table.

Thanks,
Aatash

I suspect field "name" allows only 32 characters but you're trying to insert a longer value. For instance, "Talcott Resolution Life Insurance Company" is 41 characters long.

I would suggest checking the character limits of the column types in the target database table and comparing them with the maximum text length of each text column in the EasyMorph table to be exported.
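To illustrate that comparison outside EasyMorph, here is a minimal Python sketch that measures the longest value in the "name" column against the varchar(32) limit (the sample values are hypothetical, except the one quoted above):

```python
# Sketch: find the longest value in the "name" column of the data to be
# exported, to compare against the target varchar(32) limit.
# Sample values are illustrative.
names = [
    "Acme Insurance",
    "Talcott Resolution Life Insurance Company",  # 41 characters
]
longest = max(names, key=len)
print(len(longest), longest)  # 41 Talcott Resolution Life Insurance Company
```

Any column whose longest value exceeds the declared limit will trigger the 22001 error on export.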

@andrew.rybka, does the row numbering in error messages start from 0 or from 1?

Here is the metadata from the database table “name” field. It seems it can accept 190 characters.

190 is the column width in pixels in EasyMorph.

I’m afraid it’s not possible to obtain the exact DB data type in EasyMorph. You need to open SQL Studio or a similar tool and generate a CREATE statement for the table - the statement will show the exact data types.
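If you can query PostgreSQL directly, the declared types can also be read from the standard `information_schema.columns` view instead of generating a CREATE statement. A hedged sketch (the table name and connection details are placeholders; psycopg2 is assumed as the driver):

```python
# Sketch: fetch declared column types and varchar limits from PostgreSQL.
# The table name and connection string in the usage note are placeholders.

def column_limits_query():
    """Return a query listing each column's data type and character limit."""
    return (
        "SELECT column_name, data_type, character_maximum_length "
        "FROM information_schema.columns "
        "WHERE table_name = %s"
    )

# Usage (requires a live database; psycopg2 assumed):
# import psycopg2
# with psycopg2.connect("dbname=mydb") as conn, conn.cursor() as cur:
#     cur.execute(column_limits_query(), ("data_not_in_portal_table",))
#     for name, dtype, max_len in cur.fetchall():
#         print(name, dtype, max_len)  # e.g. name, character varying, 32
```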

As an experiment, try restricting values in column "name" to 31 or fewer characters and exporting again. You can do this using the left() function, e.g.

[name] = left([name],31)

If it works then this is the cause of the error.

PS. Thank you for posting screenshots, they're very helpful.

You are correct. I was able to insert records after truncating the “name” values to 32 characters.

Thanks a lot.

Very good :slight_smile: You’re welcome!

This is very similar to an issue I have. I use the bulk export to SQL action, and when it fails because a column's data is too big for the database, it doesn't tell me which column. Some of my tables are very large, which makes it very difficult to find the issue. I've worked around this for now by building an EasyMorph project that looks up the database column sizes in the master database and compares them with the metadata of each EasyMorph column for the maximum width.
It seems like this could be done for me behind the scenes of the bulk export action. That would make my life a lot easier.

@dgudkov, row numbering in that error message starts with 0. It seems that we have to change that.

Hi @Justin_Grewe and welcome to the community!

When a column's data is too big for a database column, EasyMorph just displays the error that was generated by SQL Server.

We are trying to keep the Bulk export action as fast as possible, so it performs only those validations that are absolutely necessary.

Yeah, I understand the need for speed, but it makes using the bulk insert difficult when you have a large table with many columns. As I said, I was able to work around it by calling an EasyMorph project and passing it the table name and the EasyMorph column name. That project takes the max length of the column, looks up the field length in the master database, and returns true/false to the original project. This works, but it's reactionary: I only do it after EasyMorph Server fails, then I have to open the project and change it to call my "identify problem columns" project.
It seems like if I can do this myself, the program could do it behind the scenes. It is very fast to look up the table info in SQL.
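The pre-flight check described above can be sketched in a few lines: given the varchar limits looked up from the database and the rows about to be exported, name every column that would overflow before the bulk export runs. (Function, column names, and data here are illustrative, not EasyMorph's API.)

```python
# Sketch of the pre-export check described above: given varchar limits
# (e.g. fetched from information_schema) and the rows to be exported,
# name every column that would overflow. Names and data are illustrative.

def problem_columns(rows, limits):
    """Return [(column, longest_value_length, limit)] for overflowing columns."""
    problems = []
    for col, limit in limits.items():
        longest = max((len(str(r.get(col, ""))) for r in rows), default=0)
        if longest > limit:
            problems.append((col, longest, limit))
    return problems

limits = {"company": 32, "policy_no": 10}           # from the database
rows = [{"company": "A" * 40, "policy_no": "12345"}]
print(problem_columns(rows, limits))  # [('company', 40, 32)]
```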

We could add a separate action for checking the consistency of data types between EasyMorph and a target database. Its use would be optional.

Alternatively, it could be an option in the export action(s) that could be turned on and off when needed.

That would be great. No rush, I just wanted to share some feedback on the bulk export. I've been using EasyMorph at my work for the past year and have become somewhat of a guru. It's an excellent product that I recommend to everyone.

Appreciate your feedback, Justin. We absolutely want to hear back from our customers.

Thank you :slight_smile: