Hi there, we created this flow some time ago and it worked fine until this week.
Now an error appears when we try to transfer a 2.6 GB file.
The process always stops at 2.09 GB.
We tested the connection and tried sending with other alternatives (for example, S3), and those work fine.
Let me know if you have any advice for this case.
We will look into this issue.
We’ve just tested uploading a 4.2 GB file via SFTP.
For testing, we used the ssh server SSH-2.0-OpenSSH_8.2p1 running on Ubuntu 20.04.4 LTS. The file system is ext4.
For the client, we used the latest version of the EasyMorph Desktop.
And everything works fine.
So, the problem seems related to your ssh/sftp server. It could be server misconfiguration or other limitations (e.g., disk quota, no free space).
What type and version of SSH/SFTP server are you using?
Which EasyMorph Desktop version are you using?
Hi there, I found something interesting. I don’t know if this behavior is specific to EasyMorph, but when I upload the file manually or with Python, neither creates a temporary file.
When I use EasyMorph, it creates a temporary file in the SFTP folder that grows little by little until the transfer completes. Sending to another SFTP server or to a bucket works well. Only with one provider does the process always stop at 1.99 GB, so I think it is a configuration issue. The error does not appear with the other two options because they do not create a temp file. Let me know if there is anything I can do; I am also asking the SFTP provider whether there is a setting for temp files.
Yes, when EasyMorph uploads a file it gives it a temporary name and renames it to the target name when uploading finishes. Please check with your SFTP provider if there are any settings that affect such behavior.