Hi guys! I have a server on a reliable network with a verified 1 Gbps upload (tested over several connections in the same area). However, in some countries where connectivity is very unreliable, the connection slowly dies and stays dead halfway through the transfer. Is there a client that can detect or adapt to an unreliable connection and retry/resume as soon as it drops, without losing the transfer? FileZilla sometimes stalls halfway through and just hangs without doing much, and sometimes after reconnecting it overwrites the file from the beginning without prompting. Is there a better system or client for transferring large files over the internet in a resumable way?
Thanks!


If you are on Windows, take a look at Robocopy.
Thanks! I’m looking for a Linux equivalent of this… Not sure whether there’s an SFTP client that can reliably copy a batch of large files, or whether I’d need a different protocol.
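On Linux, rsync over SSH is the usual answer for this: `--partial` keeps half-transferred files instead of deleting them, and `--append-verify` resumes them and re-checksums the appended data. rsync exits with an error when the connection dies, so you can wrap it in a retry loop. A minimal sketch (the host, paths, retry count, and delay are placeholders, not anything from this thread):

```shell
#!/usr/bin/env bash
# Hypothetical wrapper: retries an rsync transfer until it completes,
# resuming partial files instead of restarting from scratch.
#   --partial        keep partially transferred files on interruption
#   --append-verify  resume partial files and verify the appended data
#   --timeout=60     give up on a stalled connection after 60s idle
transfer_with_resume() {
    local tries=0 max_tries="${MAX_TRIES:-20}"
    until rsync --partial --append-verify --timeout=60 "$@"; do
        tries=$((tries + 1))
        if [ "$tries" -ge "$max_tries" ]; then
            echo "giving up after $tries attempts" >&2
            return 1
        fi
        echo "transfer interrupted, retrying ($tries/$max_tries)..." >&2
        sleep "${RETRY_DELAY:-5}"
    done
}

# Example invocation (placeholder host and paths):
# transfer_with_resume -avz ./bigfiles/ user@example.com:/srv/incoming/
```

Since rsync runs over plain SSH, it works against any server you already reach with SFTP, no extra daemon needed. lftp (with `mirror --continue`) is another option if you want to stay on the SFTP protocol itself.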