Hi guys! I have a server on a fairly reliable network with a 1 Gbps upload, verified from several different connections in the same area. However, in some countries where connectivity is very unreliable, the transfer slowly dies and stays dead halfway through. Is there a client that can detect an unreliable connection and retry/resume automatically, without losing the progress made so far? FileZilla sometimes stalls halfway through and just hangs without doing much. And sometimes, after I reconnect, it overwrites the file from the beginning without prompting. Is there a better system or client for transferring large files in a resumable way over the internet?
Thanks!


It sounds to me like you're going to love rsync. It can copy one or many large files, resume interrupted transfers, and even keep entire directories in sync.
Oh yes, and it's a standard Linux tool. I'm not sure whether there's a GUI version, so you may need to get comfortable with the terminal (which you probably already are if you're using Linux).
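As a rough sketch of what that looks like (hostname and paths here are made up, swap in your own), you can combine rsync's resume flags with a simple retry loop so a dropped connection just picks up where it left off:

```shell
# Keep retrying until the transfer finishes.
# --partial        keep partially-transferred files so they can be resumed
# --append-verify  resume by appending to the partial file, then verify
#                  the whole file's checksum at the end
# --timeout=30     treat a connection stalled for 30s as dead instead of
#                  hanging forever
# --progress       show per-file progress
# (user@server.example.com and the paths are placeholders)
until rsync --partial --append-verify --timeout=30 --progress \
      bigfile.iso user@server.example.com:/srv/uploads/; do
    echo "Transfer interrupted, retrying in 10s..." >&2
    sleep 10
done
```

rsync runs over SSH by default, so if you can `ssh user@server.example.com`, this should just work. The `until` loop re-runs rsync whenever it exits non-zero, and thanks to `--partial`/`--append-verify` each retry continues from the bytes already received rather than starting over.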