Hi guys! I have a server on a pretty reliable network, with a 1 Gbps upload verified from several different connections in the same area. However, in some countries where connectivity is very unreliable, the connection slowly dies and stays dead halfway through the transfer. Is there a client that can adapt to or detect an unreliable connection and retry/resume as soon as that happens, without losing the transfer? I find FileZilla sometimes stalls halfway through and just sits there hung without doing much. And sometimes, after reconnecting, it simply overwrites the file from the beginning without prompting. Is there a better system or client for transferring large files in a resumable way over the internet?

Thanks!

  • Menschlicher_Fehler@feddit.org · 4 points · 24 hours ago

    If you are on Windows, take a look at Robocopy.

    /z Copies files in restartable mode. In restartable mode, should a file copy be interrupted, robocopy can pick up where it left off rather than recopying the entire file.

    /r:<n> Specifies the number of retries on failed copies. The default value of n is 1,000,000 (one million retries).

    /w:<n> Specifies the wait time between retries, in seconds. The default value of n is 30 (wait time 30 seconds).
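
    Putting those together, a sketch of a resumable transfer with a shorter retry loop (the source and destination paths here are just placeholders) might look like:

        robocopy D:\bigfiles \\backupserver\share\bigfiles /z /r:20 /w:15

    /z lets an interrupted file pick up where it left off instead of restarting from zero, and the lower /r and /w values keep robocopy from waiting on a dead link for hours, as it would with the defaults.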

    • iturnedintoanewt@lemmy.world (OP) · 2 points · 24 hours ago

      Thanks! I’m looking for a Linux equivalent of this… Not sure if there’s an SFTP client that can reliably copy a bunch of large files, or whether I’d need to go with some other protocol.