So, at work I've had to copy a large number of files from my old server to a new one. There are two data backups to transfer, together roughly 200,000 files and over 20 GB. It's a large number of small files with no large ones, so I think scp is the right way to do this transfer.
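For context, the command I run from server2 looks roughly like this (the username and paths are placeholders, not my real ones):

    # Pull the first backup from server1 onto server2, recursively.
    # user, server1, and /data/backup1 are made-up names.
    scp -r user@server1:/data/backup1 /data/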
However, when I SSH into server2 and scp the files over from server1, I can see the files being copied in my terminal. But I had to drop the Wi-Fi connection when I left the office, so the terminal now shows a broken pipe. I have no way to know whether the scp process is still running or how much is left. I can check the directory size over time, but I was wondering if there is a better way? Can I get the output of the running process back on my terminal?
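For what it's worth, here is the directory-size check I mentioned, plus a way to see whether the scp survived the disconnect (assuming the destination is /data/backup1, a made-up path):

    # Is an scp process still alive on server2?
    pgrep -a scp

    # How much of the ~20 GB has landed so far?
    du -sh /data/backup1

    # Re-run every 60 seconds to watch it grow:
    watch -n 60 du -sh /data/backup1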
I did this for my first data set; I'm yet to start the process for the second one. Is there anything I can do before or while launching the command to prevent or recover from such issues with large scp transfers?
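One option I'm considering for the second run, sketched with made-up paths: detach the command from the terminal with nohup so a dropped connection can't kill it, and send output to a log:

    # Launch scp immune to hangups; stdout/stderr go to a log file.
    nohup scp -r user@server1:/data/backup2 /data/ > ~/scp-backup2.log 2>&1 &

    # From any later session, check on it:
    pgrep -a scp
    tail -f ~/scp-backup2.log

Note that scp only draws its progress meter on a real terminal, so the log mostly catches errors rather than a percentage.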
Also, roughly how long is this likely to run? One server is on AWS and the other is on Google Cloud, if that even matters.
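Back-of-envelope only, with assumed numbers (real WAN throughput and scp's per-file overhead vary a lot):

    # Bulk transfer time: 20 GB at an assumed 5 MB/s sustained.
    echo "bulk:     $(( 20 * 1024 / 5 / 60 )) min"        # ≈ 68 min
    # Per-file overhead: 200000 files at an assumed 50 ms each.
    echo "per-file: $(( 200000 * 50 / 1000 / 60 )) min"   # ≈ 166 min

Under those assumptions it would be a matter of hours, not days.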
Use screen (or tmux) on server2, which lets you detach from your shell and reconnect later.
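A minimal flow, assuming screen is installed on server2 (the session name and paths are arbitrary):

    # Start a named session on server2 and run the copy inside it:
    screen -S backup2
    scp -r user@server1:/data/backup2 /data/

    # Detach with Ctrl-a d; scp keeps running after you log out.
    # Later, from a fresh SSH session, reattach:
    screen -r backup2

    # tmux equivalent: tmux new -s backup2 ... detach with Ctrl-b d,
    # then tmux attach -t backup2

This also brings the live progress display back when you reattach, which covers the "can I get the display back" part for future runs. It can't recover a session that wasn't started under screen, though.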