So I have a Windows computer that is continuously collecting data for a physics experiment. The data is saved in a text file to which each new event is appended as a new line.
Every time I need to re-analyse the data, I transfer the file to an Ubuntu computer over SSH. This is fine at the beginning, but once the file passes 1 GB it becomes really time-consuming to transfer the whole file again and again, when only the last lines are new.
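For context, the transfer is currently just a plain full-file copy over SSH, roughly like the sketch below, run from the Ubuntu side (the hostname and paths here are made up):

```sh
# Plain full-file copy over SSH; this re-sends the entire file
# on every transfer, even though only the tail has changed.
scp daq@windows-daq:data/run.txt ~/analysis/run.txt
```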
To give you an idea of the scale: an experiment runs for around 3 h, the transfer takes ~5 min per GB, files typically max out at around 3 GB, and they contain on the order of tens of millions of lines.
If you have a solution that only works for Linux-to-Linux transfers, I am also interested; maybe I can get it running with Cygwin.