Is there any way that I can read a few lines of a massive online CSV file using its URL from command line before I am going to download it?
A slightly quicker approach than @musher's would be to do this:
wget http://datagov.ic.nhs.uk/presentation/2014_03_March/T201403PDPI+BNFT.CSV -qO -
The -O - option writes the downloaded data to stdout in the terminal, while the -q option suppresses the progress messages and other output about the download. You would have to press Ctrl+C fairly quickly to stop it downloading, because as it is plain text it will download fairly quickly.
This might also work - it should download to stdout the first ten lines:
wget http://datagov.ic.nhs.uk/presentation/2014_03_March/T201403PDPI+BNFT.CSV -qO - | head -10
It should be fine as long as wget stops downloading once the first 10 lines have gone through head... I checked, and wget does seem to stop downloading after the first 10 lines have been read.
Edit: the wget command gets terminated by the SIGPIPE signal - see here
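You can see this SIGPIPE behaviour without touching the network; in the sketch below, yes simply stands in for wget as a long-running writer (this assumes bash, since PIPESTATUS is a bash-specific variable):

```shell
# `yes` writes lines forever; `head` exits after one line and closes the pipe,
# so `yes` is killed by SIGPIPE, just as wget is in the pipeline above.
yes | head -n 1
# In bash, a command killed by a signal exits with 128 + signal number,
# and SIGPIPE is signal 13, so the first pipeline element reports 141:
echo "${PIPESTATUS[0]}"
```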
You can change the number of lines shown by changing the number after head.
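For example, head accepts the count as head -3 or the more portable head -n 3; here is a quick local check using generated input rather than the CSV download:

```shell
# Generate five numbered lines and keep only the first three
printf '1\n2\n3\n4\n5\n' | head -n 3
# prints:
# 1
# 2
# 3
```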