Asked Saturday, February 25, 2023 · answers: 1 · hits: 16554

I am looking for a command line tool that can download multiple urls with multiple threads e.g.



wget2 -n 5 http://stackoverflow.com/ http://askubuntu.com/ http://bobo.com/


where -n is the number of threads. I have come across Axel, but when I give it multiple URLs, it only downloads one.



I will be downloading HTML files.


Answers

None of the tools suggested above or in the linked answers accept two distinct URLs; they only accept multiple URLs that are mirrors of the same file.



I've found a few programs that do this:



The best is puf (apt-get install puf); use it as puf url1 url2, etc.



Then there is HTTrack, which requires a lot of tinkering and has some limits I can't get past (speed and connection limits).



DownThemAll for Firefox is very good if you don't need a command line app.



UPDATE



I've since found that puf has a tendency to crash. The best solution is to create a .txt file with one URL per line, e.g.



http://google.com/
http://yahoo.com/


Save that as urls.txt (for example), then run the command:



cat urls.txt | xargs -n 1 -P 10 wget -q


-n 1 tells xargs to pass one URL from the file to each wget invocation.

-P 10 sets how many URLs are downloaded in parallel (here, up to 10 at once).
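You can check the xargs fan-out offline by substituting echo for wget — same -n 1 / -P 10 pattern, no network needed (echo here is just a stand-in for the real download command):

```shell
# Build the URL list, one URL per line, as described above.
printf '%s\n' 'http://google.com/' 'http://yahoo.com/' > urls.txt

# xargs passes one URL per invocation (-n 1) and runs up to 10 at once (-P 10).
# Swap echo back to "wget -q" to perform the actual downloads.
xargs -n 1 -P 10 echo fetched < urls.txt
```

Note that with -P the completion order is not guaranteed, which is harmless for wget (each URL saves to its own file) but matters if you pipe the combined output somewhere order-sensitive.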


[#28223] Answered Sunday, February 26, 2023 by restlerfin