Asked Thursday, September 9, 2021 · answers: 1 · hits: 7683

I have a list of URLs in a .txt file, and I'm supposed to fetch the content of each one. To do this I decided to use cURL. With xargs curl < url-list.txt I can display the content of all the URLs in my terminal. With curl -o myFile.html www.example.com I can save only one file.



There is also the curl -O URL1 -O URL2 approach, but typing out every URL by hand would take too long.



How can I save multiple files at once?
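Staying close to the xargs approach from the question, one option is to have xargs invoke curl once per URL with -O, which names each saved file after the last path component of its URL. A minimal sketch (the two example.com URLs are placeholders; replace `echo curl` with `curl` to actually download):

```shell
# Create a sample URL list (one URL per line), mirroring url-list.txt.
printf '%s\n' 'https://example.com/a.html' 'https://example.com/b.html' > url-list.txt

# Dry run: -n 1 makes xargs run the command once per URL,
# so this prints the curl command that would be executed for each line.
xargs -n 1 echo curl -O < url-list.txt
```

This prints one `curl -O <url>` line per entry; dropping the `echo` performs the downloads.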



Edit:



#!/bin/bash
file="filename"

while read line
do
    curl -o "$line.html" "$line"
done < "$file"


I ran the bash script above and here is what happened:



[screenshot of the resulting output]



 Answers

Build a bash script that loops through your list of URLs and runs the curl command for each one.



#!/bin/bash
file="filename"

while IFS= read -r line
do
    # Use the last slash-separated component of the URL as the output name,
    # since a full URL contains slashes and cannot be used as a filename.
    outfile=$(echo "$line" | awk 'BEGIN { FS = "/" } ; { print $NF }')
    curl -o "$outfile.html" "$line"
done < "$file"
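The awk call just strips everything up to the last slash. If you prefer, bash parameter expansion can do the same without spawning an extra process; a sketch (the URL below is a hypothetical example, not from the original answer):

```shell
url='https://www.example.com/pages/index'

# ${url##*/} deletes the longest prefix matching '*/',
# leaving only the last path component of the URL.
outfile="${url##*/}"
echo "$outfile"   # index
```

Inside the loop this would replace the awk line with `outfile="${line##*/}"`.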

Answered Friday, September 10, 2021
brellked
