
I want to save web pages for offline reading. Currently I save them one at a time with Firefox. For bulk saving I want to automate the process with a script (or perhaps a website copier such as webhttrack?). From the terminal I can save the .html file of a URL (using wget URL), but I can't view the page properly because the images, .js files, and so on are missing.
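
For reference, GNU wget does have flags aimed at exactly this problem; a minimal sketch, using one of the question's own URLs as a placeholder:

    # -p (--page-requisites): also download the images, CSS, and JS the page needs
    # -k (--convert-links):   rewrite links so the saved copy works offline
    # -E (--adjust-extension): save HTML with a .html suffix for local viewing
    # -H (--span-hosts):      allow page assets hosted on other domains (e.g. CDNs)
    wget -p -k -E -H https://askubuntu.com/posts/1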



Sometimes I want to save numbered pages, e.g. https://askubuntu.com/posts/1, https://askubuntu.com/posts/2, https://askubuntu.com/posts/3, https://askubuntu.com/posts/4, ... (like mirroring) in one shot.
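
Assuming a bash-like shell, brace expansion can generate those numbered URLs in one shot (a sketch; the range 1..4 is illustrative):

    # The shell expands {1..4} before wget runs, so wget receives four URLs
    # and fetches each page together with its assets.
    wget -p -k -E https://askubuntu.com/posts/{1..4}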



How can I bulk save web pages with all the files necessary to view them properly?
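
Since the question mentions webhttrack: its command-line sibling httrack (the same engine behind the web front end) can mirror a set of pages with their assets. A minimal sketch; the output directory and the filter pattern here are illustrative:

    # -O sets the output directory; the "+" scan rule keeps the crawl on the posts.
    httrack "https://askubuntu.com/posts/1" -O ./mirror "+askubuntu.com/posts/*"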



Answers

There's a great little Firefox add-on called ScrapBook that will do what you want. Just install it by clicking the Add to Firefox button on the add-on's page over at Mozilla.




