Closed 6 years ago.

Do you know a good software to download all PDF links in a web page? The operating system is Windows 7.

asked Mar 20, 2011 at 20:20

You can use wget and run a command like this:
wget --recursive --level=1 --no-directories --no-host-directories --accept pdf http://example.com
Or with the short options:
wget -r -l 1 -nd -nH -A pdf http://example.com
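For reference, the flags break down as follows (meanings per the GNU wget manual). The command is only echoed here rather than executed, since example.com is a placeholder:

```shell
# -r   / --recursive            follow links from the starting page
# -l 1 / --level=1              ...but descend only one level
# -nd  / --no-directories       do not recreate the remote directory tree locally
# -nH  / --no-host-directories  do not create a per-host directory
# -A pdf / --accept pdf         keep only files matching *.pdf
cmd='wget -r -l 1 -nd -nH -A pdf http://example.com'
echo "$cmd"
```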
UPDATE: Since your update says you are running Windows 7, use wget for Windows from a cmd prompt.
UPDATE 2: For a graphical solution, try DownThemAll, though it may be overkill since it downloads other file types too.
answered Mar 20, 2011 at 20:33 by Kevin Worthington

Thank you Kevin for your advice; wget looks good. Anyway, I would prefer 'graphic' software, not command line. :)
Commented Mar 20, 2011 at 21:09

This rejects even the initial .html page. Has it ever been tested? Commented Jan 21, 2015 at 18:28

The question asks about downloading all PDF links, so yes, the initial .html page will be ignored. Commented Jan 21, 2015 at 21:48

Is there a possibility to do the same thing in Windows 7 using PowerShell? Commented Jul 4, 2015 at 11:28

I would also suggest throwing in a delay of at least a few seconds between file downloads, so as to be nice and not overwhelm the remote server; e.g., for wget, add the flag -w 5.
Commented Jan 21, 2016 at 15:21

Copy and paste this, open a console, enter wget, press the right mouse button to insert your clipboard content, and press enter.
To use a download file, join the lines with "\n" and use the parameter as follows: wget -i mydownload.txt
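As a sketch of that workflow, assuming GNU wget (the URLs and the file name mydownload.txt are placeholders, and the actual wget call is left commented out so nothing is fetched here):

```shell
# Write one URL per line -- this is the "join the lines with \n" step.
printf '%s\n' \
  'http://example.com/one.pdf' \
  'http://example.com/two.pdf' > mydownload.txt

# -i / --input-file reads URLs from the file; -w 5 adds the polite
# 5-second delay between downloads suggested in the comments above.
# wget -i mydownload.txt -w 5

wc -l < mydownload.txt   # number of URLs queued
```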
Note that most other (GUI) download programs also accept being called with a space-separated list of URLs.
Hope this helps. This is how I generally do it. It is faster and more flexible than any extension with a graphical UI that I would have to learn and remain familiar with.