use wget mirror (or recursive)
Code:
wget -m <URL of the "Index of /images/b" page>
On a webserver there is a large collection of files. They are just stored in a folder, and the webserver indexes them. This is an example of what I mean (I just found this at random on Google): Index of /images/b
Is there a way I can download all of the files at once, without clicking and saving each one?
Hope that makes sense?
Erm, if I remember correctly there's a Firefox addon, I think it's called DownThemAll, that will let you download all the links on a website.
*edit: actually, scrap that, DownThemAll looks awesome! Just right-click on the page, select DownThemAll, and there's a tick box for JPEG.
Last edited by Pyroman; 13th January 2012 at 10:02 AM.
+1 for DownThemAll. It will also do video files, document files and so on.
If you have a Linux box running you can use wget or an rsync script.
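A rough sketch of the wget route. The URL here is a placeholder (example.com is made up, substitute the real "Index of /images/b" address), and the file extensions are just a guess at what you want to keep:

```shell
# Placeholder URL; substitute the real directory-index page.
URL="http://example.com/images/b/"

# -r   recurse into the links on the index page
# -np  never ascend to the parent directory (stay inside /images/b/)
# -nd  dump all files into the current directory (no server path tree)
# -A   only keep files with these extensions
CMD="wget -r -np -nd -A jpg,jpeg,png $URL"

echo "$CMD"   # prints the command; uncomment the next line to run it
# $CMD
```

The plain `wget -m` from the earlier post works too (`-m` is shorthand for `-r -N -l inf --no-remove-listing`), but adding `-np` and `-A` stops it wandering off and mirroring the whole site.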
IIRC both wget and curl are available for Windows too...
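Worth noting that curl doesn't crawl links the way `wget -r` does, but if the filenames follow a numeric pattern, curl's URL globbing can grab the lot. A sketch, with the host and filename range entirely made up:

```shell
# Hypothetical pattern: curl expands [001-100] itself, and -O saves
# each file under its remote name.
CURL_CMD='curl -O "http://example.com/images/b/img[001-100].jpg"'

echo "$CURL_CMD"   # prints the command; run it yourself once the URL is real
```

If the names aren't sequential, stick with wget or DownThemAll, which follow the links on the index page instead.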