13th January 2012, 09:49 AM #1
Download files from http server index.
Hi. On a webserver there is a large collection of files, all stored in a single folder, and the webserver indexes them. This is an example of what I mean (I just found this at random on Google): Index of /images/b
Is there a way I can download all of the files at once, without clicking and saving each one?
Hope that makes sense?
13th January 2012, 09:56 AM #2
Use wget in mirror (or recursive) mode.
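Flakes' suggestion, sketched out as a shell snippet. The URL is a placeholder for the real index page; the flags are standard wget recursive-retrieval options. The snippet only prints the command so you can inspect it first:

```shell
#!/bin/sh
# Grab every file listed under one directory of a server's auto-index.
# URL is a placeholder; substitute the index page you actually want.
URL="http://example.com/images/b/"

# -r                recurse into links found on the page
# -np               "no parent": never ascend above /images/b/
# -nH               don't create a hostname directory locally
# --cut-dirs=1      also drop the leading "images/" path component
# -R "index.html*"  skip the auto-generated index pages themselves
CMD="wget -r -np -nH --cut-dirs=1 -R index.html* $URL"

echo "$CMD"   # inspect the command; run it yourself when it looks right
```

With `--mirror` instead of `-r` you also get timestamping, so re-running the command picks up only new or changed files.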
13th January 2012, 09:57 AM #3
Erm, if I remember correctly there's a Firefox addon, I think it's called DownThemAll, that will let you download all the links on a website.
Thanks to Flakes from:
FN-GM (15th January 2012)
13th January 2012, 09:57 AM #4
*edit: actually, scrap that. DownThemAll looks awesome! Just right-click on the page and select DownThemAll, then there's just a tick box for JPEG.
Last edited by Pyroman; 13th January 2012 at 10:02 AM.
13th January 2012, 02:06 PM #5
+1 for DownThemAll. It will also do video files, document files and so on.
13th January 2012, 08:32 PM #6
If you have a Linux box running, you can use wget or an rsync script.
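Worth noting: rsync needs shell (usually SSH) access to the server, unlike wget, which works over plain HTTP. A minimal sketch, where the user, host, and remote path are all placeholders; again the command is only echoed so nothing runs by accident:

```shell
#!/bin/sh
# Sync a remote directory to a local one over SSH.
# SRC is a placeholder; you need a shell account on the webserver.
SRC="user@webserver:/var/www/images/b/"
DEST="./images-b/"

# -a  archive mode (recurse, keep permissions and times)
# -v  verbose
# -z  compress in transit
CMD="rsync -avz $SRC $DEST"

echo "$CMD"   # inspect, then run manually
```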
15th January 2012, 03:26 AM #7
Originally Posted by Flakes
This looks to be the ideal thing!
15th January 2012, 10:59 AM #8
IIRC both wget and curl are available for Windows too...
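For single files, either tool works the same way on the Windows builds as on Linux. One caveat: curl has no recursive mode, so it suits individual files rather than whole index pages. The URL below is a placeholder:

```shell
#!/bin/sh
# Single-file downloads; FILE_URL is a placeholder.
FILE_URL="http://example.com/images/b/photo.jpg"

WGET_CMD="wget $FILE_URL"       # wget saves under the remote file name
CURL_CMD="curl -O $FILE_URL"    # curl needs -O to keep the remote name

echo "$WGET_CMD"
echo "$CURL_CMD"
```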