Old curriculum machine + linux?
Anyone have any recommended local web servers? One that will run stuff like Apache, PHP, MySQL? Looking for a cheap/low-power option. I've looked at a couple of different options, including a Synology NAS, which seems like it may be the easiest at this point versus something like those plug computers such as the GuruPlug Server. The machine doesn't have to be fast, just work reliably.
Apache on a headless Linux system (i.e. a system with no desktop, just a terminal) can happily serve up a school website on something like 512MB RAM, a 20GB drive, a 1Gb NIC and a 2GHz CPU.
Ultimately it depends exactly what you want from the system, but since Linux keeps everything in flat files (as maybe you already know), if the machine dies you can just put the HDD in another machine and start it up. It will work first time. If you want a TRULY dedicated system then get the cheapest 1U server you can with dual 1Gb NICs and a RAID setup (frankly anything more than 1GB is enough for a web server, so don't blow the budget on drives), a dual-core or dual-CPU setup with 2GB RAM, and whack it on a UPS. Job done.
Our public site was, at one point, sitting on a 2-generation-old (6-7 years) workstation which the NM then dropped, while it was running, from about 4ft. It survived the fall without even a blip in uptime. We've since moved it to a little 1U server in the server room so we don't drop it again! The most downtime we had was when we had a power failure and the disks got mounted read-only. Took a couple of days but she's back now. I've also got another 1U 5-year-old server sitting behind me running our internal "mess about" Apache server. Can you tell me and the boss are Apache geeks?
ISPConfig running on the above sort of hardware.
Shadowx, that's helpful. Do you know the easiest way to get a stack of Apache, PHP and MySQL running on Linux? I will be training foreign IT professionals and their competencies probably won't include Linux-based web server management. I'm looking for something as simple to install and run as XAMPP on a Mac (that's the ideal case, anyway).
I will also look into ISPconfig. Good suggestion!
dhicks, I actually have a Raspberry Pi around the lab here and was also thinking about using that. Do you think the RAM is enough to serve only around 40 clients? What do you think?
Thanks for all the quick responses!
We just started up another Hyper-V VM for our web server (Server 2008 R2); it runs SharePoint, Home Access Plus+, various WordPress sites and a few custom-built Silverlight/SQL sites we have made in Visual Studio LightSwitch.
SharePoint is the real hog, but everything else (keeping in mind it's PHP and MySQL based) all ticks over with very little resource usage.
I don't know if you have used *nix before so I will assume you haven't; just skip over this lot if you know what you're doing! Essentially I would set it up to keep all the important folders separate, so you will need something like the following:
ROOT partition (like the C: drive)
Mount point: /
Size: anywhere from 10-20GB, or smaller if space is short

Swap partition (like a pagefile)
Mount point: N/A
Type: swap
Size: equal to or double your RAM size

Home partition (like My Documents)
Mount point: /home
Size: 5-10GB, largely irrelevant so smaller if space is short. It won't get used on a server anyway

Var partition (variable data - will hold the website files etc., so this needs to be big)
Mount point: /var
Size: everything left on the disk!
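For reference, a layout like the one above would end up looking roughly like this in /etc/fstab. This is only a sketch: the device names (/dev/sda1 etc.) and filesystem types are assumptions, so check what your installer actually created with something like "sudo fdisk -l".

```
# /etc/fstab sketch matching the partition layout above.
# Device names and filesystem types are assumptions - verify yours.
# <device>   <mount>  <type>  <options>          <dump> <pass>
/dev/sda1    /        ext4    errors=remount-ro  0      1
/dev/sda2    none     swap    sw                 0      0
/dev/sda3    /home    ext4    defaults           0      2
/dev/sda4    /var     ext4    defaults           0      2
```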
In terms of how to carry out the partitioning, try this guide: Manual disk partitioning guide for Linux Mint 11. It will look slightly different but it's the same interface behind the theme.

Essentially, by setting the VAR partition separately, if the ROOT partition gets corrupted for some reason, or if you want to update the server install, your VAR partition is completely isolated. You can just unmount it and do whatever you like: copy it off to a new server, update the distro, then remount it, and your data is safe. If you want an on-disk backup you can just add another partition in there mounted at "/backup" and give it 20GB or so.

Primary/Logical (IIRC) refers to a "physical" partition compared to a logical one, so for the important areas like ROOT and VAR I go Primary; everything else can go Logical. The SWAP partition will act as your pagefile, temporarily taking lesser-used data from RAM if needed. HOME is like My Documents and is largely irrelevant; you want some space there for tinkering but not a whole lot in the grand scheme of things. It's just a useful place to act as a drop box for bringing in updates etc.
I can't remember if the Ubuntu server installer gives you a GUI for networking; if not then try this guide for doing it on the command line: Ubuntu Networking Configuration Using Command Line | Ubuntu Geek
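If you end up doing it by hand, the command-line route on Debian/Ubuntu of that era means editing /etc/network/interfaces. A static-IP sketch might look like the below; the addresses are placeholders for whatever your network uses, and the dns-nameservers line needs the resolvconf package on some releases.

```
# /etc/network/interfaces - static IP sketch (addresses are assumptions)
auto eth0
iface eth0 inet static
    address 192.168.1.10
    netmask 255.255.255.0
    gateway 192.168.1.1
    dns-nameservers 192.168.1.1
```

After saving, restart networking (e.g. "sudo /etc/init.d/networking restart") for it to take effect.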
Once you get the server installed you want to get the Apache server in, then PHP5 and probably PHP5-CLI (the command-line interface, useful for cron jobs later, I think) and then SQL. This guide should talk you through that nicely: Installing LAMP (Apache Web Server/PHP/MySQL) in Debian/Ubuntu – Bin-Blog
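The whole LAMP install boils down to a handful of apt commands. These package names are from the Debian/Ubuntu releases of this era (2012-13); on newer releases the PHP packages are php7.x/php8.x instead of php5, so treat this as a sketch rather than gospel.

```shell
# Install the LAMP stack on Debian/Ubuntu (package names circa 2012-13)
sudo apt-get update
sudo apt-get install apache2                          # the web server
sudo apt-get install php5 libapache2-mod-php5 php5-cli # PHP plus the CLI
sudo apt-get install mysql-server php5-mysql           # MySQL and the PHP bindings
sudo service apache2 restart                           # pick up the new modules
```

Browsing to http://your-server/ should then show the default "It works!" page.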
Once that lot is up and running you should have a webserver ready to go. The only things after that are configuring updates (which I think by default notify you whenever you log in but don't download until you give the command) and getting used to the command line. It doesn't take long, I promise.
Once you have the webserver running you should also install Webmin, a web interface for the server. Since Linux is all flat-file configs (no registry etc.) Webmin lets you do really in-depth things: change network configs, change file shares, restart services like Apache, reconfigure things like PHP and Apache, and loads more, all from a webpage. Probably more importantly, it lets you check for and install updates without using the terminal. Linky: Webmin
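One way to get Webmin on Debian/Ubuntu is via its apt repository. The repo line and key URL below are from memory of that era, so double-check the current instructions on webmin.com before running this.

```shell
# Add the Webmin apt repository (details as of ~2013 - verify on webmin.com)
echo "deb http://download.webmin.com/download/repository sarge contrib" | \
    sudo tee /etc/apt/sources.list.d/webmin.list
wget -qO - http://www.webmin.com/jcameron-key.asc | sudo apt-key add -
sudo apt-get update
sudo apt-get install webmin
# Then browse to https://your-server:10000/ and log in with a root-capable account
```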
Try going through that on an old machine and see what works best for your situation. As has been said, that little setup would run on under 1GB of RAM and a 2GHz processor without issues. Apache is pretty rock solid when it comes to security, but always be sensible: put it behind a firewall/in a DMZ and keep the updates fairly current. Linux has a utility called iptables which does what it says on the tin. It blocks or allows certain ports (or all ports) for certain IPs (or all IPs), allowing you to, for example, only allow port 80 traffic and block anything else. Linky: iptables - Debian Wiki
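A minimal iptables ruleset for a web server might look like the below: allow established connections, loopback, SSH and HTTP, then drop everything else inbound. These need root, and if you administer the box remotely make sure the SSH rule is in place before you set the DROP policy, or you'll lock yourself out.

```shell
# Minimal inbound firewall sketch for a LAMP box (run as root)
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT  # replies to outbound traffic
iptables -A INPUT -i lo -j ACCEPT                                 # loopback
iptables -A INPUT -p tcp --dport 22 -j ACCEPT                     # SSH for admin
iptables -A INPUT -p tcp --dport 80 -j ACCEPT                     # HTTP
iptables -P INPUT DROP                                            # drop anything else inbound
```

Note these rules don't survive a reboot on their own; on Debian/Ubuntu people typically save them with a package like iptables-persistent or restore them from a script at boot.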
IMHO any other web server like IIS is a child's toy and opens you up to a world of pain! Give it a play and if you get stuck PM one of the linux geeks here or make a forum post. As with everything do a bit of research and testing, I'm not saying my way is right but it works for me!
EDIT: Seeing what dhicks has said about the Pi, I would be tempted to test that setup. You can use the same instructions as I have posted here since you'd be plugging it into a hard drive anyway, but as @dhicks said, you could actually set the server up for them, get them to buy a hard drive (or send your data over encrypted, or something similar) and then you can keep a standardised setup. Ubuntu also offers a paid system called Landscape: http://www.ubuntu.com/business/syste...ement/features which gives you the ability to remotely control updates and configs across the net, so if you wanted you could even retain complete control over the operating system and let your other guys control their own website data.
Last edited by shadowx; 28th January 2013 at 11:04 AM.
Thanks! I have a Pi set up and running currently. I need to run some stress tests on it to see what it can handle. It was pretty easy to get running. I think this will be a great solution to offer! Thanks for the thorough directions. This is great.
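For the stress tests, ApacheBench (ab, which ships in the apache2-utils package) is a simple way to simulate your ~40 concurrent clients. The URL below is a placeholder for your Pi's address.

```shell
# Install ApacheBench and hammer the Pi with 40 concurrent clients
sudo apt-get install apache2-utils
ab -n 1000 -c 40 http://192.168.1.10/
# In the output, watch "Requests per second", "Failed requests"
# and the percentile response times to see how the Pi copes.
```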
With that being said, one of the things that would be really ideal is also have an out of the box alternative solution. The organization I work for is a not-for-profit and we work with a lot of local ICT professionals in developing countries in rural areas that may not have much linux experience. The issue with stuff like the Pi that makes me nervous for some of the schools we work in is the ability to troubleshoot locally if the hardware changes or there needs to be any deviation from the directions I would initially provide for setup.
Some folks we work with are great with this stuff, but others with less experience can find stuff like the Pi intimidating.
Set up a golden image and clone the sd cards?
Then you can be sure they'll be set up properly and the instructions will be able to be followed.
The Pi hardware won't be revised until a new model is released (currently there are Model A and Model B options); it's part of the concept.
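The golden-image approach is basically dd in both directions. A sketch is below: /dev/mmcblk0 is an assumed device name for the SD card reader, so substitute your own (and be very careful with dd targets). The demo here runs the same commands against small dummy files so it's safe to try anywhere.

```shell
# Golden-image cloning sketch. On a real card the device would be
# something like /dev/mmcblk0 - the demo uses dummy files instead.

# 1) Take the golden image from a configured card
#    (real card: sudo dd if=/dev/mmcblk0 of=golden.img bs=4M)
dd if=/dev/zero of=golden.img bs=1M count=4 2>/dev/null

# 2) Write it to each new card
#    (real card: sudo dd if=golden.img of=/dev/mmcblk0 bs=4M)
dd if=golden.img of=clone.img bs=1M 2>/dev/null

# 3) Verify the clone matches the golden image byte-for-byte
cmp golden.img clone.img && echo "clone verified"
```

Verifying with cmp (or checksums) after writing is worth the extra minute; SD cards fail silently more often than hard drives do.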
I don't know exactly what project you're working on, but if it's something that needs a web server I'd have thought any configuration / setup could be done in a web interface, letting users access it through any device with a web browser.