torledo (29th April 2009)
If SMT won't spring for A/C up front, insist on some decent air handling fans. In the UK it is seldom over 30C outside, so pulling fresh air in will help keep temps under control. These (two for redundancy) would need sensors to detect failure, but that is trivial to achieve.
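The failure detection mentioned above really is trivial. A minimal sketch, assuming a Linux monitoring box exposing the fans' tach readings via the lm-sensors/hwmon sysfs interface (the paths and RPM threshold below are assumptions for your own hardware):

```python
# Sketch: detect a failed air-handling fan by polling its tach reading.
# The sysfs paths and MIN_RPM threshold are assumed values -- adjust
# them for your own fans and monitoring host.
from pathlib import Path

MIN_RPM = 500  # below this we assume the fan has stalled (assumed value)

def read_rpm(path):
    """Read a fan tach value from a sysfs hwmon file; None if unreadable."""
    try:
        return int(Path(path).read_text().strip())
    except (OSError, ValueError):
        return None

def fan_failed(rpm, min_rpm=MIN_RPM):
    """A missing or below-threshold reading counts as a failure."""
    return rpm is None or rpm < min_rpm

if __name__ == "__main__":
    # Two fans for redundancy, as suggested above.
    for fan in ("/sys/class/hwmon/hwmon0/fan1_input",
                "/sys/class/hwmon/hwmon0/fan2_input"):
        if fan_failed(read_rpm(fan)):
            print(f"ALERT: fan {fan} appears to have failed")
```

Run it from cron every few minutes and pipe the alert into email or Nagios.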
torledo (29th April 2009)
No, there's a bunch of UPSs keeping the device side of the system live; the boiler room side is dead as a dodo. A boiler room has to have (by law) an electrical isolator switch, but even if you configured this to also cut power to the server room, there's a whole bunch of UPSs in there keeping the lot live.
Last edited by powdarrmonkey; 29th April 2009 at 12:29 PM.
The floor is being raised because of a large concrete plinth that is too difficult to remove, so the servers will sit at a much greater height than the boiler room. Hopefully, if the water pipes do burst, we'll have some time to respond before the room floods completely!
We fitted our server room out with open racks, which we purchased from Cablenet. They were good quality, and Cablenet could also supply cable trays and rings as required. They also supplied a rack-mounted TFT screen with a built-in 16-port KVM.
We use a largish APC UPS with extra batteries added to give an extended run time, plus a network interface and environmental monitoring. It was oversized when we purchased it, but additional servers have been fitted unexpectedly this year, and any further servers will need to go on a second unit.
The servers are connected to the UPS through an APC Switched PDU. This means we can control the start-up sequence, as we have a server that appears to be reliant on another being up. It can also be used to perform a hard reset without a visit to the server room.
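The switched PDU trick above can also be scripted over SNMP. A hedged sketch: the OID shown is what I understand to be APC's PowerNet-MIB `sPDUOutletCtl` (value 1 = outlet on), and the hostname, community string, and outlet order are placeholders; verify all of them against your own PDU's MIB before trusting this.

```python
# Sketch: power up PDU outlets one at a time so a dependent server
# only starts after the one it relies on. Builds snmpset commands;
# pass a runner (e.g. subprocess) to actually execute them.
import time

PDU_HOST = "pdu.example.local"  # hypothetical hostname
# sPDUOutletCtl from APC's PowerNet-MIB (assumed OID -- check your MIB)
OUTLET_CTL_OID = ".1.3.6.1.4.1.318.1.1.4.4.2.1.3"

def power_on_sequence(outlets, delay=30, run=None):
    """Build (and optionally execute) snmpset commands that switch
    outlets on in order, waiting `delay` seconds between each."""
    cmds = []
    for outlet in outlets:
        cmd = (f"snmpset -v1 -c private {PDU_HOST} "
               f"{OUTLET_CTL_OID}.{outlet} i 1")
        cmds.append(cmd)
        if run:
            run(cmd)
            time.sleep(delay)
    return cmds

# Dry run: the server others depend on is on outlet 1, dependent on 2.
for c in power_on_sequence([1, 2]):
    print(c)
```

With `run=None` this is a harmless dry run that just prints the commands, which is a sensible first step before pointing it at live kit.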
The room has AC which is massively oversized, but it did not add too much to the price to go for the larger unit. It is actually a heat pump, so it could be used for heating (not that we want that); because of this, there is a subsidy available for fitting one.
We have a Lascar EL-USB-RT: this logs temperature and humidity, sends an email alert if the room goes out of range, and cost about £35.
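The Lascar unit ships with its own alerting software, but the same out-of-range-then-email idea is easy to hand-roll for any sensor you can read. A generic sketch, where the limits, SMTP host, and addresses are all placeholders:

```python
# Sketch: email an alert when a temperature/humidity reading drifts
# out of range. Limits and addresses below are assumed placeholders.
import smtplib
from email.message import EmailMessage

TEMP_RANGE = (15.0, 27.0)      # degrees C (assumed limits)
HUMIDITY_RANGE = (30.0, 60.0)  # % RH (assumed limits)

def out_of_range(value, limits):
    low, high = limits
    return value < low or value > high

def build_alert(temp, humidity):
    """Return an alert message if either reading is out of range, else None."""
    problems = []
    if out_of_range(temp, TEMP_RANGE):
        problems.append(f"temperature {temp:.1f}C outside {TEMP_RANGE}")
    if out_of_range(humidity, HUMIDITY_RANGE):
        problems.append(f"humidity {humidity:.1f}% outside {HUMIDITY_RANGE}")
    if not problems:
        return None
    msg = EmailMessage()
    msg["Subject"] = "Server room environment alert"
    msg["From"] = "serverroom@example.sch.uk"  # placeholder address
    msg["To"] = "netadmin@example.sch.uk"      # placeholder address
    msg.set_content("; ".join(problems))
    return msg

def send(msg, host="mail.example.sch.uk"):  # placeholder SMTP host
    with smtplib.SMTP(host) as s:
        s.send_message(msg)
```

Call `build_alert()` from a polling loop and only `send()` when it returns a message, so a room that stays in range generates no mail.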
Finally, if all else fails, there is a temperature sensor that cuts the power if the room gets really hot. The UPS will then start a safe shutdown of the room.
Lol, they fresh-aired my server room when I originally managed to get it built. It took 5 years to get it built, so I didn't kick up a hissy fit over air con. It took 5 minutes to melt the entire world, though...
Biggest mistake ever. Four portable air con units later, I am still pushing for reliable, powerful, permanent air con... *sigh*
Go for £2k worth of air con now IMHO.
Spring for air con now rather than later: ambient air cooling is insufficient, and fresh-air cooling adds further risks over winter, especially if you shut servers down over the Christmas hols etc., unless you can completely close the vents to outdoors...
If SMT are against this, there is plenty of research online to back up the need. Find a way to hint to them that they're being blatantly ignorant by skimping on it, without being as blunt as I tend to be when I deal with them.
Enviro monitoring on the cheap: http://www.t-balancer.com/english/bng.htm
Analogue sensors can be positioned to monitor ambient temps, internal case temps, etc., and record them to CSV files. It can also be used to trigger 12v devices on/off (e.g. additional case fans, or solenoids for additional 240v rack fans), can report it all back to MBM5 or similar free mobo-monitoring software, and can be used to trigger shutdowns based on the temps of any sensor attached to it that you specify.
Last edited by Marci; 29th April 2009 at 07:00 PM.
I am curious as to the risks of fresh-air cooling: how cold does a room get with fresh-air vents before it affects the health of machines that are in an off state? The fresh-air system we have has pretty basic controls to regulate flow at the supply and return vent ends... presumably you can get systems with more advanced controls?
Humidity and contaminants (bugs, dirt, stray bears depending on vent size, etc.) are the big concerns: when the gear is left off, moisture can easily build up inside the machines, which causes huge messes when the power is pushed back through. Experiments have been carried out by the big-box manufacturers, notably MS with its server-rack-in-a-tent experiment, but this puts extra strain on the equipment and is still not recommended practice. Additionally, the extra contaminants can increase the rate of oxidisation of the metals inside the server, which can lead to other issues.
If I turn my air con off, the server room goes from 20 to 30 degrees in 20 minutes.
Oh, and carpets are a bad idea; they host dust, moisture and static quite effectively. Linoleum is the answer, or tile, or painted concrete; these are also less of a fire hazard.
The more pressing concern is the moisture. Servers and components contain semi-enclosed pockets of air that are somewhat insulated and so heat and cool at differing rates, which means condensation can form inside the systems. This in turn can rain down on components along with contaminants, and can even form inside hard drives and under CPUs. It is made more dangerous in an environment that can reach sub-zero temperatures, as water expands when it freezes and, if lodged under or inside components, can do damage.
Living things also like the slightly warmer indoor environment and so you need to be careful that your equipment does not become an impromptu nest for a wayward colony of insects or worse.
Last edited by SYNACK; 29th April 2009 at 10:00 PM.