Quote:
Originally Posted by trevius
It isn't too uncommon for Data Centers to have heating issues during the first warm days of the year. In the building I work at, our server room has the same issue every year. Apparently there is a leak in the coolant and they just have to come back out and refill the coolant again. Normally it shouldn't last a week, but a few days probably isn't uncommon considering how busy AC guys probably are this time of year. The problem is that the low coolant still works fine when the weather isn't hot, because it still cools enough as long as the AC runs high. But, once it gets to a certain point, the AC can't keep it cool enough even if it runs around the clock.
This is just a guess about what is happening in the Data Center that PEQ is hosted in, but I think it is probably a similar case.
I presume by "coolant" that the server room at the company you work for uses portable units rather than a centralized system. The HVAC industry doesn't really use freon-based systems anymore; it's all peltier/radiator closed systems in the 'modern world'. And I disagree, I think that this sort of downtime at any type of host is rare. In the hosting industry, if you can't guarantee your customers 99.6% uptime, you end up with investors backing out and losing business to someone who CAN provide such a service. I've seen some pretty gnarly rigging in datacenters to cut corners (like huge box fans with furnace filters duct taped to both sides rather than a ventilation filter to keep dust down), but in all of my professional experience I have yet to see a datacenter that hasn't planned the server builds around the room design and power source usage.
Portable and window unit air conditioning systems are influenced by external temperature; you'll end up freezing the compressor if you run one during cold weather. You'll also bend the copper pipes in the unit if you operate it far out of humidity specification (an air conditioner doubles as a dehumidifier, so if humidity is low, there are problems).
A sufficient BTU centralized air conditioning system is easily approximated like this:
Floor area (sq ft) * 340 + humans (425 * permanent occupant count) + power/network equipment (wattage * 3.5) + lights (wattage * 4.25) = total BTU cooling capacity needed for an application
I didn't mention windows, because most datacenters don't have them. I'm sure a quick Google search could turn up how to factor them in though.
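The approximation above is easy to sketch in code. The multipliers (340, 425, 3.5, 4.25) are the ones from my formula; the sample room numbers in the example are made up for illustration, not from any real build:

```python
def required_btu(floor_area_sqft, occupants, equipment_watts, lighting_watts):
    """Approximate cooling capacity (BTU/hr) for a server room,
    using the rule-of-thumb multipliers from the post above.
    Windows are ignored, as in the original formula."""
    return (floor_area_sqft * 340       # floor area contribution
            + occupants * 425           # permanent human occupants
            + equipment_watts * 3.5     # power/network equipment load
            + lighting_watts * 4.25)    # lighting load

# Hypothetical room: 400 sq ft, 2 permanent occupants,
# 8 kW of rack equipment, 500 W of lighting
print(required_btu(400, 2, 8000, 500))  # prints 166975.0
```

So a modest room like that would want roughly a 167k BTU/hr system before any expansion headroom is added.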
Today's virtualized datacenters usually require around 3.8k BTU per pair of 24U cabinets, give or take 300, of direct-push cooling, since the heat is concentrated at the racks rather than equalized across the floor plan. MOST companies also build spare cooling capacity for expansion into the original design and have to adhere to those guidelines for insurance purposes.