Q: Building a small machine room? Materials/costs/etc.

Robert G. Brown rgb at phy.duke.edu
Tue Sep 16 21:57:11 EDT 2003


On Tue, 16 Sep 2003, Brian Dobbins wrote:

> 
> Hi guys,
> 
>   Has anyone on this list been involved in the construction of a small
> 'machine room' for their clusters?  We're beginning to look at creating an
> enclosure with an attached AC unit and, possibly, a raised floor.
> 
>   I've started poking around the 'net a bit, but was interested in hearing 
> any experiences or tips from the people here.  Anything about materials to 
> use, people to talk to, hidden costs to watch out for, etc.  The overall 
> goal is to have a cool, clean, local systems 'room' for our computational 
> facilities.

There are a variety of things you'll want to consider, especially:

   a) power;

There have been some very extensive discussions on the list about power
gotchas associated with driving a large number of non-power-factor
corrected switching power supplies.  There are vendors out there who
sell server-room sized main transformers that compensate for the 180 Hz
harmonic line distortion that can occur.  You can also consider line
conditioning and UPS at the same time, depending on your needs and
budget.  You'll need to locate power poles or other receptacle sets
convenient to the racks.
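To make the power planning concrete, here is a back-of-the-envelope sketch. The node wattage and circuit sizes are illustrative assumptions, not measurements -- substitute your vendor's numbers:

```python
# Rough power budget for one rack of cluster nodes.
# All figures below are assumptions for illustration.

NODES_PER_RACK = 32
WATTS_PER_NODE = 150        # assumed draw under load, incl. PSU inefficiency
LINE_VOLTAGE = 120          # volts, US single-phase receptacle
CIRCUIT_AMPS = 20           # breaker rating
DERATE = 0.8                # continuous loads: plan for 80% of breaker rating

rack_watts = NODES_PER_RACK * WATTS_PER_NODE
rack_amps = rack_watts / LINE_VOLTAGE
usable_amps = CIRCUIT_AMPS * DERATE
circuits = -(-rack_amps // usable_amps)     # ceiling division

print(f"{rack_watts} W per rack -> {rack_amps:.1f} A @ {LINE_VOLTAGE} V")
print(f"needs {int(circuits)} x {CIRCUIT_AMPS} A circuits at 80% load")
```

Even at these modest assumed wattages a 32-node rack wants several dedicated 20 A circuits, which is why receptacle placement belongs in the plan up front.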

   b) capacity and airflow patterns of your air conditioning compared to
your power supply and expected loading;

What goes in must come out, basically, with enough spare capacity to
allow for future increases in rackmount power density and to keep the
room COOL, not just "not hot".
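"What goes in must come out" can be quantified: essentially every watt of electrical load becomes heat the AC must remove. The conversion factors below are standard; the total load and the headroom figure are assumptions for illustration:

```python
# Rough AC sizing from electrical load. Load and headroom are assumed.

WATTS_TOTAL = 4 * 4800      # e.g. four racks at an assumed 4.8 kW each
HEADROOM = 1.3              # 30% spare capacity for growth and hot spots

BTU_PER_WATT_HOUR = 3.412   # 1 W of load == 3.412 BTU/h of heat
BTU_PER_TON = 12000         # 1 "ton" of cooling == 12,000 BTU/h

btu_per_hour = WATTS_TOTAL * HEADROOM * BTU_PER_WATT_HOUR
tons = btu_per_hour / BTU_PER_TON

print(f"{btu_per_hour:,.0f} BTU/h -> about {tons:.1f} tons of cooling")
```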

   c) structural integrity; 

Loaded racks are >>heavy<< -- they can weigh 800 lbs or even more if
they contain UPS batteries, all on less than a square meter of floor
space.  Two-post racks are also often under considerable torque when
loaded, and need to be fastened carefully to the floor and/or fitted
with center-mount cases that balance the load.
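A quick floor-loading sanity check shows why this matters; the rack footprint below is an assumption, and you should get your building's actual floor rating from the architect:

```python
# Floor-loading sanity check. Weight and footprint are assumptions.

RACK_WEIGHT_LBS = 800       # loaded rack, per the estimate above
FOOTPRINT_SQFT = 6.5        # ~2 ft x 3.25 ft rack footprint (assumed)

load_psf = RACK_WEIGHT_LBS / FOOTPRINT_SQFT
print(f"{load_psf:.0f} lb/sq ft")
```

That concentrated load can exceed what an ordinary office floor is rated for, so check before you roll racks in.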

   d) sound and light;

Four full racks plus AC will sound like a small jet taxiing.  All the
time.  Tolerable for short stretches, but not a good place to work.
Besides, it's cold, right?  Having enough light, effectively located,
is also important if you are an Old Guy with eyes that get stressed by
reading little bitty etched print upside down in the dark.

   e) network;

Cable trays, rackspace or wall racks for network POPs and switches.  WAN
(organizational level) connections as well as the actual cluster LAN.

   f) security;

Lessee, a room with 160 or so $2000 boxes = $320K.  Locating the
space in a room with a door out to your dark and unattended loading
dock, however convenient it is for delivery, is ill-advised.  Locks,
booby traps, x10 video cams, alarms, and armed guards optional.
Find a comfort level here.

   g) comfort/convenience;

We find things like a communal jacket on the back of the cluster room
door, a workbench with tools (rechargeable screwdriver), a portable
KVM, maybe a laptop for serial port access (for certain motherboards),
and a comfy workchair or two to be useful.  Sound-excluding/noise-
reducing stereo headphones can be nifty (and are probably OSHA
approved).

>   As for specifics for size, that's unknown at the moment... perhaps, just 
> for the sake of discussion, say 15' by 25' (by however tall we need, at 
> least 8.5').  We'd like to plan for a capacity of up to 4 racks (w/ 32 
> nodes each), plus a few miscellaneous desksides lying around.

If you mean a few tower-mounted servers, that's great, but see notes
above about suitability as a real workspace.  It is typically loud,
cold, locked (or should be), and a bad place to take cups of coffee or
food.  So it's not a great place to locate humans for more than the
times required to work on node installation and maintenance.

>   (While we do have a central machine room on campus, our network is 
> isolated for security reasons, and additionally we tend to run GigE 
> between our newest machines -even some desktops- and we certainly can't 
> get that sort of throughput from the central machine room!  Thus, even 
> though this might be costly, it may be worth it in the end.)
> 
>   Thanks for any pointers whatsoever!  It's much appreciated!

Only two.  Other folks will probably add pointers to bigger/better
locations, but you can take a "walking tour in pictures" of our brahma
cluster at http://www.phy.duke.edu/brahma.  There you can also find a
book on engineering clusters and a few other documents that address
infrastructure.

Hope this helps.

Oh, and I'd strongly suggest getting a professional architect (one with
experience doing computer machine rooms) to do your plans.  And avoid
bozo electricians who try e.g. wiring three-phase circuits with a single
common neutral -- the triplen harmonics from switching power supplies
add in the shared neutral and can overload it.

   rgb

> 
>   Cheers,
>   - Brian
> 
> 
> Brian Dobbins
> Yale Mechanical Engineering
> ------------------------------------------------------------------
> "Be nice to other people.. they outnumber you six billion to one."
> 
> _______________________________________________
> Beowulf mailing list, Beowulf at beowulf.org
> To change your subscription (digest mode or unsubscribe) visit http://www.beowulf.org/mailman/listinfo/beowulf
> 

-- 
Robert G. Brown	                       http://www.phy.duke.edu/~rgb/
Duke University Dept. of Physics, Box 90305
Durham, N.C. 27708-0305
Phone: 1-919-660-2567  Fax: 919-660-2525     email:rgb at phy.duke.edu


