Advice on Cluster Hardware.

Luc Renambot renambot at cs.vu.nl
Tue Feb 5 13:24:17 EST 2002


Hi,
If you want 2 NICs, why not the Tyan Thunder K7 motherboard,
which already includes them on-board? I built and use a 9-node cluster
based on that motherboard (without SCSI), with dual Athlon 1500+ CPUs,
a GeForce3, and a very cheap switch (3Com, 16 ports).
It works nicely with RedHat 7.2 and a little help from IBM's xCAT
software to install and configure the cluster (it automatically installs
Linux with kickstart, plus SSH, MPI, Myrinet GM, ...)
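For readers unfamiliar with kickstart: it lets the RedHat installer read all its answers from a ks.cfg file instead of prompting, which is what makes unattended node installs possible. A minimal sketch is below; every value (server name, paths, password, partition sizes, package group) is an illustrative placeholder, not the actual xCAT-generated configuration.

```text
# ks.cfg -- hypothetical minimal kickstart file (RedHat 7.2 era)
install
nfs --server master --dir /install/redhat72
lang en_US
keyboard us
network --bootproto dhcp
rootpw changeme
timezone Europe/Amsterdam
bootloader --location=mbr
zerombr yes
clearpart --all
part / --size 4096
part swap --size 512
%packages
@ Workstation
%post
# post-install hook: e.g. copy SSH keys, install MPI
```

xCAT's role is essentially to generate files like this per node and drive the network boot, so the whole cluster installs itself.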

Luc.
renambot at cs.vu.nl

> -----Original Message-----
> From: beowulf-admin at beowulf.org 
> [mailto:beowulf-admin at beowulf.org] On Behalf Of Alberto Ramos
> Sent: Tuesday, February 05, 2002 6:35 PM
> To: Beowulf mailing list
> Subject: Advice on Cluster Hardware.
> 
> 
> 
>   Here at a university in Madrid, we are designing a Beowulf 
> for parallel
> computing in QCD. We will begin with a small cluster to evaluate 
> the performance
> and later try other interconnects, such as Myrinet.
> 
>   The Hardware will be 4 nodes each consisting of:
>   
>   - 2x AMD Athlon MP 1800+ CPUs
>   - 1x 512 MB DDR RAM
>   - 1x Tyan Tiger MP motherboard
>   - 2x 3Com 905B NICs
>   - 20 GB HD
>   
>   The master node has 1 GB of DDR RAM, and one additional HD to 
> use as /home.
>   
>   The connection will be through an HP ProCurve Switch 408 with 
> 8 ports, to use
> channel bonding.
> 
>   Now the questions:
>   
>   - Any known problems with the hardware?
>   - Are the NIC and the switch good choices?
>   - Will the Intel Fortran 90 compiler work with this hardware?
>   
>   Thank you very much for your time.
>   
>   Alberto.
>   
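On the channel bonding Alberto mentions: under 2.4-era kernels this is done with the kernel `bonding` module and the `ifenslave` tool. A sketch of the commands, run as root on each node; the interface names and addresses here are purely illustrative placeholders, not taken from the original post.

```shell
# Load the bonding driver; mode=0 is round-robin, which stripes
# packets across both links for extra aggregate throughput.
modprobe bonding mode=0

# Bring up the virtual bond0 interface with the node's address
# (address and netmask are placeholders).
ifconfig bond0 192.168.1.10 netmask 255.255.255.0 up

# Enslave both 3Com NICs to bond0.
ifenslave bond0 eth0 eth1
```

One caveat worth testing: round-robin bonding through a single unmanaged switch can confuse the switch's MAC address tables, since the same source MAC appears on two ports. The classic Beowulf channel-bonding setups used a separate switch (or hub) per channel, so it is worth comparing both wirings before committing to the 8-port layout.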

_______________________________________________
Beowulf mailing list, Beowulf at beowulf.org
To change your subscription (digest mode or unsubscribe) visit http://www.beowulf.org/mailman/listinfo/beowulf



More information about the Beowulf mailing list