[Beowulf] AMD 8 cores vs 12 cores CPUs and Infiniband

Gilad Shainer Shainer at Mellanox.com
Tue Mar 29 18:24:19 EDT 2011


I have been using a single card on Magny-Cours with no issues at all. You can definitely go with the QDR adapters (the latest and greatest). Feel free to contact me directly if you need more info.

On the switch side, switches built according to the spec will auto-negotiate to the lower speed, so connecting a QDR port to a DDR port will cause the QDR port to drop to DDR, but it will work with no issues.

Gilad


-----Original Message-----
From: beowulf-bounces at beowulf.org [mailto:beowulf-bounces at beowulf.org] On Behalf Of Ramiro Alba
Sent: Tuesday, March 29, 2011 1:37 AM
To: beowulf at beowulf.org
Subject: [Beowulf] AMD 8 cores vs 12 cores CPUs and Infiniband

Hi all,

We currently have an IB DDR cluster of 128 AMD quad-core 'Barcelona' nodes (4 cores * 2 sockets) using DDR 'Flextronics' switches (http://www.ibswitches.com) with a 2:1 over-subscription (two 24-port switches per rack, linked through one 144-port modular 'Flextronics' fat-tree switch). Each node's IB card is a Mellanox DDR InfiniHost III Lx PCI Express x8 (one port), and we use IB both for computation (MPI) and for storage (Lustre: 80 TB using 1 MDS and 2 OSSs on a DDN 9900 unit).

Now we are planning to add new AMD 'Magny-Cours' nodes (16 or 24 cores each) using InfiniBand QDR, linked to the Flextronics 144-port DDR switch with 'Hybrid Passive Copper QSFP to MicroGiGaCN' cables so that we can reach the Lustre storage.

But there are two main issues that we are worried about:

1 - QP saturation on single-port IB cards

Using 16 cores per node (8 * 2) seems the 'safe' option, but 24 cores (12 * 2) is better in terms of price per job. Our CFD applications, which use MPI (Open MPI), may need to make about 15 'MPI_Allreduce' calls per second or more, and we will probably be using a pool of 1500 cores. Is anyone running at this kind of message rate on AMD 24-core nodes, and can you tell me about your solution/experience?
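As a rough sanity check on that target rate, one can compare it against a simple log-tree cost model for an allreduce. The sketch below is plain Python; all the numeric parameters (latency, bandwidth, message size) are illustrative assumptions, not measurements from this cluster:

```python
import math

def allreduce_time(procs, msg_bytes, latency_s, bandwidth_bps):
    """Rough log-tree model of one MPI_Allreduce:
    ceil(log2(P)) communication steps, each costing
    one network latency plus payload / bandwidth."""
    steps = math.ceil(math.log2(procs))
    return steps * (latency_s + msg_bytes / bandwidth_bps)

# Assumed figures (illustrative only): 1500 ranks, an 8-byte scalar
# reduction, ~2 us MPI latency over DDR IB, ~1.5 GB/s per link.
t = allreduce_time(1500, 8, 2e-6, 1.5e9)
print(f"one allreduce ~ {t * 1e6:.1f} us, model ceiling ~ {1 / t:.0f} calls/s")
```

Under these assumed numbers the model's ceiling is orders of magnitude above 15 calls per second, which suggests the real concern is not raw allreduce latency but per-node contention (many ranks sharing one HCA, QP memory footprint) — exactly the question being asked.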

2 - I've heard that QLogic adapters behave better in terms of QP creation. I also have to think about linking IB DDR with QDR to reach the Lustre storage. I suppose the main issue is which QDR switch or switches to link to the Flextronics 144-port DDR switch, but I do not know what role the node card (Mellanox/QLogic) plays. Again, can anyone tell me about your solution/experience?

Any comment or suggestion will be welcome.

Thanks in advance

Regards
--
Ramiro Alba

Centre Tecnològic de Transferència de Calor http://www.cttc.upc.edu


Escola Tècnica Superior d'Enginyeries
Industrial i Aeronàutica de Terrassa
Colom 11, E-08222, Terrassa, Barcelona, Spain
Tel: (+34) 93 739 86 46



--
This message has been scanned by MailScanner for viruses and other dangerous content, and is believed to be clean.

_______________________________________________
Beowulf mailing list, Beowulf at beowulf.org sponsored by Penguin Computing
To change your subscription (digest mode or unsubscribe) visit http://www.beowulf.org/mailman/listinfo/beowulf




