[Beowulf] many cores and ib

Jan Heichler jan.heichler at gmx.net
Tue May 6 02:43:53 EDT 2008

Hello Gilad,

On Monday, 5 May 2008, you wrote:

>> Bonding (or multi-rail) does not make sense with "standard IB" in PCIe
>> x8 since the PCIe connection limits the transfer rate of a single
>> IB-Link already.

>> My hint would be to go for Infinipath from QLogic or the new ConnectX
>> from Mellanox since message rate is probably your limiting factor and
>> those technologies have a huge advantage over standard Infiniband.

>> Infinipath and ConnectX are available as DDR Infiniband and provide a
>> bandwidth of more than 1800 MB/s.

GS> Bonding can provide more bandwidth if needed. Each PCIe x8 slot can
GS> provide (on average) around 1500MB/s, therefore using IB DDR (no need
GS> for ConnectX), you will get 1500MB/s uni-dir from each PCIe Gen1 x8 slot.
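The figures quoted above can be sanity-checked with some back-of-the-envelope arithmetic. This is a sketch, not from the original mail; it assumes PCIe Gen1 signals at 2.5 GT/s per lane with 8b/10b encoding, IB DDR 4x signals at 5 Gb/s per lane (also 8b/10b), and a rough 25% allowance for PCIe protocol overhead:

```python
# Back-of-the-envelope check of the bandwidth figures in the thread.
# Assumptions (standard spec values, not stated in the mail):
#   - PCIe Gen1: 2.5 GT/s per lane, 8b/10b encoding
#   - IB DDR 4x: 4 lanes at 5 Gb/s, 8b/10b encoding
#   - ~25% PCIe protocol overhead (TLP headers, flow control) -- a
#     rough estimate chosen to match the ~1500 MB/s practical figure

ENCODING = 8 / 10                                # 8b/10b: 8 data bits per 10 line bits

# Raw payload ceiling of a PCIe Gen1 x8 slot, in MB/s
pcie_raw = 2.5e9 * ENCODING * 8 / 8 / 1e6        # 8 lanes, bits -> bytes
pcie_effective = pcie_raw * 0.75                 # after protocol overhead

# IB DDR 4x data rate, in MB/s
ib_ddr_data = 4 * 5e9 * ENCODING / 8 / 1e6

print(f"PCIe Gen1 x8 raw:       {pcie_raw:.0f} MB/s")       # 2000 MB/s
print(f"PCIe Gen1 x8 effective: {pcie_effective:.0f} MB/s") # ~1500 MB/s
print(f"IB DDR 4x data rate:    {ib_ddr_data:.0f} MB/s")    # 2000 MB/s
```

Since the IB DDR link can carry more data than a Gen1 x8 slot can feed it, the slot is the bottleneck, which is why bonding two ports behind one x8 slot buys nothing, while separate cards in separate slots do scale.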

Ahh... sorry... I was thinking of dual-port cards, not different cards in different slots.


Beowulf mailing list, Beowulf at beowulf.org
To change your subscription (digest mode or unsubscribe) visit http://www.beowulf.org/mailman/listinfo/beowulf

