[Beowulf] Really efficient MPIs??

Peter St. John peter.st.john at gmail.com
Wed Nov 28 08:34:40 EST 2007


Because my target application is easy to distribute, and also tries to
optimize its own operating environment (by fiddling with its own
parameters), I'm thinking about using MPI for the case where a node wants to
hand a job to a specific remote node (e.g., an underutilized node, or one that
already has the relevant statistics loaded locally), and OpenMP for when the app
doesn't know which other node should do the job. There's a bottleneck in my
app where I'd just have two calls, one for spawning in each mode. I
was thinking that OpenMP might be smarter about taking advantage of nearby
cores on a CPU, while my app might be smarter about taking advantage of the
current environment of a CPU, or could maybe learn to be.

But I'm a long way off still, this is all hypothetical.

Peter

On 11/28/07, amjad ali <amjad11 at gmail.com> wrote:
>
> Hello,
>
> Today, clusters with multicore nodes are quite common, and the cores
> within a node share memory.
>
> Which implementations of MPI (commercial or free) make automatic and
> efficient use of shared memory for message passing within a node? (That
> is, which MPI libraries automatically communicate over shared memory,
> instead of the interconnect, between processes on the same node?)
>
> regards,
> Ali.
>
> _______________________________________________
> Beowulf mailing list, Beowulf at beowulf.org
> To change your subscription (digest mode or unsubscribe) visit
> http://www.beowulf.org/mailman/listinfo/beowulf
>
>


_______________________________________________
Beowulf mailing list, Beowulf at beowulf.org
To change your subscription (digest mode or unsubscribe) visit http://www.beowulf.org/mailman/listinfo/beowulf

