Select News

The news in this category has been selected because we thought it would be interesting to hard-core cluster geeks. Of course, you don't have to be a cluster geek to read the news stories.

From the Pass the Messages Please department

The Open MPI team has been working hard on the version 2 release, and it is here! While Cluster Monkey usually does not track software releases (maybe we should!), this is a significant upgrade to the venerable Open MPI project. The most important aspect of the new release is that it is not ABI compatible with the v1.10 series. That means applications built against v1.10.x will not run with v2.0 of Open MPI; they will need to be recompiled using v2.0.
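Because the ABI break means relinking is not enough, it can be worth confirming at runtime which Open MPI a recompiled application actually picks up. Here is a minimal sketch using mpi4py (which shows up again later in this roundup); the library identification string reported by MPI includes the Open MPI release number:

    from mpi4py import MPI

    # Report the MPI standard level and the library identification string.
    # Open MPI embeds its release number (e.g. "Open MPI v2.0.0") in this
    # string, so a recompiled application can confirm it is no longer
    # picking up an old v1.10.x installation.
    major, minor = MPI.Get_version()
    print("MPI standard version: %d.%d" % (major, minor))
    print(MPI.Get_library_version())

Run it with something like "mpiexec -n 1 python check_mpi.py" against each installed MPI stack to see which library you are really using.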

The Open MPI v2.0 announcement is reproduced below. Thanks and good work Open MPI team!

The Open MPI Team, representing a consortium of research, academic, and industry partners, is pleased to announce the release of Open MPI version 2.0.0.

v2.0.0 is a major new release series containing many new features and bug fixes. As a community, the Open MPI Team is incredibly thankful and appreciative of all the time, effort, and downright hard work contributed by its members and all of its users. Thank you all! We couldn't have done this without you!

From the language-not-the-movie(s) department

Two important Julia Language updates. First, there is a great interview over at RCE-Cast (while you are there, listen to the Singularity interview as well). Head over and take a listen.

What is Julia you ask? A good answer comes from the Julia team:

Julia is the open source programming language for data science and numerical computing that is taking many diverse areas such as finance, central banking, insurance, engineering, robotics, artificial intelligence, astrophysics, life sciences and many others by storm. Julia combines the functionality of quantitative environments such as Python and R with the speed of production languages like C++, Fortran and Java to solve big data and analytics problems. Julia delivers dramatic improvements in simplicity, speed, capacity and productivity for data scientists, quants and researchers who need to solve massive computation problems quickly and accurately. The number of Julia users has grown dramatically during the last five years – doubling every 9 months. Julia is taught at MIT, Stanford and dozens of universities worldwide, including MOOCs on Coursera and EdX.

Second, Intel has released ParallelAccelerator v0.2 for Julia 0.5.

From the bad-play-on-words department

For those using Python to calculate asymptotes and other scientific and mathematical things, Intel® has added its speedy MKL (Math Kernel Library) to the mix. Called the Intel® Distribution for Python* 2017 Beta, the release gives Python a big boost by using MKL and other libraries. From the web page: "The Beta product adds new Python packages like scikit-learn, mpi4py, numba, conda, tbb (Python interfaces to Intel Threading Building Blocks) and pyDAAL (Python interfaces to Intel Data Analytics Acceleration Library). The Beta also delivers performance improvements for NumPy/SciPy through linking with performance libraries like Intel MKL, Intel Message Passing Interface (Intel MPI), Intel TBB and Intel DAAL."

Beta users can look forward to the following features.

  • Includes NumPy, SciPy, scikit-learn, numba, Cython, pyDAAL
  • Performance accelerations via Intel® MKL, Intel MPI, Intel® TBB, Intel® DAAL
  • Easy, out-of-the-box access to performance
  • Free to download
  • Supports Python versions 2.7 and 3.5
  • Available on Windows*, Linux, and Mac OS

An Intel blog provides more information. There is also a Python profiling tool (beta) available.
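If you want to see whether a given Python environment is actually getting the MKL boost, one quick (and admittedly rough) check is to inspect the NumPy build configuration and time a dense matrix multiply. This is just a sketch, not an official Intel benchmark:

    import time
    import numpy as np

    # If this NumPy build is linked against Intel MKL, the BLAS/LAPACK
    # sections printed below mention "mkl"; a stock build usually shows
    # OpenBLAS or the reference BLAS instead.
    np.show_config()

    # Rough timing of an MKL-friendly workload: a large dense matrix multiply.
    n = 2000
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    t0 = time.time()
    c = np.dot(a, b)
    print("%dx%d matmul took %.2f seconds" % (n, n, time.time() - t0))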

A recent article on Phys.org reports a breakthrough in quantum computing. The article, Crucial hurdle overcome in quantum computing, describes how a team at the University of New South Wales (UNSW) in Sydney, Australia has created a working quantum logic gate in silicon. This advance paves the way for quantum computing to become a reality in the years to come. Background on quantum computing can be found in this Cluster Monkey article: A Smidgen of Quantum Computing.

According to Dr. Menno Veldhorst, a UNSW Research Fellow and the lead author of the Nature paper:

"We've morphed those silicon transistors into quantum bits by ensuring that each has only one electron associated with it. We then store the binary code of 0 or 1 on the 'spin' of the electron, which is associated with the electron's tiny magnetic field."

From the best acronym of the day (BAD) department

The Adept project is bringing metrics and tools to help developers make energy-efficient use of parallel technologies. According to the web site, "Adept builds on the expertise of software developers from high-performance computing (HPC) to exploit parallelism for performance, and on the expertise of embedded systems engineers in managing energy usage. Adept is developing a tool that can guide software developers and help them to model and predict the power consumption and performance of parallel software and hardware."

Recently, Adept released a benchmark suite to help understand and measure power usage for HPC and embedded systems. The suite consists of a wide range of benchmarks covering both high-performance embedded and high-performance technical computing. The benchmarks are designed to characterize the efficiency (in terms of both performance and energy) of computer systems, from the hardware and system software stack to the compilers and programming models. More information about the benchmark suite can be found on the EPCC Blog Page.
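While we wait to try the Adept tools themselves, curious monkeys can already pull coarse energy numbers on recent Intel hardware through the Linux powercap (RAPL) interface. A rough sketch follows; the sysfs path, permissions, and counter wrap-around all vary by system, and this is in no way a substitute for the Adept benchmark suite:

    import time

    # Cumulative package energy in microjoules, exposed by the Linux powercap
    # (Intel RAPL) interface. The path varies by system, and reading it may
    # require elevated privileges.
    RAPL_PATH = "/sys/class/powercap/intel-rapl:0/energy_uj"

    def read_energy_uj():
        with open(RAPL_PATH) as f:
            return int(f.read())

    def workload(n=3000000):
        # Placeholder compute kernel; substitute the code you want to measure.
        total = 0
        for i in range(n):
            total += i * i
        return total

    e0, t0 = read_energy_uj(), time.time()
    workload()
    e1, t1 = read_energy_uj(), time.time()

    joules = (e1 - e0) / 1.0e6   # the counter wraps on long runs; ignored here
    seconds = t1 - t0
    print("energy: %.2f J, time: %.2f s, average power: %.2f W" %
          (joules, seconds, joules / seconds))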

Hopefully the Cluster Monkey crew will carve out some time to play with these tools and report back on their experiences.


©2005-2023 Seagrove LLC. Some rights reserved. Except where otherwise noted, this site is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license. The Cluster Monkey Logo and Monkey Character are trademarks of Seagrove LLC.