From the language-not-the-movie(s) department

Two important Julia Language updates. First, a great interview over at RCE-Cast (while you are there, listen to the Singularity interview as well). Head over and take a listen.

What is Julia you ask? A good answer comes from the Julia team:

Julia is the open source programming language for data science and numerical computing that is taking many diverse areas such as finance, central banking, insurance, engineering, robotics, artificial intelligence, astrophysics, life sciences and many others by storm. Julia combines the functionality of quantitative environments such as Python and R with the speed of production languages like C++, Fortran and Java to solve big data and analytics problems. Julia delivers dramatic improvements in simplicity, speed, capacity and productivity for data scientists, quants and researchers who need to solve massive computation problems quickly and accurately. The number of Julia users has grown dramatically during the last five years – doubling every 9 months. Julia is taught at MIT, Stanford and dozens of universities worldwide, including MOOCs on Coursera and EdX.

Update: Intel releases ParallelAccelerator v0.2 for Julia 0.5

The second bit of news is the 0.5 release of Julia (and the 1.0 release is closer than you think). From the press release:

The Julia language project is pleased to announce the official launch of Julia 0.5. It is the latest in a series of five releases and marks a big leap forward in performance and functionality. Julia 0.5 is available for immediate download at http://julialang.org.

Julia is the numerical computing language that is revolutionizing fields as diverse as machine learning, artificial intelligence, algorithmic trading, robotics, self-driving cars and the Internet of Things (IoT). Julia users and supporters include MIT Lincoln Labs, Johns Hopkins University Applied Physics Laboratory, US Federal Reserve Bank, Brazilian Development Bank, Federal Aviation Administration, IBM, Intel, Blackrock, Conning, Invenia, the Moore Foundation and researchers at MIT, Harvard, Stanford, UC Berkeley and NYU.

According to Francesco Borrelli, Professor of Mechanical Engineering at UC Berkeley and co-director of the Hyundai Center of Excellence in Integrated Vehicle Safety Systems and Control, “Julia 0.5 has some amazing new features for our research on autonomous driving at UC Berkeley. The port to ARM has made it easy for us to translate our research codes into real world applications.”

Highlights of the new release, from the Julia Blog, include:

  • More than 1,100 packages – an increase of 57% since version 0.4 was released one year ago
  • A new debugger, Gallium, that allows interactive multi-language debugging of Julia, C and C++ code with ease
  • Juno, an IDE for Julia that allows users to write, edit, debug, execute and plot results in a unified desktop application
  • New platform support for ARM and IBM’s POWER architectures, which allows machine learning algorithms to train on the largest datasets using the biggest supercomputers and deploy on the tiniest computers such as the Raspberry Pi
  • Significant improvements for functional programming, including 24x performance improvements for calling anonymous functions
  • New vectorized function call syntax eliminates temporary allocations, using 3 times less memory and producing results 4.3x faster (a short sketch follows this list)
  • Efficiency improvements for generators mean a gain of 10x in memory for simple problems and larger gains for bigger data sets
  • LLVM upgrade from 3.3 to 3.7.1 permits new kinds of compiler optimizations for leveraging SIMD instruction sets on modern processors
  • Experimental multithreading support drives new levels of efficiency in parallel computing with Julia
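To make a few of the items above concrete, here is a minimal Julia sketch of the new dot-call syntax, cheaper anonymous-function calls, generator expressions, and the experimental threading macro. The variable names and data are arbitrary examples for illustration, not code from the release notes:

```julia
# Julia 0.5: any function f can be applied element-wise with f.(x),
# and nested dot calls such as sin.(cos.(x)) are turned into a single
# broadcast loop instead of allocating a temporary for the inner result.
x = collect(0.0:0.1:1.0)      # an arbitrary example vector
y = sin.(cos.(x))             # element-wise, no explicit loop

# Anonymous functions are much cheaper to call in 0.5, so passing a
# throwaway closure to map is practical even in performance-sensitive code.
squares = map(v -> v^2, x)

# Generator expressions are lazy: sum() consumes the values one at a
# time and never materializes the full array of squares in memory.
total = sum(v^2 for v in x)

# Experimental multithreading (set JULIA_NUM_THREADS before starting Julia):
acc = zeros(length(x))
Threads.@threads for i in eachindex(x)
    acc[i] = exp(x[i])
end
```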

Julia has been mentioned previously on ClusterMonkey.
