New Documents Available to Site Users
- Details
- Written by Administrator
- Hits: 5519

From the secret double bonus department
A somewhat unpublicized feature of Cluster Monkey is the set of PDF documents that registered users can download. These documents include white papers on Limulus personal computing, a brief book called HPC for Dummies, and the PowerPoint slides describing the 3D printing demonstration at SC19. To download any of the files, you need to register with your email address. When logged in, a Downloads item will appear in the blue Main Menu area at the top left of the page. There you will find the materials.
We ask for your email because we are working on starting two low-frequency newsletters covering HPC and Hadoop/Spark. We explain how we treat your information in our Privacy Policy. Also, as a registered user you can leave comments on articles. Cluster Monkey is a community-based website.
One of our newest documents is a white paper comparing an on-prem Limulus personal appliance to Amazon Cloud instances. The economics are quite interesting. You can also find the slides from our 3D printing demonstration at SC19 and a 2018 white paper on building a Modern Scalable Analytics Classroom (on-prem, no data center needed).
Close to the Edge: If I Don't Use ECC, Have I Made an Error?
- Details
- Written by Douglas Eadline
- Hits: 5295

A continuing installment of our (Close to the) Edge Computing series.
The main memory in computing devices is an ephemeral repository of information. Processing units (CPUs) may analyze and change the information, but the results ultimately end up back in the main memory store. Of course, the information may move to nonvolatile memory or disk storage, which provides a more permanent resting place.
Main memory integrity is important. If errors occur in main memory, anything from no effect to a full crash of the entire computer is possible. To prevent and possibly correct memory errors, Error-Correcting Code (ECC) memory has been developed and deployed in systems where data errors may have harmful consequences, such as real-time financial systems. The goal is that the data written to memory should be the same when it is read back in the future.
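To make the idea concrete, here is a minimal Python sketch of single-error correction using a classic Hamming(7,4) code. Real ECC DIMMs use a wider SECDED code implemented in the memory controller hardware, not this exact scheme, but the principle is the same: store a few extra parity bits so a single flipped bit can be located and repaired when the word is read back.

```python
# Minimal sketch of single-error correction with a Hamming(7,4) code.
# Illustrative only -- real ECC memory does this in hardware on wider words.

def encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    """Return (corrected 4 data bits, 1-based position of flipped bit or 0)."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3          # syndrome points at the bad bit
    if pos:
        c = c[:]
        c[pos - 1] ^= 1                 # flip the bad bit back
    return [c[2], c[4], c[5], c[6]], pos

word = [1, 0, 1, 1]
stored = encode(word)
stored[5] ^= 1                          # simulate a cosmic-ray bit flip
recovered, where = decode(stored)
print(recovered == word, where)         # -> True 6
```

The syndrome (the three parity checks recomputed on read) is zero when nothing changed and otherwise gives the position of the single flipped bit, which is why a one-bit error can be silently corrected rather than crashing the machine.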
Big Data In Little Spaces: Hadoop And Spark At The Edge
- Details
- Written by Administrator
- Hits: 2211

Ever wonder what Edge computing is all about? Data happens and information takes work. Estimates are that by 2020, 1.7 megabytes of new data will be created every second for every person in the world. That is a lot of raw data.
Two questions come to mind: what are we going to do with it, and where are we going to keep it? Big Data is often described by the three Vs – Volume, Velocity, and Variability – and note that not all three need apply. What is missing is the letter “U,” which stands for Usability. A Data Scientist will first ask: how much of my data is usable? Data usability can take several forms and includes things like quality (is it noisy, incomplete, accurate?) and pertinence (is there extraneous information that will not make a difference to my analysis?). There is also the issue of timeliness. Is there a “use by” date for the analysis, or might the data be needed in the future for some as yet unknown reason? The usability component is hugely important and often determines the size of any scalable analytics solution. Usable data is not the same as raw data.
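For a rough sense of scale, the short Python calculation below turns that per-person estimate into a global daily total. The 7.7 billion population figure is our own assumption for 2020, not a number from the article; the point is only the order of magnitude.

```python
# Back-of-the-envelope scale of the "1.7 MB per second per person" estimate.
# The world population figure is an assumption, not from the article.

mb_per_sec_per_person = 1.7
people = 7.7e9                      # assumed 2020 world population
seconds_per_day = 86_400

daily_mb = mb_per_sec_per_person * people * seconds_per_day
daily_exabytes = daily_mb / 1e12    # 1 EB = 1e12 MB
print(f"{daily_exabytes:,.0f} exabytes of raw data per day")
# -> roughly 1,100 exabytes, i.e., on the order of a zettabyte per day
```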
Get the full article at The Next Platform. You may recognize the author.
Close to the Edge: No Data Center Needed Computing
- Details
- Written by Douglas Eadline
- Hits: 3904

Introduction
Welcome to a new series on ClusterMonkey! While the news and articles have been a bit sparse lately, it is not because the head monkey has been idle. Indeed, there is so much to write about and so little time. Another issue we ran into was how to present the recent projects, which may seem rather disparate, under an easy-to-understand overarching theme. Welcome to edge computing.
Defining edge computing has become tricky because it now has a marketing buzz associated with it. Thus, like many over-hyped technology topics, it may take on several forms and have some core aspects that allow it to be treated as a "thing."
In this series, the definition of edge is going to be as specific as possible. In general, edge computing is that which does not take place in the data center or the cloud (hint: the cloud is a data center). Such a definition is too broad, however, since computing is everywhere (from cell phones to actual desktop workstations). A more precise definition of edge computing can be written as:
Data center level computing that happens outside of the physical data center or cloud.
That definition seems to eliminate many smaller forms of computing but still is a little gray in terms of "data center level computing." This category of computing usually operates 24/7 and provides a significantly higher level of performance and storage than mass-marketed personal systems.
Sledgehammer HPC
- Details
- Written by Douglas Eadline
- Hits: 7916

HPC without coding in MPI is possible, but only if your problem fits into one of several high-level frameworks.
[Note: The following updated article was originally published in Linux Magazine in June 2009. The background presented in this article has recently become relevant due to the resurgence of things like genetic algorithms and the rapid growth of MapReduce (Hadoop). It does not cover deep learning.]
Not all HPC applications are created in the same way. There are applications like Gromacs, Amber, OpenFoam, etc. that allow domain specialists to input their problem into an HPC framework. Although there is some work required to "get the problem into the application," these are really application-specific solutions that do not require the end user to write a program. At the other end of the spectrum are the user-written applications. The starting points for these problems include a compiler (C/C++ or Fortran), an MPI library, and other programming tools. The work involved can range from small to large, as the user must concern themselves with the "parallel aspects of the problem." Note: all application software started out at this point at some time in the past.
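As a sketch of what the "high-level framework" end of the spectrum looks like, here is a word-count job written in the Hadoop Streaming style: a mapper that emits (word, 1) pairs and a reducer that sums them. The word-count task and the local wiring of the two functions are illustrative only; on a real cluster the framework handles data distribution, shuffling, and fault tolerance, so the user never writes a line of MPI.

```python
#!/usr/bin/env python3
# Illustrative MapReduce-style word count (Hadoop Streaming style).
# The framework, not the user, takes care of the parallel plumbing.

import sys
from itertools import groupby

def mapper(lines):
    """Emit (word, 1) pairs -- the only problem-specific code a user writes."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    """Sum the counts for each word (pairs arrive grouped by key)."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    # Stand-alone demo: chain the two stages locally and read from stdin.
    for word, count in reducer(mapper(sys.stdin)):
        print(f"{word}\t{count}")
```

Piping a text file into this (hypothetical) script prints each word with its count; under Hadoop the same mapper and reducer would run across many nodes without any changes to the user's logic.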