Scale Up Your HPC Job Interviews
- Details
- Written by Alex McKee
- Hits: 6048
Give your HPC resume a performance boost by creating a Resume Profile
If your job search were a parallel program, your resume would be the code that outputs interviews. You wouldn’t want to scale up your job count until your code is optimized and bug-free, right?
An inefficient resume wastes a lot of time, stretching your job search out over several months. Sure, brute-forcing the search by sending out hundreds of resumes will typically land you a job, eventually. But you are likely to burn out before then, or slip into an “I’ll take anything I can get” mindset.
So what can you do to optimize your resume to increase your interview rate?
There are several ways, but one stands out as the most important: adding a resume profile at the top of your resume.
Hadoop Is Not a Toaster
- Details
- Written by Douglas Eadline
- Hits: 5300
From the "Here comes the cluestick" department
Apache Hadoop has been in the press lately. Some of the coverage has not been positive and often reflects a misunderstanding of how Hadoop relates to data processing. Indeed, we seem to be in the Trough of Disillusionment of the Technology Hype Cycle. In my opinion, many of these recent "insights" stem from the belief that Hadoop is some kind of toaster. For the record, Hadoop can make great toast, and as it traveled up the hype curve, market exuberance assumed that good-tasting toast could do anything. Turns out, these days, people want something to go along with their toast. What happened to the great toast?
Nothing happened to the toast. Hadoop may have started out as a toaster, but it is now quite a bit more: it has evolved into a full kitchen. To understand modern Hadoop technology, one must understand that just as a kitchen is designed to prepare food for consumption, Hadoop is designed as a platform to prepare data for analysis and insight.
Close to the Edge: If I Don't Use ECC, Have I Made an Error?
- Details
- Written by Douglas Eadline
- Hits: 6401

A continuing installment of our (Close to the) Edge Computing series.
The main memory in computing devices is an ephemeral repository of information. Processing units (CPUs) may analyze and change the information, but ultimately the results end up in the main memory store. Of course, the information may move to nonvolatile memory or disk storage, which provides a more permanent resting place.
Main memory integrity is important. If errors occur in main memory, anything from no effect at all to a full crash of the entire computer is possible. To detect and possibly correct memory errors, Error-Correcting Code (ECC) memory has been developed and deployed in systems where data errors may have harmful consequences, such as real-time financial systems. The goal is that data written to memory should be the same when it is read back in the future.
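The idea behind single-bit error correction can be sketched with a toy Hamming(7,4) code. This is only an illustration of the principle: real ECC DIMMs use wider SECDED (single-error-correct, double-error-detect) codes implemented in hardware, not this exact code or any software decoder.

```python
# Toy Hamming(7,4) code: 4 data bits protected by 3 parity bits.
# Illustrates the single-bit-correction idea behind ECC memory;
# real ECC hardware uses wider SECDED codes over 64-bit words.

def encode(data):
    """Encode 4 data bits (list of 0/1) into a 7-bit codeword."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p4 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]   # bit positions 1..7

def decode(codeword):
    """Return the 4 data bits, correcting a single flipped bit if present."""
    # The syndrome is the XOR of the (1-based) positions holding a 1;
    # for a single-bit error it equals the position of the flipped bit.
    syndrome = 0
    for pos, bit in enumerate(codeword, start=1):
        if bit:
            syndrome ^= pos
    if syndrome:                          # nonzero syndrome: flip that bit back
        codeword = codeword.copy()
        codeword[syndrome - 1] ^= 1
    return [codeword[2], codeword[4], codeword[5], codeword[6]]

word = encode([1, 0, 1, 1])
word[5] ^= 1                    # simulate a single bit flip in "memory"
print(decode(word))             # recovers [1, 0, 1, 1]
```

Reading back the stored bits through the decoder yields the original data even after one bit has flipped, which is exactly the guarantee ECC memory provides for single-bit errors.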
New Documents Available to Site Users
- Details
- Written by Administrator
- Hits: 6101
From the secret double bonus department
A somewhat unpublicized feature of Cluster Monkey is the set of PDF documents that registered users can download. These documents include white papers on Limulus personal computing, a brief book called HPC for Dummies, and the PowerPoint slides describing the 3D printing demonstration at SC19. To download any of the files, you need to register with your email address. When logged in, a Downloads item will appear in the blue Main Menu area at the top left of the page. There you will find the materials.
We ask for your email because we are working on starting two low-frequency newsletters covering HPC and Hadoop/Spark. We explain how we treat your information in our Privacy Policy. Also, as a registered user you can leave comments on articles. Cluster Monkey is a community-based website.
One of our newest documents is a white paper comparing an on-prem Limulus personal appliance to Amazon Cloud instances; the economics are quite interesting. You can also find the slides from our 3D printing demonstration at SC19 and a 2018 white paper on building a Modern Scalable Analytics Classroom (on-prem, no data center needed).
Close to the Edge: No Data Center Needed Computing
- Details
- Written by Douglas Eadline
- Hits: 4859

Introduction
Welcome to a new series on ClusterMonkey! While news and articles have been a bit sparse lately, it is not because the head monkey has been idle. Indeed, there is so much to write about and so little time. Another issue we ran into was how to present the recent projects, which may seem rather disparate, under an easy-to-understand overriding theme. Welcome to edge computing.
Defining edge computing has become tricky because it now has a marketing buzz associated with it. Thus, like many over-hyped technology topics, it may take on several forms and have some core aspects that allow it to be treated as a "thing."
In this series, the definition of edge is going to be as specific as possible. In general, edge computing is that which does not take place in the data center or the cloud (hint: the cloud is a data center). Such a definition is too broad, however, since computing is everywhere (from cell phones to actual desktop workstations). A more precise definition of edge computing can be written as:
Data center level computing that happens outside of the physical data center or cloud.
That definition seems to eliminate many smaller forms of computing but still is a little gray in terms of "data center level computing." This category of computing usually operates 24/7 and provides a significantly higher level of performance and storage than mass-marketed personal systems.