EDGE Computing Series
- Written by Douglas Eadline

A continuing installment of our (Close to the) Edge Computing series.
The main memory in computing devices is an ephemeral repository of information. Processing units (CPUs) may analyze and change the information, but ultimately the results end up in the main memory store. Of course, the information may move to nonvolatile memory or disk storage, which provides a more permanent resting place.
Main memory integrity is important. If errors occur in main memory, anything from no visible effect to a full crash of the entire computer is possible. To detect and possibly correct memory errors, Error-Correcting Code (ECC) memory has been developed and deployed in systems where data errors may have harmful consequences, such as real-time financial systems. The goal is that the data written to memory should be the same when it is read back in the future.
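The single-error-correcting codes used in ECC DIMMs are typically SECDED (single-error-correct, double-error-detect) variants of Hamming codes implemented in hardware. As a rough illustration of the idea, not the exact code used in any memory controller, the classic Hamming(7,4) code below stores 4 data bits with 3 parity bits; if any one of the 7 stored bits flips, the parity checks pinpoint and repair it:

```python
def hamming74_encode(data):
    """Encode 4 data bits [d1, d2, d3, d4] as a 7-bit Hamming codeword.

    Parity bits sit at (1-based) positions 1, 2, and 4; data bits at
    positions 3, 5, 6, and 7.
    """
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4   # checks codeword positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # checks codeword positions 3, 6, 7
    p4 = d2 ^ d3 ^ d4   # checks codeword positions 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_decode(code):
    """Correct any single-bit error in a codeword, return the 4 data bits."""
    c = list(code)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s4  # 1-based position of the flipped bit
    if syndrome:
        c[syndrome - 1] ^= 1         # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]

# Simulate a single-bit memory error; the decoder recovers the data.
data = [1, 0, 1, 1]
word = hamming74_encode(data)
word[5] ^= 1                         # one stored bit flips
assert hamming74_decode(word) == data
```

Real ECC memory applies the same principle to wider words (for example, 64 data bits plus 8 check bits) and adds an extra parity bit so that double-bit errors are at least detected, even though they cannot be corrected.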
- Written by Douglas Eadline

Introduction
Welcome to a new series on ClusterMonkey! While the news and articles have been a bit sparse lately, it is not because the head monkey has been idle. Indeed, there is so much to write about and so little time. Another challenge was how to present the recent projects, which may seem rather disparate, under an easy-to-understand overriding theme. Welcome to edge computing.
Defining edge computing has become tricky because it now has a marketing buzz associated with it. Thus, like many over-hyped technology topics, it may take on several forms and have some core aspects that allow it to be treated as a "thing."
In this series, the definition of edge is going to be as specific as possible. In general, edge computing is computing that does not take place in the data center or the cloud (hint: the cloud is a data center). Such a definition is too broad, however, since computing is everywhere, from cell phones to desktop workstations. A more precise definition of edge computing can be written as:
Data center level computing that happens outside of the physical data center or cloud.
That definition eliminates many smaller forms of computing but is still a little gray in terms of "data center level computing." This category of computing usually operates 24/7 and provides a significantly higher level of performance and storage than mass-marketed personal systems.