Brianna White

“Exascale only becomes valuable when it’s creating and using data that we care about,” said Pete Beckman, co-director of the Northwestern-Argonne Institute of Science and Engineering (NAISE), at the most recent HPC User Forum. Beckman, head of an Argonne National Laboratory edge computing project called Waggle, was insistent on one thing: edge computing is a crucial part of delivering that value for exascale.
Beckman had opened with a quote from computer architect Ken Batcher: “A supercomputer is a device for turning compute-bound problems into I/O-bound problems.” “In many ways, that is still true today,” Beckman said. “What we expect from supercomputers is that they’re so blindingly fast that really it’s bottlenecked on either reading or writing from input or output.”
“If we take that concept, though, and flip it over,” he added, “then we end up with this idea that edge computing, therefore, is a device for turning an I/O-bound problem into a compute-bound problem.”
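To see the flip in concrete terms, take some purely illustrative numbers (not figures from the talk): an instrument producing 1 TB/s of raw data over a 100 Gb/s (12.5 GB/s) wide-area link is I/O-bound by a factor of 80, so the raw stream simply cannot leave the site. If an edge node can reduce each frame to a handful of interesting events, the shipped volume drops by orders of magnitude, and the binding constraint becomes how fast the edge processors can run the reduction, i.e., a compute-bound problem.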
Beckman outlined what he viewed as the new paradigm of high-performance computing: one defined by extreme data production from massive detectors and instruments like the Large Hadron Collider and radio telescopes, far more data than could ever be efficiently moved to supercomputers. This paradigm, he said, gives rise to a class of research problems where it is more efficient to examine data at the edge and forward only the important or interesting subset to supercomputers for heavy-duty analysis.
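As a concrete illustration of that filtering pattern, here is a minimal Python sketch. Everything in it is an assumption made for illustration: read_frame(), forward_upstream(), the rolling window, and the 3-sigma cutoff are hypothetical placeholders, not the Waggle project's actual API.

```python
import random
import statistics
import time


def read_frame() -> float:
    """Stand-in for a real instrument: one noisy scalar sample."""
    return random.gauss(0.0, 1.0)


def forward_upstream(event: dict) -> None:
    """Stand-in for shipping an event off-site for heavy-duty analysis."""
    print(f"forwarding to supercomputer: {event}")


def edge_loop(n_frames: int = 10_000, sigma_cutoff: float = 3.0) -> None:
    """Inspect every frame locally; forward only statistical outliers.

    The edge node spends compute on triage so that only a small,
    interesting fraction of the raw stream ever crosses the network.
    """
    window = []  # rolling window of recent samples
    for i in range(n_frames):
        x = read_frame()
        window.append(x)
        if len(window) > 500:
            window.pop(0)  # keep the window bounded
        if len(window) >= 30:  # wait for a stable baseline first
            mu = statistics.fmean(window)
            sd = statistics.stdev(window)
            if sd > 0 and abs(x - mu) > sigma_cutoff * sd:
                forward_upstream({
                    "frame": i,
                    "value": round(x, 4),
                    "zscore": round((x - mu) / sd, 2),
                    "wall_time": time.time(),
                })


if __name__ == "__main__":
    edge_loop()
```

The design point is that the (relatively) expensive per-sample statistics run on the edge node, while the network only ever carries the rare outliers.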
Continue reading: https://www.hpcwire.com/2021/09/24/the-case-for-an-edge-driven-future-for-supercomputing/
 

Attachments

  • p0004970.m04639.waggle_argonne_675x380.jpg (63.3 KB)