Data-Intensive Problems to Shift Course of Supercomputing

Science Daily reports that a ‘Data Deluge’ is driving a paradigm shift in supercomputer architectures. In a presentation at the 3rd Annual La Jolla Research & Innovation Summit this week, San Diego Supercomputer Center (SDSC) Director Michael Norman said that the amount of digital data generated just by instruments such as DNA sequencers, cameras, telescopes, and MRIs is now doubling every 18 months.

“Digital data is advancing at least as fast as, and probably faster than, Moore’s Law,” said Norman, referring to the observation in computing hardware that the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every 18 months. “But I/O (input/output) transfer rates are not keeping pace — that is what SDSC’s supercomputers are designed to solve.”
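An 18-month doubling time works out to roughly a hundredfold increase in data volume over a decade. The short Python sketch below (illustrative only, not from the article) does that arithmetic and contrasts it with a hypothetical, slower doubling time for I/O transfer rates, to show the kind of widening gap Norman describes; the 36-month I/O figure is an assumption for illustration.

    # Illustrative sketch: data volume doubling every 18 months (from the article)
    # versus a hypothetical 36-month doubling time for I/O transfer rates.
    def growth_factor(years: float, doubling_months: float) -> float:
        """Return the multiplicative growth after `years`, given a doubling period in months."""
        return 2 ** (years * 12 / doubling_months)

    for years in (3, 5, 10):
        data = growth_factor(years, doubling_months=18)  # article's figure for data growth
        io = growth_factor(years, doubling_months=36)    # assumed figure for I/O improvement
        print(f"{years:>2} yr: data x{data:6.1f}  vs  I/O x{io:5.1f}  (gap x{data / io:.1f})")

Over ten years, the 18-month doubling yields roughly a 100x increase, while the assumed slower I/O curve yields only about 10x, leaving a tenfold gap.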

Read more at insideHPC
 
