Who has a petabyte or more of data?

These HPSS Collaboration Members' sites have each accumulated a petabyte or more of data in a single HPSS file system.

Site | 10^15 Bytes | Millions of Files
(ECMWF) European Centre for Medium-Range Weather Forecasts | 196.63 | 249.56
(NOAA-RD) National Oceanic and Atmospheric Administration Research & Development | 92.52 | 78.92
(UKMO) United Kingdom Met Office | 86.41 | 119.49
(BNL) Brookhaven National Laboratory | 85.40 | 109.67
(LBNL-User) Lawrence Berkeley National Laboratory - User | 80.45 | 211.07
(LANL-Secure) Los Alamos National Laboratory - Secure | 63.61 | 623.86
(ORNL) Oak Ridge National Laboratory | 62.92 | 60.89
(NCAR) National Center for Atmospheric Research | 61.93 | 230.60
(CEA TERA) Commissariat à l'Énergie Atomique - GENO | 60.68 | 13.27
(LLNL-Secure) Lawrence Livermore National Laboratory - Secure | 59.34 | 879.41
(DKRZ) Deutsches Klimarechenzentrum | 50.13 | 19.69
(Meteo-France) Météo-France | 48.74 | 239.52
(MPCDF) Max Planck Computing and Data Facility | 39.71 | 57.11
(IN2P3) Institut National de Physique Nucléaire et de Physique des Particules | 36.92 | 57.41
(DWD) Deutscher Wetterdienst | 35.75 | 46.10
(SLAC) Stanford Linear Accelerator Center | 34.50 | 10.15
(LBNL-Backup) Lawrence Berkeley National Laboratory - Backup | 33.82 | 20.32
(LLNL-Open) Lawrence Livermore National Laboratory - Open | 33.22 | 705.96
(NCSA) National Center for Supercomputing Applications | 32.40 | 186.95
(ANL) Argonne National Laboratory | 28.87 | 303.94
(CEA TGCC) Commissariat à l'Énergie Atomique - GENO | 27.85 | 14.18
(IU) Indiana University | 22.72 | 195.17
(HLRS) High Performance Computing Center Stuttgart | 19.13 | 8.16
(JAXA) Japan Aerospace Exploration Agency | 14.13 | 33.00
(PNNL) Pacific Northwest National Laboratory | 10.16 | 104.99
(Purdue) Purdue University | 9.94 | 46.18
(KEK) High Energy Accelerator Research Organization | 9.31 | 29.53
(LaRC) NASA Langley Research Center | 9.14 | 40.53
(NOAA-Class) National Oceanic and Atmospheric Administration - Comprehensive Large Array-data Stewardship System - Asheville | 8.13 | 167.62
(NOAA-Class) National Oceanic and Atmospheric Administration - Comprehensive Large Array-data Stewardship System - Boulder | 8.13 | 167.58
(SciNet) SciNet HPC Consortium | 5.83 | 33.61
(LANL-Open) Los Alamos National Laboratory - Open | 4.76 | 70.28
(LoC) Library of Congress | 4.42 | 34.99
(NCDC) National Climatic Data Center | 2.98 | 82.89
(SNL-Secure) Sandia National Laboratories - Secure | 1.74 | 9.53
(SNL-Open) Sandia National Laboratories - Open | 1.71 | 6.23
(KIT) Karlsruhe Institute of Technology | 1.09 | 10.67



What's New?
HPSS @ SC15 - SC15, the 2015 international conference for high performance computing, networking, storage and analysis, will be held in Austin, Texas, from November 16th through 19th - Learn More. Come visit the HPSS folks at the IBM booth and schedule an HPSS briefing at the IBM Executive Briefing Center.

Swift On HPSS - An OpenStack Swift Object Server implementation enables objects created through the Swift API to be accessed by name in HPSS as /account name/container name/object name. Legacy HPSS files can likewise be accessed through the Swift API. Contact us for more information.
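The name mapping described above (Swift account/container/object to an HPSS namespace path) can be sketched with a small helper. This function is purely illustrative, an assumption about the mapping, not part of the HPSS or Swift APIs:

```python
def swift_to_hpss_path(account: str, container: str, obj: str) -> str:
    """Illustrative sketch: map a Swift object identifier to the
    HPSS-style path /account name/container name/object name.
    Hypothetical helper; not an actual HPSS or Swift API call."""
    # Account and container names cannot contain '/' in Swift,
    # so the path components stay unambiguous; object names may
    # contain '/' and simply extend the path.
    for part in (account, container):
        if "/" in part:
            raise ValueError("account and container names may not contain '/'")
    return f"/{account}/{container}/{obj}"
```

For example, an object `2015/report.dat` in container `results` under account `proj1` would appear in the HPSS namespace as `/proj1/results/2015/report.dat`.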

2015 HPSS Users Forum - The HPSS User Forum 2015 will be hosted by SciNet in Toronto, Canada, from Monday, September 28 through Friday, October 2. For more information, contact us.

HPSS @ ISC15 - ISC15, the 2015 international supercomputing conference for high performance computing, networking, storage and analysis, will be held in Frankfurt, Germany, from July 12th through 16th - Learn More. Come visit the HPSS folks at the IBM booth and schedule an HPSS briefing at the IBM Executive Briefing Center.

2015 HPSS Training - The next HPSS System Administration course runs from August 24th through 28th. Contact us for more information and registration.

HPSS @ MSST 2015 - MSST 2015 is the 31st International Conference on Massive Storage Systems and Technology. This year's theme is Media Wars: Disk versus FLASH in the Struggle for Capacity and Performance. Learn More.

NCSA in production with RAIT - A massive 380 petabyte HPSS system was successfully deployed -- the world's largest automated near-line data repository for open science. Learn more from NCSA and HPCwire. The new HPSS system went into production using HPSS Redundant Array of Independent Tapes (RAIT) tiers, which, like RAID, provide redundancy across a tape stripe. RAIT allows HPSS customers to meet their performance and redundancy requirements without doubling their tape cost. Learn more about RAIT.
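The RAID-like idea behind RAIT, a parity block computed across the members of a stripe so that one lost member can be rebuilt from the survivors, can be illustrated with single XOR parity. This is a simplified sketch of the general technique, not HPSS's actual RAIT implementation (which supports configurable stripe widths and parity levels):

```python
from functools import reduce

def xor_parity(blocks):
    """Compute a parity block as the bytewise XOR of equal-length
    data blocks (one block per tape in the stripe)."""
    return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*blocks))

def recover(blocks, parity, lost_index):
    """Rebuild the block at lost_index (e.g. a failed tape) by
    XOR-ing the surviving blocks with the parity block."""
    survivors = [b for i, b in enumerate(blocks) if i != lost_index]
    return xor_parity(survivors + [parity])
```

Because XOR is its own inverse, XOR-ing the parity with every surviving stripe member cancels them out and leaves exactly the missing member. This is why a stripe of N data tapes needs only one extra parity tape to survive a single failure, rather than a full duplicate copy, which is the cost advantage the paragraph above describes.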

Copyright 2015, IBM Corporation. All Rights Reserved.