Who has a petabyte or more of data?

These HPSS Collaboration member sites have each accumulated a petabyte or more of data in a single HPSS file system.

Site    Petabytes (10^15 bytes)    Millions of Files
(ECMWF) European Centre for Medium-Range Weather Forecasts 184.09 238.5
(NOAA-RD) National Oceanic and Atmospheric Administration Research & Development 86.61 76.6
(UKMO) United Kingdom Met Office 78.40 113.0
(BNL) Brookhaven National Laboratory 75.69 104.2
(LBNL-User) Lawrence Berkeley National Laboratory - User 72.85 209.1
(LANL-Secure) Los Alamos National Laboratory - Secure 61.92 611.3
(ORNL) Oak Ridge National Laboratory 60.54 61.89
(NCAR) National Center for Atmospheric Research 58.83 233.0
(LLNL-Secure) Lawrence Livermore National Laboratory - Secure 56.73 852.2
(CEA TERA) Commissariat a l'Energie Atomique - TERA 54.05 12.0
(DKRZ) Deutsches Klimarechenzentrum 54.05 19.6
(Meteo-France) Meteo France 48.45 215.5
(MPCDF) Max Planck Computing and Data Facility 38.28 44.2
(IN2P3) Institut National de Physique Nucleaire et de Physique des Particules 36.21 54.7
(SLAC) Stanford Linear Accelerator Center 33.77 10.0
(LBNL-Backup) Lawrence Berkeley National Laboratory - Backup 33.27 20.3
(ANL) Argonne National Laboratory 32.58 408.7
(LLNL-Open) Lawrence Livermore National Laboratory - Open 32.03 622.3
(DWD) Deutscher Wetterdienst 31.79 44.6
(NCSA) National Center for Supercomputing Applications 30.76 186.0
(CEA TGCC) Commissariat a l'Energie Atomique - TGCC 25.82 13.0
(IU) Indiana University 20.93 148.6
(HLRS) High Performance Computing Center Stuttgart 16.30 7.7
(JAXA) Japan Aerospace Exploration Agency 11.67 31.0
(PNNL) Pacific Northwest National Laboratory 9.99 103.3
(Purdue) Purdue University 8.91 44.2
(KEK) High Energy Accelerator Research Organization 8.81 28.0
(LaRC) NASA Langley Research Center 8.44 37.7
(NOAA-Class) National Oceanic and Atmospheric Administration - Comprehensive Large Array-data Stewardship System - Asheville 7.76 157.4
(NOAA-Class) National Oceanic and Atmospheric Administration - Comprehensive Large Array-data Stewardship System - Boulder 7.76 157.5
(SciNet) SciNet HPC Consortium 5.17 32.7
(LoC) Library of Congress 4.28 33.3
(LANL-Open) Los Alamos National Laboratory - Open 4.18 64.3
(NCDC) National Climatic Data Center 2.87 81.2
(CEA GENO) Commissariat a l'Energie Atomique - GENO 2.62 0.5
(CNES) Centre National d'Etudes Spatiales 1.83 0.3
(SNL-Secure) Sandia National Laboratories - Secure 1.74 9.5
(SNL-Open) Sandia National Laboratories - Open 1.71 6.3

What's New?
HPSS @ SC15 - SC15 is the 2015 International Conference for High Performance Computing, Networking, Storage and Analysis. SC15 will be held in Austin, Texas, from November 16th through 19th - Learn More. Come visit the HPSS folks at the IBM booth and schedule an HPSS briefing at the IBM Executive Briefing Center.

Swift On HPSS - The OpenStack Swift Object Server implementation for HPSS enables objects created using the Swift API to be accessed by name in HPSS as /account name/container name/object name. Legacy HPSS files can also be accessed using the Swift API. Contact us for more information.
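For illustration, here is a minimal sketch of how an object stored through the Swift API would then be addressable by the /account name/container name/object name path described above. It assumes a standard python-swiftclient installation; the endpoint, credentials, container, and object names are placeholders, not real Swift On HPSS values.

    # Illustrative sketch only: store an object through the OpenStack Swift API.
    # The auth URL, credentials, and names below are placeholders, not real HPSS endpoints.
    from swiftclient.client import Connection

    conn = Connection(
        authurl="https://swift.example.org/auth/v1.0",  # hypothetical Swift endpoint
        user="demo_account:demo_user",                  # placeholder account:user
        key="demo_password",
    )

    conn.put_container("climate-data")                   # hypothetical container
    with open("run42.nc", "rb") as f:                    # hypothetical local file
        conn.put_object("climate-data", "run42.nc", contents=f)

    # Per the mapping described above, the object should then be visible in the
    # HPSS namespace by name, e.g. /<account name>/climate-data/run42.nc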

2015 HPSS Users Forum - The 2015 HPSS Users Forum will be hosted by SciNet in Toronto, Canada, from Monday, September 28 through Friday, October 2. For more information.

HPSS @ ISC15 - ISC15 is the 2015 International Supercomputing Conference for high performance computing, networking, storage and analysis. ISC15 will be held in Frankfurt, Germany, from July 12th through 16th - Learn More. Come visit the HPSS folks at the IBM booth and schedule an HPSS briefing at the IBM Executive Briefing Center.

2015 HPSS Training - The next HPSS System Administration course runs from August 24th through 28th. For more information and registration.

HPSS @ MSST 2015 - MSST 2015 is the 31st International Conference on Massive Storage Systems and Technology. This year's theme is Media Wars: Disk versus FLASH in the Struggle for Capacity and Performance. Learn More

NCSA in production with RAIT - A massive 380 petabyte HPSS system, the world's largest automated near-line data repository for open science, was successfully deployed. Learn more from NCSA and HPCwire. The new HPSS system went into production using HPSS Redundant Array of Independent Tapes (RAIT) tiers, which, like RAID, provide redundancy across a tape stripe. RAIT allows HPSS customers to meet their performance and redundancy requirements without doubling their tape cost. Learn more about RAIT.
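To make the RAID analogy concrete, the sketch below shows single-parity redundancy across a stripe: data blocks are XORed into a parity block, and any one lost member of the stripe can be rebuilt from the survivors. This is only an illustration of the general idea, not HPSS's actual RAIT implementation; the 4+1 stripe width and block size are assumptions made for the example.

    # Illustrative single-parity stripe in the spirit of RAIT/RAID; not HPSS code.
    from functools import reduce

    def parity(blocks):
        """XOR equal-length blocks into one parity block."""
        return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

    def rebuild(survivors, parity_block):
        """Recover the single missing block from the surviving blocks plus parity."""
        return parity([*survivors, parity_block])

    # Example: a 4+1 stripe (four data "tapes" plus one parity "tape"), 8-byte blocks.
    data = [bytes([i] * 8) for i in (1, 2, 3, 4)]
    p = parity(data)
    assert rebuild(data[:2] + data[3:], p) == data[2]   # lost member restored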

Copyright 2015, IBM Corporation. All Rights Reserved.