High Performance Storage System

Incremental Scalability
Based on storage needs and deployment schedules, HPSS scales incrementally by adding computer, network and storage resources. A single HPSS namespace can scale from petabytes of data to exabytes of data, from millions of files to billions of files, and from a few file-creates per second to thousands of file-creates per second.
HPSS User Forum 2021

The virtual HUF will be held on the following dates:

October 06, 07 (Week 1)
October 13, 14 (Week 2)
October 20, 21 (Week 3)

Venue

The HPSS Slack community will be used for discussion and questions during the HUF.
The virtual HUF will be presented via Webex. Registration with Webex will be required to attend. HUF invitations have been sent out via the HPSS community Slack channel and the HPSS site reflector. We encourage all users to attend.

Price

There will be no cost to attend the virtual HUF event this year.

Session Start Times

Each event will be open at least 10 minutes prior to the first presentation slot -- please join early!

Agenda

Date   | Times (CDT)       | Presentation Title                                                           | Presenter(s), Org.
Oct 6  | 8:00 am + 5:00 pm | HPSS Collaboration Outlook                                                   | Todd Heer, LLNL
Oct 6  | 8:30 am + 5:30 pm | HPSS Roadmap                                                                 | Michael Meseke, IBM
Oct 6  | 9:00 am + 6:00 pm | HPSS Futures                                                                 | Michael Meseke, IBM
Oct 6  | 9:30 am + 6:30 pm | HPSS Support                                                                 | Aaron Watson, IBM
Oct 7  | 8:00 am + 5:00 pm | A Unified Storage Namespace for High Performance Computing                  | Todd Heer, LLNL
Oct 7  | 8:30 am + 5:30 pm | Future of Tape Technologies                                                  | IBM or FujiFilm
Oct 7  | 9:00 am + 6:00 pm | IBM Tape Technologies                                                        | IBM or FujiFilm
Oct 7  | 9:30 am + 6:30 pm | Spectra Logic Storage Technologies                                           | Matt Ninesling & Craig Bungay, Spectra Logic
Oct 13 | 8:00 am + 5:00 pm | HSB Status                                                                   | Michael Meseke, IBM
Oct 13 | 8:30 am + 5:30 pm | Containerization of HSI/HTAR clients                                         | Melinda Jacobsen, NERSC & Jon Procknow, IBM
Oct 13 | 9:00 am + 6:00 pm | HPSS Developer Q&A                                                           | HPSS Developers
Oct 13 | 9:30 am + 6:30 pm | HPSSADM has/hac Tool                                                         | Elena Summer, MPCDF
Oct 14 | 8:00 am + 5:00 pm | Change Management Plan to upgrade HPSS from 7.5.3 to 8.3u11 on a test system | Jaime Pinto, SciNet
Oct 14 | 8:30 am + 5:30 pm | Test Results with Limited Availability release of HSB                        | Guangwei Che, SLAC
Oct 14 | 9:00 am + 6:00 pm | SL8500 DataEvac(tm): migrating from Oracle SL8500 to Spectra TFinity         | Jose Rodriguez & Geoff Cleary, LLNL
Oct 14 | 9:30 am + 6:30 pm | ATOS HPSS customers and community tool                                       | Romain Simonin, ATOS
Oct 20 | 8:00 am + 5:00 pm | General Site Update on Operations and Current Projects                       | Nick Balthaser, NERSC
Oct 20 | 8:30 am + 5:30 pm | GridKa Migration Progress                                                    | Doris Ressmann, KIT
Oct 20 | 9:00 am + 6:00 pm | General Site Update on Operations and Current Projects                       | Brenna Miller, ORNL
Oct 20 | 9:30 am + 6:30 pm | Birds of a Feather                                                           | All
Oct 21 | 8:00 am + 5:00 pm | Performance Tuning and Testing                                               | Jon Procknow, IBM
Oct 21 | 8:30 am + 5:30 pm | HPSS Storage Archive Small File Disk Cache Hardware Refresh                  | Gregg Gawinski, ORNL
Oct 21 | 9:00 am + 6:00 pm | Burning Issues Update                                                        | Zach Stace, IBM
Oct 21 | 9:30 am + 6:30 pm | Burning Issues Update                                                        | Zach Stace, IBM

Birds of a Feather

Host                   | Discussion
Herb Wartens, LLNL     | ZFS ZVOLs as HPSS Disk Devices (could also include info about RHEL 8 & Ansible if the audience is interested)
Francis Dequenne, LBNL | Sites' use of UDAs: assessing interest, driving adoption, interface connectivity, and the implementation of the FAIR paradigm


Come meet with us!
2021 HUF - VIRTUAL
COVID-19 has disrupted the 2021 HPSS User Forum (HUF), and the Karlsruhe Institute of Technology (KIT) in Karlsruhe, Germany, is no longer hosting the event. The 2021 HUF will instead be hosted online, across six days spread over three weeks in October 2021, with no admission cost. This will be a great opportunity to hear from HPSS users, collaboration developers, testers, support staff, and leadership (from IBM and the DOE Labs) - Learn More. Please contact us if you are not a customer but would like to attend.

HPSS @ SC21
SC21, the 2021 international conference for high performance computing, networking, storage and analysis, will be held in St. Louis, MO from November 15th through 18th, 2021 - Learn More. As we do each year, we are scheduling meetings with customers via IBM Single Client Briefings. Please contact your local IBM client executive, or contact us, to schedule an HPSS Single Client Briefing with the IBM business and technical leaders of HPSS.

HPSS @ STS 2022
The 4th Annual Storage Technology Showcase is in the planning stage, but HPSS expects to support the event in March of 2022. Check out their web site - Learn More. We expect an update in early fall 2021.

HPSS @ MSST 2022
The 37th International Conference on Massive Storage Systems and Technology will be in Santa Clara, California in May of 2022 - Learn More. Please contact us if you would like to meet with the IBM business and technical leaders of HPSS at Santa Clara University.

What's New?
DOE Announces HPSS Milestone - Todd Heer, Deputy Program Lead, Advanced Simulation and Computing (ASC) Facilities, Operations, and User Support (FOUS), announced that DOE High Performance Storage Systems (HPSS) have eclipsed one exabyte in stored data.

Atos Press Release - Atos boosts Météo-France’s data storage capacity to over 1 exabyte in 2025 to improve numerical modeling and climate predictions. Want to read more?

HPSS 9.2 Release - HPSS 9.2 was released on May 11th, 2021 and introduces eight new features and numerous minor updates.

HPSS 9.1 Release - HPSS 9.1 was released on September 24th, 2020 and introduces a few new features.

HUF 2020 - The HPSS User Forum was hosted virtually at no cost in October 2020.

HPSS 8.3 Release - HPSS 8.3 was released on March 31st, 2020 and introduces one new feature and many minor changes.

Capacity Leader - ECMWF (European Centre for Medium-Range Weather Forecasts) has a single HPSS namespace with over 567 PB spanning over 399 million files.

File-Count Leader - LLNL (Lawrence Livermore National Laboratory) has a single HPSS namespace with over 65 PB spanning 1.540 billion files.

Older News - Want to read more?