High Performance Storage System
Fifteen new HPSS Client API-based tools
HPSS standalone API tools are available in $HPSS_ROOT/bin. They are named [hpss]
Support Auto Change COS of files during migration
With this feature, HPSS administrators can enable an Automatic Change Class of Service (COS) operation on a per-COS basis. When this new COS flag is turned on, files within this COS that fall outside the Minimum File Size and Maximum File Size will, upon disk migration, have their COS changed to a target COS with the appropriate file size range. Some use cases (a conceptual sketch of the selection rule follows the list):
- Site has files that come into its default COS without a file size (e.g., users running a non-HPSS FTP client on their desktops).
- Site is using a COS for an HPSSFS Fuse mount point, Enforce Maximum File Size is off, and the file sizes are not within a consistent range.
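The size-based retargeting rule is simple enough to illustrate. Below is a minimal, purely conceptual sketch of the behavior described above; the names (CosPolicy, pick_target_cos) are hypothetical and are not part of the HPSS Client API, since the actual retargeting is performed internally by HPSS during disk migration.

```python
# Conceptual sketch only; not HPSS code. All names here are hypothetical.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CosPolicy:
    cos_id: int
    min_file_size: int     # Minimum File Size, in bytes
    max_file_size: int     # Maximum File Size, in bytes
    auto_change_cos: bool  # the new per-COS flag described above

def pick_target_cos(file_size: int, current: CosPolicy,
                    candidates: List[CosPolicy]) -> Optional[int]:
    """At disk migration time, decide whether a file should be moved to a
    COS whose file size range actually covers the file's size."""
    if not current.auto_change_cos:
        return None      # feature not enabled for this COS
    if current.min_file_size <= file_size <= current.max_file_size:
        return None      # size already within range; keep the current COS
    for cos in candidates:
        if cos.min_file_size <= file_size <= cos.max_file_size:
            return cos.cos_id   # first COS with an appropriate size range
    return None          # no suitable target; file keeps its COS
```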
File-Family Merge Tool to remove files and tapes from a file family
Moves all files and media belonging to one file family to another file family, and removes the original file family.
DB Partition Manager for Rumbler History Tables
This feature provides a database utility for managing the partitions of the Rumbler History tables. The utility adds and deletes partitions for these tables so that a bounded window of historical data remains accessible.
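As a rough illustration of the rolling-window idea (not the utility's actual interface), the sketch below decides which monthly partitions to add and which to drop for a history table; the monthly granularity, the retention value, and all names are assumptions.

```python
# Illustrative sketch only; names, granularity, and retention are assumptions,
# not the behavior or interface of the HPSS partition-manager utility.
from datetime import date
from typing import List, Set, Tuple

RETENTION_MONTHS = 6  # hypothetical: how many months of history to keep online

def months_back(d: date, n: int) -> date:
    """Return the first day of the month n months before d's month."""
    y, m = divmod(d.month - 1 - n, 12)
    return date(d.year + y, m + 1, 1)

def plan_partition_actions(existing: Set[date],
                           today: date) -> Tuple[List[date], List[date]]:
    """Compute (partitions_to_add, partitions_to_drop) so that exactly the
    last RETENTION_MONTHS monthly partitions remain for a history table."""
    wanted = {months_back(today, i) for i in range(RETENTION_MONTHS)}
    to_add = sorted(wanted - existing)
    to_drop = sorted(existing - wanted)
    return to_add, to_drop

# Example: keep the March..August partitions, drop anything older.
adds, drops = plan_partition_actions({date(2022, 2, 1), date(2022, 3, 1)},
                                     today=date(2022, 8, 15))
```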
Client API for aborting I/O
This feature allows client applications to abort I/O. The following commands are available for admin use via scrub:
- abortreq
- abortuser
Set accountID through FTP
The following new PFTP client commands are provided:
- account - change the account of this session to the given account id
- site getacct - display the account associated with this session or with the given file
- site setacct - set the account associated with the given file to the given account id
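If the same operations are driven from a script rather than the interactive PFTP client, they would presumably map onto FTP protocol verbs (ACCT for account, SITE for the rest). The sketch below uses Python's ftplib to show the idea; the host, credentials, file path, account id, and especially the SITE argument order are assumptions, so treat the HPSS PFTP documentation as the authority on syntax.

```python
# Hypothetical sketch; host, credentials, path, account id, and the exact
# SITE argument order are assumptions, not documented HPSS syntax.
from ftplib import FTP

ftp = FTP("hpss-ftp.example.org")  # placeholder HPSS FTP gateway
ftp.login("user", "password")

# "account": change the account of this session (standard FTP ACCT verb)
print(ftp.sendcmd("ACCT 12345"))

# "site getacct": show the account for the session, or for a given file
print(ftp.sendcmd("SITE GETACCT"))
print(ftp.sendcmd("SITE GETACCT /home/user/somefile"))

# "site setacct": associate an account id with a file (argument order assumed)
print(ftp.sendcmd("SITE SETACCT /home/user/somefile 12345"))

ftp.quit()
```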
Add support for buffered tape marks
HPSS supports buffered tape marks for all supported tape and PVR types. Refer to the 'Storage Class' section of the HPSS Management Guide for more information on buffered tape marks.
2022 HUF - The 2022 HPSS User Forum (HUF) will be an in-person event scheduled October 24-28, 2022, in Houston, TX. Please check back for registration details. This will be a great opportunity to hear from HPSS users, collaboration developers, testers, support folks, and leadership (from IBM and DOE Labs). Please contact us if you are not a customer but would like to attend.
HPSS @ SC22 - The 2022 international conference for high performance computing, networking, storage and analysis will be in Dallas, TX from November 14th through 17th, 2022 - Learn More. As we have done each year (pre-pandemic), we are scheduling meetings with customers via IBM Single Client Briefings. Please contact your local IBM client executive or contact us to schedule an HPSS Single Client Briefing to meet with the IBM business and technical leaders of HPSS.
HPSS @ STS 2023 - The 4th Annual Storage Technology Showcase has been postponed, but HPSS expects to support the event when it returns. Check out their web site - Learn More.
HPSS @ MSST 2023 - The 37th International Conference on Massive Storage Systems and Technology will be in Santa Clara, California in May 2023 - Learn More. Please contact us if you would like to meet with the IBM business and technical leaders of HPSS at Santa Clara University.
HPSS @ ISC 2023 - ISC 2023 is the event for high performance computing, machine learning, and data analytics, and will be in Hamburg, Germany from May 21st through May 25th, 2023 - Learn More. As we have done each year (pre-pandemic), we are scheduling meetings with folks attending the conference. Please contact us to meet with the IBM business and technical leaders of HPSS.
Celebrating 30 Years - 2022 marks the 30th anniversary of the High Performance Storage System (HPSS) Collaboration.
HPSS 10.1 Release - HPSS 10.1 was released on September 30th, 2022 and introduces fourteen new features and numerous minor updates.
Lots of Data - In March 2022, IBM/HPSS delivered a storage solution to a customer in Canada, and demonstrated a sustained tape ingest rate of 33 GB/sec (2.86 PB/day peak tape ingest x 2 for dual copy), while simultaneously demonstrating a sustained tape recall rate of 24 GB/sec (2.0 PB/day peak tape recall). HPSS pushed six 18-frame IBM TS4500 tape libraries (scheduled to house over 1.6 Exabytes of tape media) to over 3,000 mounts/hour.
HPSS 9.3 Release - HPSS 9.3 was released on December 14th, 2021 and introduces eight new features and numerous minor updates.
HUF 2021 - The HPSS User Forum was hosted virtually at no cost in October 2021.
DOE Announces HPSS Milestone - Todd Heer, Deputy Program Lead, Advanced Simulation and Computing (ASC) Facilities, Operations, and User Support (FOUS), announced that DOE High Performance Storage Systems (HPSS) eclipse one exabyte in stored data.
Atos Press Release - Atos boosts Météo-France's data storage capacity to over 1 exabyte in 2025 to improve numerical modeling and climate predictions. Want to read more?
HPSS 9.2 Release - HPSS 9.2 was released on May 11th, 2021 and introduces eight new features and numerous minor updates.
Capacity Leader - ECMWF (European Center for Medium-Range Weather Forecasts) has a single HPSS namespace with over 739 PB spanning over 493 million files.
File-Count Leader - LLNL (Lawrence Livermore National Laboratory) has a single HPSS namespace with over 76 PB spanning 1.699 billion files.
Older News - Want to read more? |