Lossy Compression Research Helps Supercomputers Communicate Faster

December 5, 2020

Computer science PhD student Robert Underwood is tackling the critical problem of moving and storing the ever-growing volume of data produced by physics and climate simulations, intelligent transportation systems, and medical applications, using compression to reduce the volume of data as it is moved and stored. His work makes a type of compression called lossy compression more usable by the application scientists who rely on supercomputing to solve hard problems. Lossy compression creates an approximation of the original data that can be stored more compactly. Robert's work provides a consistent programming interface called LibPressio, an automated system for configuring compressors, and tools for understanding the trade-offs in lossy compression. His tools are used by multiple universities and national labs and have run on some of the world's largest supercomputers, including Summit (#2), Theta (#37), and the Clemson Palmetto supercomputer (#477).
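To make the core idea concrete: an error-bounded lossy compressor replaces each value with a nearby approximation, guaranteeing the reconstruction stays within a user-chosen bound while the approximated data compresses far better. The sketch below is a toy stand-in (uniform quantization plus deflate), not LibPressio's actual implementation or any of the real compressors it wraps:

```python
import math
import struct
import zlib

def compress_abs(data, abs_bound):
    """Toy error-bounded lossy compressor: quantize each value to a
    multiple of 2*abs_bound, then entropy-code the codes with zlib."""
    step = 2.0 * abs_bound
    raw = b"".join(struct.pack("<q", round(v / step)) for v in data)
    return zlib.compress(raw)

def decompress_abs(blob, abs_bound):
    """Invert compress_abs; every value is within abs_bound of the original."""
    step = 2.0 * abs_bound
    raw = zlib.decompress(blob)
    return [struct.unpack_from("<q", raw, i)[0] * step
            for i in range(0, len(raw), 8)]

# Smooth, simulation-like data compresses well under a small error bound.
data = [math.sin(i / 100.0) for i in range(10000)]
bound = 1e-3
blob = compress_abs(data, bound)
approx = decompress_abs(blob, bound)
max_err = max(abs(a - b) for a, b in zip(data, approx))
ratio = (len(data) * 8) / len(blob)   # uncompressed doubles vs. compressed bytes
```

Tightening `bound` improves fidelity but shrinks the compression ratio; exploring exactly that trade-off is what the tooling described above supports.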

This project is funded by the US Department of Energy's Exascale Computing Project (ECP), Project Number 17-SC-20-SC. A paper on this work is: R. Underwood, S. Di, J. C. Calhoun, and F. Cappello, "FRaZ: A Generic High-Fidelity Fixed-Ratio Lossy Compression Framework for Scientific Floating-point Data," 2020 IEEE International Parallel and Distributed Processing Symposium (IPDPS), New Orleans, LA, USA, 2020, pp. 567-577. doi:10.1109/IPDPS47924.2020.00065.
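FRaZ inverts the usual error-bounded interface: instead of specifying an error bound, the user requests a target compression ratio, and the framework iteratively searches for a bound that achieves it. The sketch below illustrates that idea with a bisection search over a toy quantize-and-deflate compressor; it is an assumption-laden illustration of the fixed-ratio concept, not the actual FRaZ algorithm or its real compressor backends:

```python
import math
import struct
import zlib

def compress_abs(data, abs_bound):
    # Toy error-bounded lossy compressor: uniform quantization + deflate.
    step = 2.0 * abs_bound
    raw = b"".join(struct.pack("<q", round(v / step)) for v in data)
    return zlib.compress(raw)

def bound_for_ratio(data, target_ratio, lo=1e-8, hi=1.0, iters=30):
    """Search for an error bound whose achieved compression ratio meets
    target_ratio. The ratio generally grows with the bound, so bisection
    (on a log scale, since bounds span many decades) converges."""
    uncompressed = len(data) * 8
    for _ in range(iters):
        mid = math.sqrt(lo * hi)          # geometric midpoint
        ratio = uncompressed / len(compress_abs(data, mid))
        if ratio < target_ratio:
            lo = mid                      # bound too tight: ratio too low
        else:
            hi = mid                      # hi always meets the target
    return hi

data = [math.sin(i / 100.0) for i in range(10000)]
bound = bound_for_ratio(data, target_ratio=20.0)
achieved = (len(data) * 8) / len(compress_abs(data, bound))
```

A fixed-ratio interface like this lets scientists budget storage and bandwidth directly, while the search keeps the error bound as tight as the budget allows.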
