TY - JOUR
T1 - High-throughput variable-to-fixed entropy codec using selective, stochastic code forests
AU - Torres, Manuel Martinez
AU - Hernández-Cabronero, Miguel
AU - Blanes, Ian
AU - Serra-Sagristà, Joan
PY - 2020/1/1
Y1 - 2020/1/1
N2 - Efficient high-throughput (HT) compression algorithms are paramount to meeting the stringent constraints of present and upcoming data storage, processing, and transmission systems. In particular, latency, bandwidth, and energy requirements are critical for those systems. Most HT codecs are designed to maximize compression speed and, secondarily, to minimize compressed lengths. On the other hand, decompression speed is often equally or more critical than compression speed, especially in scenarios where decompression is performed multiple times and/or at critical parts of a system. In this work, an algorithm is proposed for designing variable-to-fixed (VF) codes that prioritize decompression speed. Stationary Markov analysis is employed to generate multiple, jointly optimized codes (denoted code forests). Their average compression efficiency is on par with the state of the art in VF codes, e.g., within 1% of Yamamoto et al.'s algorithm. The proposed code forest structure enables the implementation of highly efficient codecs, with decompression speeds 3.8 times faster than other state-of-the-art HT entropy codecs offering equal or better compression ratios for natural data sources. Compared to these HT codecs, the proposed forests yield similar compression efficiency and speeds.
KW - Data compression
KW - high-throughput entropy coding
KW - variable-to-fixed codes
UR - http://www.scopus.com/inward/record.url?scp=85084956163&partnerID=8YFLogxK
DO - 10.1109/access.2020.2991314
M3 - Article
AN - SCOPUS:85084956163
SN - 2169-3536
VL - 8
SP - 81283
EP - 81297
JO - IEEE Access
JF - IEEE Access
IS - 1
M1 - 9081983
ER -