High-throughput variable-to-fixed entropy codec using selective, stochastic code forests

Manuel Martinez Torres, Miguel Hernandez-Cabronero*, Ian Blanes, Joan Serra-Sagrista

*Corresponding author of this work

Research output: Contribution to journal › Article › Research › peer-reviewed

1 Citation (Scopus)


Efficient high-throughput (HT) compression algorithms are paramount to meet the stringent constraints of present and upcoming data storage, processing, and transmission systems. In particular, latency, bandwidth, and energy requirements are critical for those systems. Most HT codecs are designed to maximize compression speed and, secondarily, to minimize compressed lengths. On the other hand, decompression speed is often equally or more critical than compression speed, especially in scenarios where decompression is performed multiple times and/or at critical parts of a system. In this work, an algorithm to design variable-to-fixed (VF) codes is proposed that prioritizes decompression speed. Stationary Markov analysis is employed to generate multiple, jointly optimized codes (denoted code forests). Their average compression efficiency is on par with the state of the art in VF codes, e.g., within 1% of Yamamoto et al.'s algorithm. The proposed code forest structure enables the implementation of highly efficient codecs, with decompression speeds 3.8 times faster than those of other state-of-the-art HT entropy codecs with equal or better compression ratios for natural data sources. Compared to these HT codecs, the proposed forests yield similar compression efficiency and speeds.
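The paper's code-forest construction itself is not reproduced here. As background only, the sketch below implements a classic variable-to-fixed scheme (a Tunstall code) to illustrate why VF decoding lends itself to high throughput: each fixed-length output index maps back to a source phrase with a single table lookup. The symbol probabilities and the `nbits` parameter are illustrative assumptions, not values from the paper.

```python
import heapq

def build_tunstall(probs, nbits):
    """Build a Tunstall dictionary: at most 2**nbits source phrases,
    each assigned a fixed-length index of nbits bits."""
    max_leaves = 2 ** nbits
    # Max-heap on phrase probability via negated priorities.
    heap = [(-p, (s,)) for s, p in probs.items()]
    heapq.heapify(heap)
    # Expanding one leaf into |alphabet| children adds |alphabet|-1 leaves.
    while len(heap) + len(probs) - 1 <= max_leaves:
        negp, phrase = heapq.heappop(heap)
        for s, p in probs.items():
            heapq.heappush(heap, (negp * p, phrase + (s,)))
    leaves = sorted(phrase for _, phrase in heap)
    return {phrase: i for i, phrase in enumerate(leaves)}

def vf_encode(data, codebook):
    """Greedily parse the input into dictionary phrases; the parse tree is
    complete, so every prefix either is a leaf or extends to one.
    Returns the fixed-length indices and any unparsed tail."""
    indices, phrase = [], ()
    for s in data:
        phrase = phrase + (s,)
        if phrase in codebook:
            indices.append(codebook[phrase])
            phrase = ()
    return indices, phrase

def vf_decode(indices, codebook):
    """Decompression is a pure table lookup per fixed-length index,
    which is what makes VF decoding fast."""
    inv = {i: ph for ph, i in codebook.items()}
    out = []
    for i in indices:
        out.extend(inv[i])
    return out
```

For example, with `probs = {'a': 0.7, 'b': 0.2, 'c': 0.1}` and `nbits = 3`, the dictionary grows to 7 phrases (the next expansion would exceed 8), so frequent runs of `'a'` compress into single 3-bit indices while decoding remains one lookup per index.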

Original language: US English
Article number: 9081983
Pages (from-to): 81283-81297
Number of pages: 15
Journal: IEEE Access
Status: Published - 1 Jan 2020


