Probability models for highly parallel image coding architecture

Research output: Contribution to journal › Article › peer-review



A key aspect of image coding systems is the probability model employed to code the data. The more precise the probability estimates inferred by the model, the higher the coding efficiency achieved. In general, probability models adjust their estimates after coding every new symbol. The main difficulty in applying such a strategy to a highly parallel coding engine is that many symbols are coded simultaneously, so probability adaptation requires a different approach. The strategy employed in previous works utilizes stationary estimates collected a priori from a training set. Its main drawback is that the statistics depend on the image type, so different images require different training sets. This work introduces two probability models for a highly parallel architecture that, similarly to conventional systems, adapt probabilities while coding data. One of the proposed models estimates probabilities through a finite state machine, while the other employs the statistics of already coded symbols via a sliding window. Experimental results indicate that the latter approach outperforms the other models, including those of JPEG2000 and High Throughput JPEG2000, at medium and high rates with only a slight increase in computational complexity.
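The two adaptation strategies described above can be sketched in a few lines of Python. This is a minimal illustrative sketch, not the paper's actual formulation: the class names, window length, state table, and Laplace smoothing are all assumptions introduced here for clarity.

```python
from collections import deque


class SlidingWindowEstimator:
    """Estimate P(bit = 1) from the last `window` coded bits.

    Illustrative sketch only; the window length and Laplace smoothing
    are assumptions, not the paper's parameters.
    """

    def __init__(self, window=64):
        # deque with maxlen discards the oldest bit automatically
        self.history = deque(maxlen=window)

    def probability_of_one(self):
        # Laplace-smoothed estimate so probabilities stay in (0, 1)
        ones = sum(self.history)
        return (ones + 1) / (len(self.history) + 2)

    def update(self, bit):
        self.history.append(bit)


# Tiny illustrative finite state machine: each state carries a fixed
# estimate of P(1) and moves toward higher/lower states as 1s/0s are
# observed. Real coders (e.g. the JPEG2000 MQ-coder) use much larger,
# carefully designed state tables.
FSM_P1 = [0.1, 0.3, 0.5, 0.7, 0.9]  # per-state estimates of P(1)


class FSMEstimator:
    def __init__(self):
        self.state = 2  # start at the uninformed middle state

    def probability_of_one(self):
        return FSM_P1[self.state]

    def update(self, bit):
        if bit:
            self.state = min(self.state + 1, len(FSM_P1) - 1)
        else:
            self.state = max(self.state - 1, 0)


sw = SlidingWindowEstimator(window=8)
for b in [1, 1, 0, 1]:
    sw.update(b)
print(sw.probability_of_one())  # (3 + 1) / (4 + 2) = 2/3
```

A key design difference the abstract points to: the FSM needs only a small integer of state per context, whereas the sliding window stores recent symbols, which explains the slight increase in complexity it reports.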

Original language: English
Article number: 116914
Number of pages: 8
Journal: Signal Processing: Image Communication
Early online date: 30 Dec 2022
Publication status: Published - Mar 2023


  • Entropy coding
  • Image coding
  • Parallel computing
  • Probability models


