NICE: A computational solution to close the gap from colour perception to colour categorization

C. Alejandro Parraga, Arash Akbarinia

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)


© 2016 Parraga, Akbarinia. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

The segmentation of visible electromagnetic radiation into chromatic categories by the human visual system has been extensively studied from a perceptual point of view, resulting in several colour appearance models. However, there is currently a void when it comes to relating these results to the physiological mechanisms that are known to shape the pre-cortical and cortical visual pathway. This work intends to begin to fill this void by proposing a new physiologically plausible model of colour categorization based on Neural Isoresponsive Colour Ellipsoids (NICE) in the cone-contrast space defined by the main directions of the visual signals entering the visual cortex. The model was adjusted to fit psychophysical measures that concentrate on the categorical boundaries and are consistent with the ellipsoidal isoresponse surfaces of visual cortical neurons. By revealing the shape of such categorical colour regions, our measures allow for a more precise and parsimonious description, connecting well-known early visual processing mechanisms to the less understood phenomenon of colour categorization. To test the feasibility of our method we applied it to exemplary images and a popular ground-truth chart, obtaining labelling results that are better than those of current state-of-the-art algorithms.
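The core idea described in the abstract — colour categories as ellipsoidal regions in a cone-contrast space — can be sketched as a quadratic-form membership test. The following is a minimal illustration only; the category names, centres, and ellipsoid axes are invented example values, not the fitted parameters from the paper.

```python
import numpy as np

def ellipsoid_response(x, centre, inv_cov):
    """Quadratic-form distance of colour x from a category centre.
    Values <= 1 fall inside the ellipsoidal isoresponse surface."""
    d = x - centre
    return float(d @ inv_cov @ d)

def categorize(x, categories):
    """Assign x to the category whose ellipsoid it fits best
    (smallest normalized distance)."""
    return min(categories, key=lambda name: ellipsoid_response(x, *categories[name]))

# Toy categories in a 3-D cone-contrast space: each is a (centre,
# inverse covariance) pair defining one ellipsoid. Hypothetical values.
categories = {
    "reddish":  (np.array([0.5, 0.0, 0.0]), np.diag([4.0, 1.0, 1.0])),
    "greenish": (np.array([-0.5, 0.0, 0.0]), np.diag([4.0, 1.0, 1.0])),
}

print(categorize(np.array([0.4, 0.1, 0.0]), categories))  # reddish
```

Labelling an image pixel-by-pixel, as the paper does for exemplary images and a ground-truth chart, would amount to applying such a test to each pixel's cone-contrast coordinates.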
Original language: English
Article number: e0149538
Journal: PLoS ONE
Publication status: Published - 1 Mar 2016


