Beyond one-hot encoding: Lower dimensional target embedding

Pau Rodríguez, Miguel A. Bautista, Jordi Gonzàlez, Sergio Escalera

Research output: Contribution to journal › Article › Peer-reviewed

306 Citations (Scopus)

Abstract

Target encoding plays a central role when training Convolutional Neural Networks. In this realm, one-hot encoding is the most prevalent strategy due to its simplicity. However, this widespread encoding scheme assumes a flat label space, thus ignoring rich relationships among labels that could be exploited during training. In large-scale datasets, data does not span the full label space, but instead lies on a low-dimensional output manifold. Following this observation, we embed the targets into a low-dimensional space, drastically improving convergence speed while preserving accuracy. Our contribution is twofold: (i) we show that random projections of the label space are a valid tool for finding such lower-dimensional embeddings, dramatically boosting convergence rates at zero computational cost; and (ii) we propose a normalized eigenrepresentation of the class manifold that encodes the targets with minimal information loss, improving the accuracy of random-projection encoding while enjoying the same convergence rates. Experiments on CIFAR-100, CUB200-2011, ImageNet, and MIT Places demonstrate that the proposed approach drastically improves convergence speed while reaching very competitive accuracy.
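As a rough illustration of the random-projection idea described in the abstract, the sketch below replaces one-hot targets with rows of a random Gaussian matrix and decodes predictions by a nearest-row search. It is not the authors' implementation; the dimensions, scaling, and function names are assumptions chosen for the example.

```python
import numpy as np

def random_target_embedding(num_classes, dim, seed=0):
    """Illustrative sketch: map each class to a random low-dimensional target.

    Returns a (num_classes, dim) matrix whose rows serve as regression
    targets in place of one-hot vectors. Scaling by 1/sqrt(dim) keeps the
    rows at roughly unit norm (an assumption of this sketch, not a detail
    taken from the paper).
    """
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, 1.0 / np.sqrt(dim), size=(num_classes, dim))

# Hypothetical setting: 100 classes (e.g. CIFAR-100) embedded into 32 dimensions.
num_classes, dim = 100, 32
P = random_target_embedding(num_classes, dim)

labels = np.array([3, 17, 42])        # integer class labels for a mini-batch
targets = P[labels]                    # low-dimensional targets the network regresses

# At test time, a predicted embedding is mapped back to a class label by
# finding the most similar row of P (here via a dot-product search).
pred = targets + 0.01 * np.random.default_rng(1).normal(size=targets.shape)
recovered = np.argmax(pred @ P.T, axis=1)
print(recovered)  # -> [ 3 17 42] when the predictions are close to the targets
```

In this reading, the network is trained with a regression-style loss against the `dim`-dimensional targets rather than a 100-way softmax, which is what allows the output layer to shrink while the decoding step recovers discrete labels.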
Original language: English
Pages (from-to): 21-31
Journal: Image and Vision Computing
Volume: 75
DOIs
Publication status: Published - 1 Jul 2018
