Back-dropout transfer learning for action recognition

Huamin Ren, Nattiya Kanhabua, Andreas Møgelmose, Weifeng Liu, Kaustubh Kulkarni, Sergio Escalera, Xavier Baró, Thomas B. Moeslund

    Research output: Contribution to journal › Article › peer-review

    3 Citations (Scopus)


    © The Institution of Engineering and Technology 2018. Transfer learning aims to adapt a model learned on a source dataset to a target dataset. It is especially beneficial when annotating the target dataset is expensive or infeasible, and it has demonstrated powerful learning capabilities across various vision tasks. Despite its promise, how best to adapt a model learned from the source dataset to the target dataset remains an open question. One major challenge is preventing category bias from degrading classification performance: dataset bias arises when two images from the same category, but from different datasets, are not classified identically. To address this problem, the authors propose a transfer learning algorithm called negative back-dropout transfer learning (NB-TL), which takes misclassified images and applies a back-dropout strategy to them to penalize errors. Experimental results demonstrate the effectiveness of the proposed algorithm. In particular, the authors evaluate NB-TL on the UCF-101 action recognition dataset, achieving an 88.9% recognition rate.
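    The abstract only sketches the method, so the following is a minimal illustrative interpretation, not the authors' implementation: during fine-tuning, identify misclassified ("negative") samples and apply dropout to their backward signal while up-weighting what remains, so that errors are penalized stochastically. All names and hyper-parameters (`drop_rate`, `penalty`, the toy linear classifier) are assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy setup: a linear classifier W on 2-D inputs, 3 classes.
    X = rng.normal(size=(8, 2))           # batch of 8 samples
    y = rng.integers(0, 3, size=8)        # ground-truth labels
    W = rng.normal(size=(2, 3)) * 0.1     # weights being fine-tuned

    def softmax(z):
        e = np.exp(z - z.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    logits = X @ W
    probs = softmax(logits)
    miscls = probs.argmax(axis=1) != y    # the "negative" (misclassified) samples

    # Standard cross-entropy gradient with respect to the logits.
    grad_logits = probs.copy()
    grad_logits[np.arange(len(y)), y] -= 1.0

    # Back-dropout (illustrative): randomly zero part of the backward signal,
    # but only for misclassified samples, and scale up the surviving gradient
    # so those errors are penalized more strongly.
    drop_rate, penalty = 0.5, 2.0         # assumed hyper-parameters
    mask = rng.random(grad_logits.shape) >= drop_rate
    grad_logits[miscls] *= mask[miscls] * penalty

    grad_W = X.T @ grad_logits / len(y)   # gradient for the weight update
    W -= 0.1 * grad_W                     # one SGD step
    ```

    The key design point in this reading is that dropout acts on the backward pass only: the forward prediction is unchanged, while the penalized, partially dropped gradient of negative samples steers the transferred model away from source-dataset bias.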
    Original language: English
    Pages (from-to): 484-491
    Journal: IET Computer Vision
    Issue number: 4
    Publication status: Published - 1 Jun 2018

