%0 Journal Article
%T Unsupervised Deep Feature Extraction for Remote Sensing Image Classification
%A Adriana Romero
%A Carlo Gatta
%A Gustavo Camps-Valls
%J IEEE Transactions on Geoscience and Remote Sensing
%D 2016
%V 54
%N 3
%@ 0196-2892
%F Adriana Romero2016
%O LAMP; 600.079; MILAB
%O exported from refbase (http://refbase.cvc.uab.es/show.php?record=2723), last updated on Thu, 20 Jan 2022 12:56:01 +0100
%X This paper introduces the use of single-layer and deep convolutional networks for remote sensing data analysis. The direct application of supervised (shallow or deep) convolutional networks to multi- and hyperspectral imagery is very challenging given the high input data dimensionality and the relatively small amount of available labeled data. Therefore, we propose the use of greedy layerwise unsupervised pretraining coupled with a highly efficient algorithm for unsupervised learning of sparse features. The algorithm is rooted in sparse representations and simultaneously enforces both population and lifetime sparsity of the extracted features. We successfully illustrate the expressive power of the extracted representations in several scenarios: classification of aerial scenes, as well as land-use classification from very high resolution imagery and land-cover classification from multi- and hyperspectral images. The proposed algorithm clearly outperforms standard principal component analysis (PCA) and its kernel counterpart (kPCA), as well as current state-of-the-art algorithms for aerial classification, while being extremely computationally efficient at learning representations of data. Results show that single-layer convolutional networks can extract powerful discriminative features only when the receptive field accounts for neighboring pixels, and are preferred when the classification requires high-resolution and detailed results. However, deep architectures significantly outperform single-layer variants, capturing increasing levels of abstraction and complexity throughout the feature hierarchy.
%U http://refbase.cvc.uab.es/files/RGC2016.pdf
%U http://dx.doi.org/10.1109/TGRS.2015.2478379
%P 1349-1362