ENSEMBLE CLASSIFIERS FOR HYPERSPECTRAL CLASSIFICATION

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › Research

Abstract

Machine learning algorithms are methods developed to deal with large volumes of data with high efficiency. Adaboost has been among the most popular and promising algorithms of the last decade and has demonstrated its potential for the classification of remote sensing data. Previous studies have shown that Adaboost, though less stable than bagging (another well-known ensemble classification algorithm), consistently produces higher accuracies in classification tasks across a wide variety of data domains. The use of Adaboost for hyperspectral classification, however, has not been fully explored. Like Adaboost, Random Forest is another recently proposed bootstrap method that generates numerous classifiers, up to hundreds, for classification. Using the same resampling strategy as bagging, Random Forest introduces a new feature, called out-of-bag samples, for feature ranking and evaluation. Its only tuning parameter is the number of features to split on at each node, which is reported to have little effect on accuracy. Comparatively, Adaboost has no parameters except for the amount of pruning, which is zero when using Random Forest. In this paper, we compare the results obtained with both classifiers on hyperspectral data. Results from two applications, one on ecotope mapping and one on urban mapping, are presented. Compared with a single decision tree classifier, Adaboost increases classification accuracy by 9% and Random Forest by 13%. Both classifiers achieve comparable results in terms of overall accuracy. Random Forest, however, is more efficient because it uses only a random feature subset at each split and performs no pruning. Our results show that both Adaboost and Random Forest are exceptionally fast in training and achieve higher accuracies than accurate classifiers such as Multi-Layer Perceptrons. Their limited demands on the user's input for parameter tuning make them ideal algorithms for operationally oriented tasks.
The study demonstrates that Adaboost and Random Forest perform well with hyperspectral data, in terms of both accuracy and ease-of-use.
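The contrast the abstract draws, a single decision tree versus an ensemble of boosted weak learners, can be illustrated with a minimal sketch of the AdaBoost mechanism using decision stumps. This is a generic illustration in pure NumPy on toy interval data, not the paper's hyperspectral datasets or its actual implementation:

```python
import numpy as np

def stump_predict(X, feat, thresh, polarity):
    # A decision stump: predict +1 on one side of the threshold, -1 on the other.
    return polarity * np.where(X[:, feat] <= thresh, 1, -1)

def fit_adaboost(X, y, n_rounds=10):
    """Toy AdaBoost: each round selects the stump with the lowest weighted
    error, then reweights samples so misclassified points get more attention."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # start with uniform sample weights
    ensemble = []
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        for feat in range(X.shape[1]):           # exhaustive stump search
            for thresh in np.unique(X[:, feat]):
                for polarity in (1, -1):
                    pred = stump_predict(X, feat, thresh, polarity)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (feat, thresh, polarity)
        err = max(best_err, 1e-10)               # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)    # weak learner's vote weight
        pred = stump_predict(X, *best)
        w *= np.exp(-alpha * y * pred)           # boost weights of mistakes
        w /= w.sum()
        ensemble.append((alpha, best))
    return ensemble

def predict(ensemble, X):
    # Final prediction: sign of the weighted vote of all weak learners.
    score = sum(alpha * stump_predict(X, *params) for alpha, params in ensemble)
    return np.sign(score)
```

On a toy 1-D problem where the positive class occupies an interval (labels -1, -1, +1, +1, -1, -1 for x = 0..5), no single stump can classify all six points, yet three boosted stumps fit the data exactly. This mirrors, in miniature, the accuracy gain over a single tree that the abstract reports.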
Original language: English
Title of host publication: Proceedings 5th EARSeL Workshop on Imaging Spectroscopy
Publisher: European Association of Remote Sensing Laboratories
Publication status: Published - 25 Apr 2007

Publication series

Name: Proceedings 5th EARSeL Workshop on Imaging Spectroscopy


Keywords

  • ensemble classification
  • random forest
  • adaboost
  • boosting

