Decision forests [electronic resource] : a unified framework for classification, regression, density estimation, manifold learning and semi-supervised learning / Antonio Criminisi, Jamie Shotton, and Ender Konukoglu.

By: Criminisi, Antonio
Contributor(s): Shotton, Jamie | Konukoglu, Ender
Material type: Text
Series: Foundations and trends in computer graphics and vision (Online) ; v. 7, issue 2-3, p. 81-227.
Publication details: Hanover, Mass. : Now Publishers, c2012.
Description: 1 electronic text ([81]-227 p.) : ill. (some col.), digital file
ISBN:
  • 9781601985415 (electronic)
Other title:
  • Unified framework for classification, regression, density estimation, manifold learning and semi-supervised learning
Subject(s):
DDC classification:
  • 511/.52 23
LOC classification:
  • QA166.2 .C753 2012
Online resources:
Available additional physical forms:
  • Also available in print.
Contents:
1. Overview and scope -- 2. The random decision forest model -- 3. Classification forests -- 4. Regression forests -- 5. Density forests -- 6. Manifold forests -- 7. Semi-supervised forests -- 8. Random ferns and other forest variants -- Appendix A. Deriving the regression information gain -- Acknowledgements.
Abstract: This review presents a unified, efficient model of random decision forests which can be applied to a number of machine learning, computer vision, and medical image analysis tasks. Our model extends existing forest-based techniques as it unifies classification, regression, density estimation, manifold learning, semi-supervised learning, and active learning under the same decision forest framework. This gives us the opportunity to write and optimize the core implementation only once, with application to many diverse tasks. The proposed model may be used both in a discriminative or generative way and may be applied to discrete or continuous, labeled or unlabeled data.

Summary: The main contributions of this review are: (1) Proposing a unified, probabilistic and efficient model for a variety of learning tasks; (2) Demonstrating margin-maximizing properties of classification forests; (3) Discussing probabilistic regression forests in comparison with other nonlinear regression algorithms; (4) Introducing density forests for estimating probability density functions; (5) Proposing an efficient algorithm for sampling from a density forest; (6) Introducing manifold forests for nonlinear dimensionality reduction; (7) Proposing new algorithms for transductive learning and active learning. Finally, we discuss how alternatives such as random ferns and extremely randomized trees stem from our more general forest model.
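The framework summarized above treats classification and regression forests as specializations of a single ensemble model. As a rough illustration only (this is not the authors' implementation; scikit-learn's off-the-shelf random forests are an assumed stand-in, and the data below is synthetic), the following Python sketch shows the same ensemble machinery fitted first to a discrete-label task and then to a continuous-output task.

# Illustrative sketch, not the code from the review: one forest model,
# specialized to classification (discrete labels) and regression (continuous targets).
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.RandomState(0)

# Classification forest: predict a discrete label from 2-D features.
X_cls = rng.randn(200, 2)
y_cls = (X_cls[:, 0] + X_cls[:, 1] > 0).astype(int)
clf = RandomForestClassifier(n_estimators=50, max_depth=6, random_state=0)
clf.fit(X_cls, y_cls)
print("class posterior for [1, 1]:", clf.predict_proba([[1.0, 1.0]])[0])

# Regression forest: same ensemble machinery, continuous output.
X_reg = rng.uniform(-3, 3, size=(200, 1))
y_reg = np.sin(X_reg).ravel() + 0.1 * rng.randn(200)
reg = RandomForestRegressor(n_estimators=50, max_depth=6, random_state=0)
reg.fit(X_reg, y_reg)
print("predicted value near sin(1.0):", reg.predict([[1.0]])[0])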
Holdings
Item type: eBook; Current library: eBook e-Library; Collection: EBook; Status: Available
Total holds: 0

Includes bibliographical references (p. 221-227).

Restricted to subscribers or individual document purchasers.

Indexed in:
  • Google Scholar
  • Google Book Search
  • INSPEC
  • Scopus
  • ACM Guide to Computing Literature
  • DBLP Computer Science Bibliography
  • Zentralblatt MATH Database
  • AMS MathSciNet
  • ACM Computing Reviews

Antonio Criminisi, Jamie Shotton and Ender Konukoglu (2012) "Decision Forests: A Unified Framework for Classification, Regression, Density Estimation, Manifold Learning and Semi-Supervised Learning", Foundations and Trends® in Computer Graphics and Vision: Vol. 7: No. 2-3, pp. 81-227.

Mode of access: World Wide Web.

System requirements: Adobe Acrobat Reader.

Title from PDF (viewed on 16 April 2012).
