
Title Conformal prediction for reliable machine learning : theory, adaptations, and applications / [edited by] Vineeth Balasubramanian, Shen-Shyang Ho, Vladimir Vovk
Published Amsterdam ; Boston : Morgan Kaufmann, ©2014

Description 1 online resource
Contents Half Title; Title Page; Copyright; Copyright Permissions; Contents; Contributing Authors; Foreword; Preface; Book Organization; Part I: Theory; Part II: Adaptations; Part III: Applications; Companion Website; Contacting Us; Acknowledgments; Part I: Theory; 1 The Basic Conformal Prediction Framework; 1.1 The Basic Setting and Assumptions; 1.2 Set and Confidence Predictors; 1.2.1 Validity and Efficiency of Set and Confidence Predictors; 1.3 Conformal Prediction; 1.3.1 The Binary Case; 1.3.2 The Gaussian Case; 1.4 Efficiency in the Case of Prediction without Objects;
1.5 Universality of Conformal Predictors; 1.6 Structured Case and Classification; 1.7 Regression; 1.8 Additional Properties of Validity and Efficiency in the Online Framework; 1.8.1 Asymptotically Efficient Conformal Predictors; Acknowledgments; 2 Beyond the Basic Conformal Prediction Framework; 2.1 Conditional Validity; 2.2 Conditional Conformal Predictors; 2.2.1 Venn's Dilemma; 2.3 Inductive Conformal Predictors; 2.3.1 Conditional Inductive Conformal Predictors; 2.4 Training Conditional Validity of Inductive Conformal Predictors; 2.5 Classical Tolerance Regions;
2.6 Object Conditional Validity and Efficiency; 2.6.1 Negative Result; 2.6.2 Positive Results; 2.7 Label Conditional Validity and ROC Curves; 2.8 Venn Predictors; 2.8.1 Inductive Venn Predictors; 2.8.2 Venn Prediction without Objects; Acknowledgments; Part II: Adaptations; 3 Active Learning; 3.1 Introduction; 3.2 Background and Related Work; 3.2.1 Pool-based Active Learning with Serial Query; SVM-based methods; Statistical methods; Ensemble-based methods; Other methods; 3.2.2 Batch Mode Active Learning; 3.2.3 Online Active Learning; 3.3 Active Learning Using Conformal Prediction;
3.3.1 Query by Transduction (QBT); Algorithmic formulation; 3.3.2 Generalized Query by Transduction; Algorithmic formulation; Combining multiple criteria in GQBT; 3.3.3 Multicriteria Extension to QBT; 3.4 Experimental Results; 3.4.1 Benchmark Datasets; 3.4.2 Application to Face Recognition; 3.4.3 Multicriteria Extension to QBT; 3.5 Discussion and Conclusions; Acknowledgments; 4 Anomaly Detection; 4.1 Introduction; 4.2 Background; 4.3 Conformal Prediction for Multiclass Anomaly Detection; 4.3.1 A Nonconformity Measure for Multiclass Anomaly Detection; 4.4 Conformal Anomaly Detection;
4.4.1 Conformal Anomalies; 4.4.2 Offline versus Online Conformal Anomaly Detection; 4.4.3 Unsupervised and Semi-supervised Conformal Anomaly Detection; 4.4.4 Classification Performance and Tuning of the Anomaly Threshold; 4.5 Inductive Conformal Anomaly Detection; 4.5.1 Offline and Semi-Offline Inductive Conformal Anomaly Detection; 4.5.2 Online Inductive Conformal Anomaly Detection; 4.6 Nonconformity Measures for Examples Represented as Sets of Points; 4.6.1 The Directed Hausdorff Distance; 4.6.2 The Directed Hausdorff k-Nearest Neighbors Nonconformity Measure
Summary "Traditional, low-dimensional, small scale data have been successfully dealt with using conventional software engineering and classical statistical methods, such as discriminant analysis, neural networks, genetic algorithms and others. But the change of scale in data collection and the dimensionality of modern data sets has profound implications on the type of analysis that can be done. Recently several kernel-based machine learning algorithms have been developed for dealing with high-dimensional problems, where a large number of features could cause a combinatorial explosion. These methods are quickly gaining popularity, and it is widely believed that they will help to meet the challenge of analysing very large data sets. Learning machines often perform well in a wide range of applications and have nice theoretical properties without requiring any parametric statistical assumption about the source of data (unlike traditional statistical techniques). However, a typical drawback of many machine learning algorithms is that they usually do not provide any useful measure of confidence in the predicted labels of new, unclassified examples. Confidence estimation is a well-studied area of both parametric and non-parametric statistics; however, usually only low-dimensional problems are considered"-- Provided by publisher
Bibliography Includes bibliographical references and index
Notes Print version record
Subject Machine learning.
Machine Learning
COMPUTERS -- General.
Machine learning
Form Electronic book
Author Balasubramanian, Vineeth, editor.
Ho, Shen-Shyang, editor.
Vovk, Vladimir, 1960- editor.
ISBN 9780124017153
0124017150
1306697484
9781306697484