Author Azim, Tayyaba, author

Title Composing Fisher kernels from deep neural models : a practitioner's approach / Tayyaba Azim, Sarah Ahmed
Published Cham, Switzerland : Springer, [2018]
©2018

Description 1 online resource
Series SpringerBriefs in computer science
Contents Intro; Preface; Acknowledgements; Contents; Acronyms; 1 Kernel Based Learning: A Pragmatic Approach in the Face of New Challenges; 1.1 Kernel Learning Framework; 1.1.1 Kernel Definition; 1.2 Characteristics of Kernel Functions; 1.3 Kernel Trick; 1.4 Types of Kernel Functions; 1.5 Challenges Faced by Kernel Methods and Recent Advances in Large-Scale Kernel Methods; References
2 Fundamentals of Fisher Kernels; 2.1 Introduction; 2.2 The Fisher Kernel; 2.2.1 Fisher Vector Normalisation; 2.2.2 Properties of Fisher Kernels; 2.2.3 Applications of Fisher Kernels; 2.2.4 Illustration of Fisher Kernel Extraction from Multivariate Gaussian Model; 2.2.5 Illustration of Fisher Kernel Derived from Gaussian Mixture Model (GMM); References
3 Training Deep Models and Deriving Fisher Kernels: A Step Wise Approach; 3.1 How to Train Deep Models?; 3.1.1 Data Preprocessing; 3.1.2 Selection of an Activation Function; 3.1.3 Selecting the Number of Hidden Layers and Hidden Units; 3.1.4 Initializing Weights of Deep Models; 3.1.5 Learning Rate; 3.1.6 The Size of Mini-Batch and Stochastic Learning; 3.1.7 Regularisation Parameter; 3.1.8 Number of Iterations of Gradient Based Algorithms; 3.1.9 Parameter Tuning: Evade Grid Search-Embrace Random Search; 3.2 Constructing Fisher Kernels from Deep Models; 3.2.1 Demonstration of Fisher Kernel Extraction from Restricted Boltzmann Machine (RBM); 3.2.2 MATLAB Implementation of Fisher Kernel Derived from Restricted Boltzmann Machine (RBM); 3.2.3 Illustration of Fisher Kernel Extraction from Deep Boltzmann Machine; 3.2.4 MATLAB Implementation of Fisher Kernel Derived from Deep Boltzmann Machine (DBM); References
4 Large Scale Image Retrieval and Its Challenges; 4.1 Condensing Deep Fisher Vectors: To Choose or to Compress?; 4.2 How to Detect Multi-collinearity?; 4.2.1 Variance Inflation Factor (VIF); 4.3 Feature Compression Methods; 4.3.1 Linear Feature Compression Methods; 4.3.2 Non-linear Feature Compression Methods; 4.4 Feature Selection Methods; 4.4.1 Feature Selection via Filter Methods; 4.4.2 Feature Selection via Wrapper Methods; 4.4.3 Feature Selection via Embedded Methods; 4.5 Hands on Fisher Vector Condensation for Large Scale Data Retrieval; 4.5.1 Minimum Redundancy and Maximum Relevance (MRMR); 4.5.2 Parametric t-SNE; References
5 Open Source Knowledge Base for Machine Learning Practitioners; 5.1 Benchmark Data Sets; 5.2 Standard Toolboxes and Frameworks: A Comparative Review; References
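
For orientation on the chapter listing above: the Fisher kernel of Chapter 2 is the standard construction of Jaakkola and Haussler, in which each example is mapped to the gradient of the model's log-likelihood (its score vector) and examples are compared through the Fisher information metric. A minimal statement of the definition, with notation chosen here purely for illustration:

    U_x = \nabla_{\theta} \log p(x \mid \theta)
    F = \mathbb{E}_{x}\left[ U_x U_x^{\top} \right]
    K(x_i, x_j) = U_{x_i}^{\top} F^{-1} U_{x_j}

In practice F is often approximated by its diagonal (or dropped altogether), and the whitened Fisher vectors F^{-1/2} U_x are used directly as features for a linear classifier.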
Summary This book shows machine learning enthusiasts and practitioners how to get the best of both worlds by deriving Fisher kernels from deep learning models. In addition, it shares insight on how to store and retrieve large-dimensional Fisher vectors using feature selection and compression techniques. Feature selection and feature compression are two of the most popular off-the-shelf methods for reducing the memory footprint of high-dimensional data, making it suitable for large-scale visual retrieval and classification. Kernel methods long remained the de facto standard for large-scale object classification with low-level features, until the revival of deep models in 2006; kernel methods made a comeback with improved Fisher vectors in 2010, but their supremacy has since been challenged by successive deep models, now considered the state of the art for many machine learning and computer vision tasks. Although the two research paradigms differ significantly, the strong performance of Fisher kernels on the ImageNet large-scale object classification dataset has caught the attention of many kernel practitioners, who have drawn parallels between the two frameworks to improve empirical performance on benchmark classification tasks. Using concrete examples on different data sets, the book compares the computational and statistical aspects of different dimensionality reduction approaches and identifies metrics that show which approach is superior for Fisher vector encodings. It also points to some of the most useful resources that give practitioners and machine learning enthusiasts a quick start on learning and implementing a variety of deep learning models and kernel functions.
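
The book's own implementations (Sections 3.2.2 and 3.2.4) are in MATLAB; as a rough, non-authoritative sketch of the workflow the summary describes, the NumPy snippet below derives an approximate Fisher vector from an already-trained restricted Boltzmann machine and builds a kernel under a diagonal approximation of the Fisher information. The function names, the CD-1 (single Gibbs step) estimate of the model expectation, and the diagonal Fisher shortcut are assumptions made here for brevity, not the authors' code.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def rbm_fisher_vector(v, W, b_vis, b_hid, rng):
        # Score of one binary input v: gradient of log p(v) w.r.t. the RBM
        # parameters, with the intractable model expectation approximated by
        # a single Gibbs step (CD-1). W has shape (n_hidden, n_visible).
        h_pos = sigmoid(W @ v + b_hid)                       # hidden probabilities given the data
        h_sample = (rng.random(h_pos.shape) < h_pos) * 1.0   # sampled hidden states
        v_neg = sigmoid(W.T @ h_sample + b_vis)              # reconstructed visible probabilities
        h_neg = sigmoid(W @ v_neg + b_hid)                   # hidden probabilities for the reconstruction
        dW = np.outer(h_pos, v) - np.outer(h_neg, v_neg)     # positive phase minus negative phase
        return np.concatenate([dW.ravel(), v - v_neg, h_pos - h_neg])

    def fisher_kernel(scores, eps=1e-8):
        # Kernel matrix from a stack of score vectors, using an empirical
        # diagonal estimate of the Fisher information for whitening.
        F_diag = np.mean(scores ** 2, axis=0) + eps
        phi = scores / np.sqrt(F_diag)    # whitened Fisher vectors, roughly F^(-1/2) U_x
        return phi @ phi.T

    # Usage (illustrative): rng = np.random.default_rng(0)
    # scores = np.stack([rbm_fisher_vector(v, W, b_vis, b_hid, rng) for v in X])
    # K = fisher_kernel(scores)

The resulting kernel matrix (or the whitened Fisher vectors themselves) can then be compressed or pruned by feature selection, which is the pipeline the summary describes for large-scale retrieval.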
Bibliography Includes bibliographical references
Notes Vendor-supplied metadata
Subject Kernel functions.
Support vector machines.
Imaging systems & technology.
Information retrieval.
Maths for computer scientists.
Data mining.
Artificial intelligence.
Pattern recognition.
MATHEMATICS -- Calculus.
MATHEMATICS -- Mathematical Analysis.
Form Electronic book
Author Ahmed, Sarah, author
ISBN 9783319985244
3319985248
9783319985251
3319985256