Title:
Machine Learning in Non-Stationary Environments : Introduction to Covariate Shift Adaptation.
Author:
Sugiyama, Masashi.
ISBN:
9780262301220
Physical Description:
1 online resource (279 pages)
Series:
Adaptive Computation and Machine Learning
Contents:
Foreword
Preface
I INTRODUCTION
  1 Introduction and Problem Formulation
    1.1 Machine Learning under Covariate Shift
    1.2 Quick Tour of Covariate Shift Adaptation
    1.3 Problem Formulation
    1.4 Structure of This Book
II LEARNING UNDER COVARIATE SHIFT
  2 Function Approximation
    2.1 Importance-Weighting Techniques for Covariate Shift Adaptation
    2.2 Examples of Importance-Weighted Regression Methods
    2.3 Examples of Importance-Weighted Classification Methods
    2.4 Numerical Examples
    2.5 Summary and Discussion
  3 Model Selection
    3.1 Importance-Weighted Akaike Information Criterion
    3.2 Importance-Weighted Subspace Information Criterion
    3.3 Importance-Weighted Cross-Validation
    3.4 Numerical Examples
    3.5 Summary and Discussion
  4 Importance Estimation
    4.1 Kernel Density Estimation
    4.2 Kernel Mean Matching
    4.3 Logistic Regression
    4.4 Kullback-Leibler Importance Estimation Procedure
    4.5 Least-Squares Importance Fitting
    4.6 Unconstrained Least-Squares Importance Fitting
    4.7 Numerical Examples
    4.8 Experimental Comparison
    4.9 Summary
  5 Direct Density-Ratio Estimation with Dimensionality Reduction
    5.1 Density Difference in Hetero-Distributional Subspace
    5.2 Characterization of Hetero-Distributional Subspace
    5.3 Identifying Hetero-Distributional Subspace by Supervised Dimensionality Reduction
    5.4 Using LFDA for Finding Hetero-Distributional Subspace
    5.5 Density-Ratio Estimation in the Hetero-Distributional Subspace
    5.6 Numerical Examples
    5.7 Summary
  6 Relation to Sample Selection Bias
    6.1 Heckman's Sample Selection Model
    6.2 Distributional Change and Sample Selection Bias
    6.3 The Two-Step Algorithm
    6.4 Relation to Covariate Shift Approach
  7 Applications of Covariate Shift Adaptation
    7.1 Brain-Computer Interface
    7.2 Speaker Identification
    7.3 Natural Language Processing
    7.4 Perceived Age Prediction from Face Images
    7.5 Human Activity Recognition from Accelerometric Data
    7.6 Sample Reuse in Reinforcement Learning
III LEARNING CAUSING COVARIATE SHIFT
  8 Active Learning
    8.1 Preliminaries
    8.2 Population-Based Active Learning Methods
    8.3 Numerical Examples of Population-Based Active Learning Methods
    8.4 Pool-Based Active Learning Methods
    8.5 Numerical Examples of Pool-Based Active Learning Methods
    8.6 Summary and Discussion
  9 Active Learning with Model Selection
    9.1 Direct Approach and the Active Learning/Model Selection Dilemma
    9.2 Sequential Approach
    9.3 Batch Approach
    9.4 Ensemble Active Learning
    9.5 Numerical Examples
    9.6 Summary and Discussion
  10 Applications of Active Learning
    10.1 Design of Efficient Exploration Strategies in Reinforcement Learning
    10.2 Wafer Alignment in Semiconductor Exposure Apparatus
IV CONCLUSIONS
  11 Conclusions and Future Prospects
    11.1 Conclusions
    11.2 Future Prospects
Appendix: List of Symbols and Abbreviations
Bibliography
Index
Abstract:
Theory, algorithms, and applications of machine learning techniques for overcoming the non-stationarity known as "covariate shift."
Local Note:
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2017. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.
Added Author:
Kawanabe, Motoaki.