1. Semi-supervised learning [2006]
- Cambridge, Mass. : MIT Press, c2006.
- Description
- Book — x, 508 p. : ill. ; 26 cm.
- Summary
-
A comprehensive review of an area of machine learning that deals with the use of unlabeled data in classification problems: state-of-the-art algorithms, a taxonomy of the field, applications, benchmark experiments, and directions for future research. In the field of machine learning, semi-supervised learning (SSL) occupies the middle ground between supervised learning (in which all training examples are labeled) and unsupervised learning (in which no label data are given). Interest in SSL has increased in recent years, particularly because of application domains in which unlabeled data are plentiful, such as images, text, and bioinformatics. This first comprehensive overview of SSL presents state-of-the-art algorithms, a taxonomy of the field, selected applications, benchmark experiments, and perspectives on ongoing and future research.

Semi-Supervised Learning first presents the key assumptions and ideas underlying the field: smoothness, cluster or low-density separation, manifold structure, and transduction. The core of the book is the presentation of SSL methods, organized according to algorithmic strategies. After an examination of generative models, the book describes algorithms that implement the low-density separation assumption, graph-based methods, and algorithms that perform two-step learning. The book then discusses SSL applications and offers guidelines for SSL practitioners by analyzing the results of extensive benchmark experiments. Finally, the book looks at promising directions for SSL research and closes with a discussion of the relationship between semi-supervised learning and transduction. (A minimal graph-based label-propagation sketch follows this entry.)
(source: Nielsen Book Data)
Engineering Library (Terman)

| Location | Call number | Status |
|---|---|---|
| Stacks | Q325.75 .S42 2006 | Unknown |
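The summary above groups SSL algorithms into generative, low-density-separation, graph-based, and two-step families. As a rough illustration of the graph-based family only (not code from the book), the following Python sketch solves the classic harmonic-function label-propagation system on a dense RBF affinity graph; the toy data, the bandwidth `sigma`, and the helper names `rbf_affinity` and `propagate_labels` are all assumptions made for this example.

```python
# Minimal sketch of graph-based label propagation (harmonic-function style).
# All data and parameters below are illustrative only.
import numpy as np

def rbf_affinity(X, sigma=1.0):
    """Dense RBF affinity matrix with zeroed diagonal (no self-edges)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return W

def propagate_labels(X, y, labeled_mask, sigma=1.0, n_classes=2):
    """Harmonic-function propagation: labeled nodes are clamped and each
    unlabeled node's score is the graph-weighted average of its neighbours."""
    W = rbf_affinity(X, sigma)
    L = np.diag(W.sum(axis=1)) - W                      # unnormalized graph Laplacian
    u = ~labeled_mask
    Y_l = np.eye(n_classes)[y[labeled_mask]]            # one-hot labels of labeled nodes
    # Harmonic solution: f_u = -L_uu^{-1} L_ul Y_l
    f_u = np.linalg.solve(L[np.ix_(u, u)], -L[np.ix_(u, labeled_mask)] @ Y_l)
    pred = np.empty(len(X), dtype=int)
    pred[labeled_mask] = y[labeled_mask]
    pred[u] = f_u.argmax(axis=1)
    return pred

# Toy usage: two Gaussian blobs with a single labeled example per class.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(2.0, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
labeled = np.zeros(40, dtype=bool)
labeled[[0, 20]] = True
print(propagate_labels(X, y, labeled, sigma=0.5))
```

The clamped labels and neighbour-averaged scores are the smoothness assumption named in the summary made literal: nearby points on the graph receive similar label scores.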
2. Semi-supervised learning : background, applications and future directions [2018]
- New York : Nova Science Publishers, [2018]
- Description
- Book — 1 online resource (241 pages)
- Summary
-
- Intro; Semi-Supervised Learning: Background, Applications and Future Directions; Contents; Preface (Introduction to This Book; Target Audience; Acknowledgments)
- Chapter 1. Constrained Data Self-Representative Graph Construction
  - Abstract; 1. Introduction; 2. Constrained Data Self-Representative Graph Construction
  - 3. Kernelized Variants: 3.1. Hilbert Space; 3.2. Column Generation
  - 4. Performance Evaluation: 4.1. Label Propagation (4.1.1. Gaussian Random Fields; 4.1.2. Local and Global Consistency); 4.2. Experimental Results (4.2.1. Comparison among Several Graph Construction Methods; 4.2.2. Stability of the Proposed Method; 4.2.3. Sensitivity to Parameters; 4.2.4. Computational Complexity and CPU Time)
  - Acknowledgments; Conclusion; References
- Chapter 2. Injecting Randomness into Graphs: An Ensemble Semi-Supervised Learning Framework
  - Abstract; 1. Introduction
  - 2. Background: 2.1. Graph-Based Semi-Supervised Learning; 2.2. Ensemble Learning and Random Forests; 2.3. Anchor Graph
  - 3. Random Multi-Graphs: 3.1. Problem Formulation; 3.2. Algorithm; 3.3. Graph Construction; 3.4. Semi-Supervised Inference; 3.5. Inductive Extension; 3.6. Randomness as Regularization
  - 4. Experiments: 4.1. Data Sets; 4.2. Experimental Results; 4.3. Impact of Parameters; 4.4. Hyperspectral Image Classification
  - Acknowledgments; Conclusion; References
- Chapter 3. Label Propagation via Kernel Flexible Manifold Embedding
  - Abstract; 1. Introduction
  - 2. Related Work: 2.1. Semi-Supervised Discriminant Analysis; 2.2. Semi-Supervised Discriminant Embedding; 2.3. Laplacian Regularized Least Square; 2.4. Review of the Flexible Manifold Embedding Framework
  - 3. Kernel Flexible Manifold Embedding: 3.1. The Objective Function; 3.2. Optimal Solution; 3.3. The Algorithm; 3.4. Difference between KFME and Existing Methods (3.4.1. Difference between KFME and FME; 3.4.2. Difference between KFME and Other Methods)
  - 4. Experimental Results: 4.1. Datasets; 4.2. Method Comparison; 4.3. Results Analysis; 4.4. Stability with Respect to Graph
  - Acknowledgments; Conclusion; References
- Chapter 4. Fast Graph-Based Semi-Supervised Learning and Its Applications
  - Abstract; 1. Introduction
  - 2. Related Work: 2.1. Scalable Graph-Based SSL/TL Methods; 2.2. Scalable Graph Construction Methods; 2.3. Robust Graph-Based SSL/TL Methods
  - 3. Minimum Tree Cut Method: 3.1. Notations; 3.2. The Proposed Method; 3.3. The Tree Labeling Algorithm; 3.4. Generate a Spanning Tree from a Graph
  - 4. Insensitiveness to Graph Construction
  - 5. Experiments: 5.1. Data Set (5.1.1. UCI Data Set; 5.1.2. Image; 5.1.3. Text); 5.2. Graph Construction; 5.3. Accuracy; 5.4. Speed; 5.5. Robustness; 5.6. Effect of Different Spanning Tree and Ensemble of Multiple Spanning Trees
  - 6. Applications in Text Extraction: 6.1. Interactive Text Extraction in Natural Scene Images; 6.2. Document Image Binarization
  - Conclusion and Future Work; References
3. Graph-based semi-supervised learning [2014]
- Subramanya, Amarnag, author.
- Cham, Switzerland : Springer, [2014]
- Description
- Book — 1 online resource (xiii, 111 pages) : illustrations
- Summary
-
- Introduction; Graph Construction; Learning and Inference; Scalability; Applications; Future Work; Bibliography; Authors' Biographies; Index. (A rough graph-construction sketch follows this entry.)
- (source: Nielsen Book Data)
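The chapter list above puts graph construction ahead of learning and inference, since every graph-based SSL method first needs an affinity graph over labeled and unlabeled points. As a hedged, generic sketch of that preliminary step (not the authors' own algorithm), here is a symmetric k-nearest-neighbour graph with RBF edge weights; `k`, `sigma`, and the toy data are arbitrary choices for illustration.

```python
# Sketch of symmetric k-NN graph construction with RBF edge weights.
# Parameters and data are illustrative assumptions, not taken from the book.
import numpy as np

def knn_graph(X, k=5, sigma=1.0):
    """Symmetric k-NN affinity matrix (kept dense here for clarity)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(sq, np.inf)                       # forbid self-edges
    nn = np.argsort(sq, axis=1)[:, :k]                 # k nearest neighbours per node
    W = np.zeros_like(sq)
    rows = np.repeat(np.arange(len(X)), k)
    W[rows, nn.ravel()] = np.exp(-sq[rows, nn.ravel()] / (2 * sigma ** 2))
    return np.maximum(W, W.T)                          # keep an edge if either endpoint chose it

# Toy usage on random 2-D points.
X = np.random.default_rng(1).normal(size=(50, 2))
W = knn_graph(X, k=5)
print(W.shape, int((W > 0).sum()), "nonzero edge weights")
```

In practice the matrix would be stored sparse; the dense form above only keeps the example short.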
4. Introduction to semi-supervised learning [2009]
- Zhu, Xiaojin, Ph. D.
- Cham, Switzerland : Springer, ©2009.
- Description
- Book — 1 online resource (xi, 116 pages) : color illustrations
- Summary
-
- Introduction to Statistical Machine Learning; Overview of Semi-Supervised Learning; Mixture Models and EM; Co-Training; Graph-Based Semi-Supervised Learning; Semi-Supervised Support Vector Machines; Human Semi-Supervised Learning; Theory and Outlook.
- (source: Nielsen Book Data)
5. Grokking deep Q-networks [2020]
- [First edition]. - [Place of publication not identified] : Manning Publications, 2020.
- Description
- Video — 1 online resource (1 video file (1 hr., 7 min.)) : sound, color. Sound: digital. Digital: video file.
- Summary
-
Miguel Morales, an expert in the reinforcement learning domain and the author of "Grokking Deep Reinforcement Learning", demonstrates how to make reinforcement learning more like supervised learning with the help of the popular Deep Q-Network (DQN) algorithm, which remains one of the best-performing deep reinforcement learning agents.
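The sense in which DQN makes RL "more like supervised learning" is the update rule: each stored transition yields a bootstrapped regression target r + γ·max_a' Q_target(s', a'), and the online Q-function is fit to that target with squared-error gradient steps. The sketch below illustrates only that idea, using a linear Q-function and randomly generated transitions as stand-ins; it is not material from the course.

```python
# Sketch: one DQN-style update pass viewed as supervised regression on TD targets.
# The linear Q-function and random transitions are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions, gamma, lr = 4, 2, 0.99, 0.1

W = rng.normal(scale=0.1, size=(n_states, n_actions))   # online Q "network": Q(s, a) = phi(s) @ W[:, a]
W_target = W.copy()                                      # frozen target network

def phi(s):
    """One-hot state features."""
    return np.eye(n_states)[s]

# A minibatch of (s, a, r, s', done) transitions, sampled at random here
# in place of a real replay buffer.
batch = [(rng.integers(n_states), rng.integers(n_actions),
          rng.normal(), rng.integers(n_states), rng.random() < 0.1)
         for _ in range(32)]

for s, a, r, s2, done in batch:
    q_next = 0.0 if done else (phi(s2) @ W_target).max()  # bootstrap from the target net
    target = r + gamma * q_next                           # the "supervised" regression label
    pred = phi(s) @ W[:, a]
    W[:, a] -= lr * (pred - target) * phi(s)              # squared-error gradient step on the taken action

W_target = W.copy()                                       # periodic target-network sync (once here)
print(W)
```

Freezing the target network and sampling transitions from a replay buffer are the two tricks that keep these regression targets stable enough to train on.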
6. The statistical physics of data assimilation and machine learning [2022]
- Abarbanel, H. D. I., author.
- Cambridge, United Kingdom ; New York, NY : Cambridge University Press, 2022.
- Description
- Book — 1 online resource
- Summary
-
- 1. Prologue: linking 'The Future' with the present
- 2. A data assimilation reminder
- 3. Remembrance of things path
- 4. SDA variational principles: Euler-Lagrange equations and Hamiltonian formulation
- 5. Using waveform information
- 6. Annealing in the model precision Rf
- 7. Discrete time integration in data assimilation variational principles: Lagrangian and Hamiltonian formulations
- 8. Monte Carlo methods
- 9. Machine learning and its equivalence to statistical data assimilation
- 10. Two examples of the practical use of data assimilation
- 11. Unfinished business
- Bibliography
- Index.
- (source: Nielsen Book Data)
7. Semi-supervised learning and domain adaptation in natural language processing [2013]
- Søgaard, Anders, 1981- author.
- Cham, Switzerland : Springer, ©2013.
- Description
- Book — 1 online resource (x, 93 pages) : illustrations
- Summary
-
- Introduction; Supervised and Unsupervised Prediction; Semi-Supervised Learning; Learning under Bias; Learning under Unknown Bias; Evaluating under Bias.
- (source: Nielsen Book Data)
8. Active learning [2012]
- Settles, Burr.
- Cham, Switzerland : Springer, ©2012.
- Description
- Book — 1 online resource (xiii, 100 pages) : illustrations
- Summary
-
- Automating Inquiry; Uncertainty Sampling; Searching Through the Hypothesis Space; Minimizing Expected Error and Variance; Exploiting Structure in Data; Theory; Practical Considerations. (A small uncertainty-sampling sketch follows this entry.)
- (source: Nielsen Book Data)
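Uncertainty sampling, the second chapter listed above, queries the pool point the current model is least confident about. The following sketch is a generic least-confidence loop using scikit-learn's LogisticRegression on synthetic data; the seed set, query budget, and dataset are illustrative assumptions, not the book's experiments.

```python
# Sketch of pool-based active learning with least-confidence uncertainty sampling.
# Dataset, model, and query budget are illustrative choices only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (100, 2)), rng.normal(1.0, 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

labeled = [0, 1, 100, 101]                       # tiny seed set, two examples per class
pool = [i for i in range(len(X)) if i not in labeled]

for _ in range(10):                              # query budget of 10 oracle calls
    clf = LogisticRegression().fit(X[labeled], y[labeled])
    confidence = clf.predict_proba(X[pool]).max(axis=1)
    query = pool[int(confidence.argmin())]       # least-confidence selection
    labeled.append(query)                        # the "oracle" reveals y[query]
    pool.remove(query)

clf = LogisticRegression().fit(X[labeled], y[labeled])
print("accuracy after 10 queries:", clf.score(X, y))
```

Swapping the selection line for an entropy or margin criterion gives the other uncertainty-sampling variants the chapter surveys.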
9. The elements of statistical learning : data mining, inference, and prediction [2009]
- Hastie, Trevor, author.
- Second edition. - New York : Springer, [2009]
- Description
- Book — xxii, 745 pages : illustrations (some color), charts ; 24 cm.
- Summary
-
- 1. Introduction
- 2. Overview of supervised learning
- 3. Linear methods for regression
- 4. Linear methods for classification
- 5. Basis expansions and regularization
- 6. Kernel smoothing methods
- 7. Model assessment and selection
- 8. Model inference and averaging
- 9. Additive models, trees, and related methods
- 10. Boosting and additive trees
- 11. Neural networks
- 12. Support vector machines and flexible discriminants
- 13. Prototype methods and nearest-neighbors
- 14. Unsupervised learning
- 15. Random forests
- 16. Ensemble learning
- 17. Undirected graphical models
- 18. High-dimensional problems: p >> N.
(source: Nielsen Book Data)
Business Library

| Location | Call number | Status |
|---|---|---|
| Stacks | Q325.75 .H37 2009 | Unknown |