 Singapore : Springer, 2022.
 Description
 Book — 1 online resource.
 Summary

 Introduction to SVM
 Basics of SVM Method and Least Squares SVM
 Fractional Chebyshev Kernel Functions: Theory and Application
 Fractional Legendre Kernel Functions: Theory and Application
 Fractional Gegenbauer Kernel Functions: Theory and Application
 Fractional Jacobi Kernel Functions: Theory and Application
 Solving Ordinary Differential Equations by LSSVM
 Solving Partial Differential Equations by LSSVM
 Solving Integral Equations by LSSVR
 Solving Distributed-Order Fractional Equations by LSSVR
 GPU Acceleration of LSSVM, Based on Fractional Orthogonal Functions
 Classification Using Orthogonal Kernel Functions: Tutorial on ORSVM Package.
 Boca Raton : CRC Press, c2011.
 Description
 Book — x, 201 p. : ill. ; 24 cm.
 Summary

 Overview of support vector machines: Background. Maximal Interval Linear Classifier. Kernel Functions and Kernel Matrix. Optimization Theory. Elements of Support Vector Machines. Applications of Support Vector Machines.
 Support vector machines for classification and regression: Kernel Functions and Dimension Superiority. Notion of Kernel Functions. Kernel Matrix. Support Vector Machines for Classification. Computing SVMs for Linearly Separable Case. Computing SVMs for Linearly Inseparable Case. Application of SVC to Simulated Data. Support Vector Machines for Regression. ε-Band and ε-Insensitive Loss Function. Linear ε-SVR. Kernel-Based ε-SVR. Application of SVR to Simulated Data. Parametric Optimization for Support Vector Machines. Variable Selection for Support Vector Machines. Related Materials and Comments. VC Dimension. Kernel Functions and Quadratic Programming. Dimension Increasing versus Dimension Reducing. Appendix A: Computation of Slack Variable-Based SVMs. Appendix B: Computation of Linear ε-SVR.
 Kernel methods: Kernel Methods: Three Key Ingredients. Primal and Dual Forms. Nonlinear Mapping. Kernel Function and Kernel Matrix. Modularity of Kernel Methods. Kernel Principal Component Analysis. Kernel Partial Least Squares. Kernel Fisher Discriminant Analysis. Relationship between Kernel Function and SVMs. Kernel Matrix Pretreatment. Internet Resources.
 Ensemble learning of support vector machines: Ensemble Learning. Idea of Ensemble Learning. Diversity of Ensemble Learning. Bagging Support Vector Machines. Boosting Support Vector Machines. Boosting: A Simple Example. Boosting SVMs for Classification. Boosting SVMs for Regression. Further Consideration.
 Support vector machines applied to near-infrared spectroscopy: Near-Infrared Spectroscopy. Support Vector Machines for Classification of Near-Infrared Data. Recognition of Blended Vinegar Based on Near-Infrared Spectroscopy. Related Work on Support Vector Classification on NIR. Support Vector Machines for Quantitative Analysis of Near-Infrared Data. Correlating Diesel Boiling Points with NIR Spectra Using SVR. Related Work on Support Vector Regression on NIR. Some Comments.
 Support vector machines and QSAR/QSPR: Quantitative Structure-Activity/Property Relationship. History of QSAR/QSPR and Molecular Descriptors. Principles for QSAR Modeling. Related QSAR/QSPR Studies Using SVMs. Support Vector Machines for Regression. Dataset Description. Molecular Modeling and Descriptor Calculation. Feature Selection Using a Generalized Cross-Validation Program. Model Internal Validation. PLS Regression Model. BPN Regression Model. SVR Model. Applicability Domain and External Validation. Model Interpretation. Support Vector Machines for Classification. Two-Step Algorithm: KPCA Plus LSVM. Dataset Description. Performance Evaluation. Effects of Model Parameters. Prediction Results for Three SAR Datasets.
 Support vector machines applied to traditional Chinese medicine: Introduction. Traditional Chinese Medicines and Their Quality Control. Recognition of Authentic PCR and PCRV Using SVM. Background. Data Description. Recognition of Authentic PCR and PCRV Using Whole Chromatography. Variable Selection Improves Performance of SVM. Some Remarks.
 Support vector machines applied to OMICS study: A Brief Description of OMICS Study. Support Vector Machines in Genomics. Support Vector Machines for Identifying Proteotypic Peptides in Proteomics. Biomarker Discovery in Metabolomics Using Support Vector Machines. Some Remarks.
 Index.
 (source: Nielsen Book Data)
Science Library (Li and Ma), Stacks: Q325.5 .S866 2011
 New York : Nova Science Publishers, c2011.
 Description
 Book — 1 online resource.
 Summary

 Preface
 The Support Vector Machine in Medical Imaging
 An SVM-Based Regression Model to Study the Air Quality in the Urban Area of the City of Oviedo (Spain)
 Image Interpolation Using Support Vector Machines
 Utilization of Support Vector Machine (SVM) for Prediction of Ultimate Capacity of Driven Piles in Cohesionless Soils
 Support Vector Machines in Medical Classification Tasks
 Solving Text Mining Problems using Support Vector Machines with Complex Data Oriented Kernels
 Subspace-Based Support Vector Machines
 SVR for Time Series Prediction
 Application of Neural Networks & Support Vector Machines in Coding Theory & Practice
 Pattern Recognition for Machine Fault Diagnosis using Support Vector Machines
 Index.
 (source: Nielsen Book Data)
4. Support vector machines applications [2014]
 Cham : Springer, 2014.
 Description
 Book — 1 online resource (vii, 302 pages) : illustrations (some color). Digital: text file, PDF.
 Summary

 Augmented-SVM for gradient observations with application to learning multiple-attractor dynamics
 Multiclass Support Vector Machine
 Novel Inductive and Transductive Transfer Learning Approaches Based on Support Vector Learning
 Security Evaluation of Support Vector Machines in Adversarial Environments
 Application of SVMs to the Bag-of-features Model: A Kernel Perspective
 Support Vector Machines for Neuroimage Analysis: Interpretation from Discrimination
 Kernel Machines for Imbalanced Data Problem and the Use in Biomedical Applications
 Soft Biometrics from Face Images using Support Vector Machines.
 (source: Nielsen Book Data)
 Support vector machines (Saigal)
 New York : Nova Science Publishers, Inc., [2021]
 Description
 Book — 1 online resource (xii, 233 pages) : illustrations (some color), color maps
 Summary

 Introduction to support vector machines / Pooja Saigal, PhD, Vivekananda School of Information Technology, Vivekananda Institute of Professional Studies, New Delhi, India
 Journey of support vector machines : from maximum-margin hyperplane to a pair of nonparallel hyperplanes / Pooja Saigal, PhD, Vivekananda School of Information Technology, Vivekananda Institute of Professional Studies, New Delhi, India
 Power spectrum entropy-based support vector machine for quantitative diagnosis of rotor vibration process faults / Cheng-Wei Fei, Department of Aeronautics and Astronautics, Fudan University, Shanghai, China.
 Online
6. Predicting Solar Flares using Support Vector Machines [2016]
 Trinidad, Jacob Conrad (Author)
 February 2016
 Description
 Dataset
 Summary

The sun produces solar flares: great bursts of electromagnetic energy and particles that affect the Earth and near-Earth environment. These flares can blow out transformers on power grids and disrupt satellite systems, so we want to predict them to minimize their negative impact. Doing so is difficult because of the rarity of these events. In this iPython notebook, we explored this challenge by extending the work of Bobra and Couvidat (2015). We categorized positive and negative events corresponding to flaring and non-flaring active regions on the sun, then created various sets of features to describe these events. Using these features, we trained and tested a machine learning algorithm known as a Support Vector Machine and evaluated its performance using a metric known as the True Skill Score. We were able to improve on the original work by using additional features (quantifying the maximum change in the value of certain parameters of an active region) which were shown to have strong predictive power.
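The train-and-evaluate workflow described in this abstract can be sketched as follows, assuming scikit-learn. The feature vectors and labels below are synthetic stand-ins invented for illustration, not data from the actual study, and the `true_skill_score` helper is written here from the metric's standard definition.

```python
import numpy as np
from sklearn.svm import SVC

def true_skill_score(y_true, y_pred):
    """TSS = TP/(TP+FN) - FP/(FP+TN): recall minus the false-alarm rate."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    return tp / (tp + fn) - fp / (fp + tn)

# Synthetic stand-ins for active-region feature vectors (invented, well separated):
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(1.0, 0.5, size=(50, 4)),    # "flaring" regions
               rng.normal(-1.0, 0.5, size=(50, 4))])  # "non-flaring" regions
y = np.array([1] * 50 + [0] * 50)

# class_weight="balanced" compensates for the rarity of flaring events.
clf = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
tss = true_skill_score(y, clf.predict(X))
```

Unlike plain accuracy, the TSS is insensitive to the ratio of positive to negative examples, which is why it is a common choice for rare-event prediction such as flare forecasting.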
 Digital collection
 Stanford Research Data
 Washington, D.C. : United States. Dept. of Energy. ; Oak Ridge, Tenn. : distributed by the Office of Scientific and Technical Information, U.S. Dept. of Energy, 2016
 Description
 Book — 8 p. : digital, PDF file.
 Summary

Advanced materials with improved properties have the potential to fuel future technological advancements. However, identification and discovery of these optimal materials for a specific application is a nontrivial task, because of the vastness of the chemical search space with enormous compositional and configurational degrees of freedom. Materials informatics provides an efficient approach toward rational design of new materials, via learning from known data to make decisions on new and previously unexplored compounds in an accelerated manner. Here, we demonstrate the power and utility of such statistical learning (or machine learning, henceforth referred to as ML) by building a support vector machine (SVM) based classifier that uses elemental features (or descriptors) to predict the formability of a given ABX₃ halide composition (where A and B represent monovalent and divalent cations, respectively, and X is an F, Cl, Br, or I anion) in the perovskite crystal structure. The classification model is built by learning from a dataset of 185 experimentally known ABX₃ compounds. After exploring a wide range of features, we identify ionic radii, tolerance factor, and octahedral factor to be the most important factors for the classification, suggesting that steric and geometric packing effects govern the stability of these halides. The trained and validated models then predict, with a high degree of confidence, several novel ABX₃ compositions with perovskite crystal structure.
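A minimal sketch of the descriptor-based classification this abstract describes, assuming scikit-learn. The sampled radii, the labelling rule, and its thresholds are illustrative assumptions standing in for the 185-compound training set, not the fitted model from the paper; only the two descriptor formulas (Goldschmidt tolerance factor and octahedral factor) are standard.

```python
import math
import numpy as np
from sklearn.svm import SVC

def descriptors(r_a, r_b, r_x):
    """Goldschmidt tolerance factor t and octahedral factor mu from ionic radii."""
    t = (r_a + r_x) / (math.sqrt(2) * (r_b + r_x))
    mu = r_b / r_x
    return t, mu

# Hypothetical ionic radii (Å) for A, B, X sites, sampled at random for illustration.
rng = np.random.default_rng(1)
radii = rng.uniform(low=[1.0, 0.5, 1.3], high=[2.0, 1.2, 2.2], size=(200, 3))
X = np.array([descriptors(*row) for row in radii])

# Stand-in "formability" label: classic geometric windows for t and mu.
y = ((X[:, 0] > 0.8) & (X[:, 0] < 1.1) & (X[:, 1] > 0.41)).astype(int)

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
acc = clf.score(X, y)
```

The design point worth noting is that both descriptors are pure functions of ionic radii, so a trained classifier can screen candidate ABX₃ compositions from tabulated radii alone, without any electronic-structure calculation.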
 Online
 Washington, D.C. : United States. Dept. of Energy. ; Oak Ridge, Tenn. : distributed by the Office of Scientific and Technical Information, U.S. Dept. of Energy, 2015
 Description
 Book — 1 online resource (p. 1147-1158) : digital, PDF file.
 Summary

In the process of macromolecular model building, crystallographers must examine electron density for isolated atoms and differentiate sites containing structured solvent molecules from those containing elemental ions. This task requires specific knowledge of metal-binding chemistry and scattering properties and is prone to error. A method has previously been described to identify ions based on manually chosen criteria for a number of elements. Here, the use of support vector machines (SVMs) to automatically classify isolated atoms as either solvent or one of various ions is described. Two data sets of protein crystal structures, one containing manually curated structures deposited with anomalous diffraction data and another with automatically filtered, high-resolution structures, were constructed. On the manually curated data set, an SVM classifier was able to distinguish calcium from manganese, zinc, iron and nickel, as well as all five of these ions from water molecules, with a high degree of accuracy. Additionally, SVMs trained on the automatically curated set of high-resolution structures were able to successfully classify most common elemental ions in an independent validation test set. This method is readily extensible to other elemental ions and can also be used in conjunction with previous methods based on a priori expectations of the chemical environment and X-ray scattering.
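The solvent-versus-ion classification described above is a multi-class SVM problem, which can be sketched as follows with scikit-learn. The two "site features" and the class centers are invented stand-ins for real electron-density and coordination statistics, and only three of the classes mentioned in the abstract are shown.

```python
import numpy as np
from sklearn.svm import SVC

# Invented two-feature site descriptors (e.g., peak height, mean coordination
# distance); the class centers below are illustrative, not crystallographic values.
rng = np.random.default_rng(2)
classes = ["water", "Ca", "Zn"]
centers = {"water": (0.3, 2.8), "Ca": (1.5, 2.4), "Zn": (2.5, 2.1)}
X = np.vstack([rng.normal(centers[c], 0.15, size=(40, 2)) for c in classes])
y = np.repeat(classes, 40)

# SVC extends the binary SVM to several classes via one-vs-one voting,
# so adding another elemental ion is just another label in the training set.
clf = SVC(kernel="rbf").fit(X, y)
pred = clf.predict([[0.3, 2.8], [2.5, 2.1]])
```

The one-vs-one construction is what makes the method "readily extensible to other elemental ions": no change to the learning machinery is needed, only curated examples of the new class.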
 Online
 Washington, D.C. : United States. Office of the Assistant Secretary for Nuclear Energy ; Oak Ridge, Tenn. : distributed by the Office of Scientific and Technical Information, U.S. Dept. of Energy, 2012
 Description
 Book
 Summary

Reliability/safety analysis of stochastic dynamic systems (e.g., nuclear power plants, airplanes, chemical plants) is currently performed through a combination of Event-Trees and Fault-Trees. However, these conventional methods suffer from certain drawbacks:
• Timing of events is not explicitly modeled
• Ordering of events is preset by the analyst
• The modeling of complex accident scenarios is driven by expert judgment
For these reasons, there is currently increasing interest in the development of dynamic PRA methodologies, since they can address the deficiencies of the conventional methods listed above.
 Online
 Jayadeva, author.
 Cham, Switzerland : Springer, [2016]
 Description
 Book — 1 online resource (xiv, 211 pages) : illustrations (some color)
 Summary

 Introduction. Generalized Eigenvalue Proximal Support Vector Machines. Twin Support Vector Machines (TWSVM) for Classification. TWSVR: Twin Support Vector Machine Based Regression. Variants of Twin Support Vector Machines: Some More Formulations. TWSVM for Unsupervised and Semi-Supervised Learning. Some Additional Topics. Applications Based on TWSVM. References.
 (source: Nielsen Book Data)
 Hamel, Lutz.
 Hoboken, N.J. : Wiley, c2009.
 Description
 Book — 1 online resource (xv, 246 p.) : ill.
 Summary

 Preface. PART I.
 1 What is Knowledge Discovery? 1.1 Machine Learning. 1.2 The Structure of the Universe X. 1.3 Inductive Learning. 1.4 Model Representations. Exercises. Bibliographic Notes.
 2 Knowledge Discovery Environments. 2.1 Computational Aspects of Knowledge Discovery. 2.1.1 Data Access. 2.1.2 Visualization. 2.1.3 Data Manipulation. 2.1.4 Model Building and Evaluation. 2.1.5 Model Deployment. 2.2 Other Toolsets. Exercises. Bibliographic Notes.
 3 Describing Data Mathematically. 3.1 From Data Sets to Vector Spaces. 3.1.1 Vectors. 3.1.2 Vector Spaces. 3.2 The Dot Product as a Similarity Score. 3.3 Lines, Planes, and Hyperplanes. Exercises. Bibliographic Notes.
 4 Linear Decision Surfaces and Functions. 4.1 From Data Sets to Decision Functions. 4.1.1 Linear Decision Surfaces through the Origin. 4.1.2 Decision Surfaces with an Offset Term. 4.2 A Simple Learning Algorithm. 4.3 Discussion. Exercises. Bibliographic Notes.
 5 Perceptron Learning. 5.1 Perceptron Architecture and Training. 5.2 Duality. 5.3 Discussion. Exercises. Bibliographic Notes.
 6 Maximum Margin Classifiers. 6.1 Optimization Problems. 6.2 Maximum Margins. 6.3 Optimizing the Margin. 6.4 Quadratic Programming. 6.5 Discussion. Exercises. Bibliographic Notes. PART II.
 7 Support Vector Machines. 7.1 The Lagrangian Dual. 7.2 Dual Maximum-Margin Optimization. 7.2.1 The Dual Decision Function. 7.3 Linear Support Vector Machines. 7.4 Non-Linear Support Vector Machines. 7.4.1 The Kernel Trick. 7.4.2 Feature Search. 7.4.3 A Closer Look at Kernels. 7.5 Soft-Margin Classifiers. 7.5.1 The Dual Setting for Soft-Margin Classifiers. 7.6 Tool Support. 7.6.1 WEKA. 7.6.2 R. 7.7 Discussion. Exercises. Bibliographic Notes.
 8 Implementation. 8.1 Gradient Ascent. 8.1.1 The Kernel-Adatron Algorithm. 8.2 Quadratic Programming. 8.2.1 Chunking. 8.3 Sequential Minimal Optimization. 8.4 Discussion. Exercises. Bibliographic Notes.
 9 Evaluating What has been Learned. 9.1 Performance Metrics. 9.1.1 The Confusion Matrix. 9.2 Model Evaluation. 9.2.1 The Hold-Out Method. 9.2.2 The Leave-One-Out Method. 9.2.3 N-Fold Cross-Validation. 9.3 Error Confidence Intervals. 9.3.1 Model Comparisons. 9.4 Model Evaluation in Practice. 9.4.1 WEKA. 9.4.2 R. Exercises. Bibliographic Notes.
 10 Elements of Statistical Learning Theory. 10.1 The VC-Dimension and Model Complexity. 10.2 A Theoretical Setting for Machine Learning. 10.3 Empirical Risk Minimization. 10.4 VC-Confidence. 10.5 Structural Risk Minimization. 10.6 Discussion. Exercises. Bibliographic Notes. PART III.
 11 Multi-Class Classification. 11.1 One-versus-the-Rest Classification. 11.2 Pairwise Classification. 11.3 Discussion. Exercises. Bibliographic Notes.
 12 Regression with Support Vector Machines. 12.1 Regression as Machine Learning. 12.2 Simple and Multiple Linear Regression. 12.3 Regression with Maximum Margin Machines. 12.4 Regression with Support Vector Machines. 12.5 Model Evaluation. 12.6 Tool Support. 12.6.1 WEKA. 12.6.2 R. Exercises. Bibliographic Notes.
 13 Novelty Detection. 13.1 Maximum Margin Machines. 13.2 The Dual Setting. 13.3 Novelty Detection in R. Exercises. Bibliographic Notes. Appendix A: Notation. Appendix B: A Tutorial Introduction to R. B.1 Programming Constructs. B.2 Data Constructs. B.3 Basic Data Analysis. Bibliographic Notes. References. Index.
 (source: Nielsen Book Data)
 Deng, Naiyang, author.
 1st edition.  Chapman and Hall/CRC, 2012.
 Description
 Book — 1 online resource (363 pages) Digital: text file.
 Summary

Support Vector Machines: Optimization Based Theory, Algorithms, and Extensions presents an accessible treatment of the two main components of support vector machines (SVMs): classification problems and regression problems. The book emphasizes the close connection between optimization theory and SVMs, since optimization is one of the pillars on which ...
 Blaschzyk, Ingrid Karin.
 Wiesbaden : Springer Spektrum, 2020.
 Description
 Book — 1 online resource (xv, 126 pages)
 Summary

 Introduction
 Preliminaries
 Histogram Rule: Oracle Inequality and Learning Rates
 Localized SVMs: Oracle Inequalities and Learning Rates
 Discussion.
(source: Nielsen Book Data)
 Xu, Yuesheng author.
 Providence, RI : American Mathematical Society, [2019]
 Description
 Book — vi, 122 pages ; 25 cm.
 Summary

 Introduction. Reproducing Kernel Banach Spaces. Generalized Mercer Kernels. Positive Definite Kernels. Support Vector Machines. Concluding Remarks. Acknowledgments. Index. Bibliography.
 (source: Nielsen Book Data)
Science Library (Li and Ma), Serials: QA3 .A57 NO.1243
 Campbell, Colin.
 San Rafael, Calif. (1537 Fourth Street, San Rafael, CA 94901 USA) : Morgan & Claypool, c2011.
 Description
 Book — 1 electronic text (viii, 83 p.).
 Summary

 Support Vector Machines for Classification. Kernel-based Models. Learning with Kernels.
 (source: Nielsen Book Data)
 Zhu, Xiaojin.
 San Rafael, Calif. (1537 Fourth Street, San Rafael, CA 94901 USA) : Morgan & Claypool Publishers, c2009.
 Description
 Book — 1 electronic text (xi, 116 p.) : ill.
 Summary

 Introduction to Statistical Machine Learning. Overview of Semi-Supervised Learning. Mixture Models and EM. Co-Training. Graph-Based Semi-Supervised Learning. Semi-Supervised Support Vector Machines. Human Semi-Supervised Learning. Theory and Outlook.
 (source: Nielsen Book Data)
17. Support vector machines and evolutionary algorithms for classification : single or together? [2014]
 Stoean, Catalin, author.
 Cham [Switzerland] : Springer, [2014]
 Description
 Book — 1 online resource. Digital: text file, PDF.
 Summary

 Support Vector Machines. Evolutionary Algorithms. Support Vector Machines and Evolutionary Algorithms.
 (source: Nielsen Book Data)
18. Support vector machines for antenna array processing and electromagnetics [electronic resource] [2006]
 Martínez-Ramón, Manel, 1968-
 1st ed.  San Rafael, Calif. (1537 Fourth Street, San Rafael, CA 94901 USA) : Morgan & Claypool Publishers, c2006.
 Description
 Book — 1 electronic text (ix, 110 p.) : ill.
 Summary

 Introduction. Linear Support Vector Machines. Nonlinear Support Vector Machines. Advanced Topics. Support Vector Machines for Beamforming. Determination of Angle of Arrival. Other Applications in Electromagnetics.
 (source: Nielsen Book Data)
Support Vector Machines (SVMs) were introduced in the early 1990s as a novel nonlinear solution for classification and regression tasks. These techniques have proven to deliver superior performance in a large variety of real-world applications, owing to their generalization ability and robustness against noise and interference. This book introduces a set of novel SVM-based techniques applied to antenna array processing and electromagnetics. In particular, it introduces methods for linear and nonlinear beamforming and parameter design for arrays and electromagnetic applications.
(source: Nielsen Book Data)
 Washington, D.C. : United States. Dept. of Energy. Office of Science ; Oak Ridge, Tenn. : distributed by the Office of Scientific and Technical Information, U.S. Dept. of Energy, 2015
 Description
 Book — Article No. e0123925 : digital, PDF file.
 Summary

The aqueous extract of yerba mate, a South American tea beverage made from Ilex paraguariensis leaves, has demonstrated bactericidal and inhibitory activity against bacterial pathogens, including methicillin-resistant Staphylococcus aureus (MRSA). In this paper, gas chromatography-mass spectrometry (GC-MS) analysis of two unique fractions of yerba mate aqueous extract revealed 8 identifiable small molecules in those fractions with antimicrobial activity. For a more comprehensive analysis, a data analysis pipeline was assembled to prioritize compounds for antimicrobial testing against both MRSA and methicillin-sensitive S. aureus, using forty-two unique fractions of the tea extract that were generated in duplicate, assayed for activity, and analyzed with GC-MS. As validation of our automated analysis, we checked our predicted active compounds for activity in literature references and used authentic standards to test for antimicrobial activity. 3,4-Dihydroxybenzaldehyde showed the most antibacterial activity against MRSA at low concentrations in our bioassays. In addition, quinic acid and quercetin were identified using random forests analysis, and 5-hydroxypipecolic acid was identified using linear discriminant analysis. We also generated a ranked list of unidentified compounds that may contribute to the antimicrobial activity of yerba mate against MRSA. Finally, we utilized GC-MS data to implement an automated analysis that resulted in a ranked list of compounds likely contributing to the antimicrobial activity of aqueous yerba mate extract against MRSA.
 Online
20. Learning with support vector machines [2011]
 Campbell, Colin.
 Cham, Switzerland : Springer, ©2011.
 Description
 Book — 1 online resource (viii, 83 pages) : illustrations
 Summary

 Support Vector Machines for Classification. Kernel-based Models. Learning with Kernels.
 (source: Nielsen Book Data)