 Herbrich, Ralf.
 Cambridge, Mass. : MIT Press, ©2002.
 Description
 Book — 1 online resource (xx, 364 pages) : illustrations.
 Summary

An overview of the theory and application of kernel classification methods. Linear classifiers in kernel spaces have emerged as a major topic within the field of machine learning. The kernel technique takes the linear classifier (a limited but well-established and comprehensively studied model) and extends its applicability to a wide range of nonlinear pattern-recognition tasks such as natural language processing, machine vision, and biological sequence analysis. This book provides the first comprehensive overview of both the theory and algorithms of kernel classifiers, including the most recent developments. It begins by describing the major algorithmic advances: kernel perceptron learning, kernel Fisher discriminants, support vector machines, relevance vector machines, Gaussian processes, and Bayes point machines. Then follows a detailed introduction to learning theory, including VC and PAC-Bayesian theory, data-dependent structural risk minimization, and compression bounds. Throughout, the book emphasizes the interaction between theory and algorithms: how learning algorithms work and why. The book includes many examples, complete pseudocode of the algorithms presented, and an extensive source code library.
(source: Nielsen Book Data)
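The kernel perceptron listed among the algorithmic advances above is the simplest of these methods: it runs the classic mistake-driven perceptron update, but keeps the weight vector implicitly as a kernel expansion over the training examples. A minimal sketch of the idea (the RBF kernel, the toy XOR-style data, and all function names are illustrative assumptions, not taken from the book):

```python
import math

def rbf_kernel(x, z, gamma=1.0):
    # Gaussian (RBF) kernel: k(x, z) = exp(-gamma * ||x - z||^2)
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def train_kernel_perceptron(X, y, kernel, epochs=10):
    # alpha[i] counts how often example i triggered an update; the
    # decision function is f(x) = sum_i alpha[i] * y[i] * k(X[i], x).
    alpha = [0] * len(X)
    for _ in range(epochs):
        mistakes = 0
        for j, (xj, yj) in enumerate(zip(X, y)):
            s = sum(alpha[i] * y[i] * kernel(X[i], xj) for i in range(len(X)))
            if yj * s <= 0:          # mistake-driven update
                alpha[j] += 1
                mistakes += 1
        if mistakes == 0:            # perfectly separates the training set
            break
    return alpha

def predict(alpha, X, y, kernel, x):
    s = sum(alpha[i] * y[i] * kernel(X[i], x) for i in range(len(X)))
    return 1 if s > 0 else -1

# Toy XOR-like problem: not linearly separable in input space,
# but separable in the RBF kernel's feature space.
X = [(0, 0), (1, 1), (0, 1), (1, 0)]
y = [1, 1, -1, -1]
alpha = train_kernel_perceptron(X, y, rbf_kernel)
print([predict(alpha, X, y, rbf_kernel, x) for x in X])  # → [1, 1, -1, -1]
```

Because the weight vector appears only through inner products, swapping the kernel function is the entire "kernel trick": the same update rule then learns a nonlinear classifier.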
 Herbrich, Ralf, author.
 Cambridge, Massachusetts : MIT Press, c2002 [Piscataway, New Jersey] : IEEE Xplore, [2001]
 Mukherjee, Sudipta.
 Birmingham, England ; Mumbai [India] : Packt Publishing, 2016.
 Description
 Book — 1 online resource (194 pages) : color illustrations, tables.
 Summary

 Cover
 Copyright
 Credits
 Foreword
 About the Author
 Acknowledgments
 About the Reviewers
 www.PacktPub.com
 Table of Contents
 Preface
 Chapter 1: Introduction to Machine Learning
 Objective
 Getting in touch
 Different areas where machine learning is being used
 Why use F#?
 Supervised machine learning
 Training and test dataset/corpus
 Some motivating real life examples of supervised learning
 Nearest Neighbour algorithm (a.k.a. kNN algorithm)
 Distance metrics
 Decision tree algorithms
 Unsupervised learning
 Machine learning frameworks
 Machine learning for fun and profit
 Recognizing handwritten digits : your "Hello World" ML program
 How does this work?
 Summary
 Chapter 2: Linear Regression
 Objective
 Different types of linear regression algorithms
 APIs used
 Math.NET Numerics for F# 3.7.0
 Getting Math.NET
 Experimenting with Math.NET
 The basics of matrices and vectors (a short and sweet refresher)
 Creating a vector
 Creating a matrix
 Finding the transpose of a matrix
 Finding the inverse of a matrix
 Trace of a matrix
 QR decomposition of a matrix
 SVD of a matrix
 Linear regression method of least square
 Finding linear regression coefficients using F#
 Finding the linear regression coefficients using Math.NET
 Putting it together with Math.NET and FsPlot
 Multiple linear regression
 Multiple linear regression and variations using Math.NET
 Weighted linear regression
 Plotting the result of multiple linear regression
 Ridge regression
 Multivariate multiple linear regression
 Feature scaling
 Summary
 Chapter 3: Classification Techniques
 Objective
 Different classification algorithms you will learn
 Some interesting things you can do
 Binary classification using kNN
 How does it work?
 Finding cancerous cells using kNN: a case study
 Understanding logistic regression
 The sigmoid function chart
 Binary classification using logistic regression (using Accord.NET)
 Multiclass classification using logistic regression
 How does it work?
 Multiclass classification using decision trees
 Obtaining and using WekaSharp
 How does it work?
 Predicting a traffic jam using a decision tree: a case study
 Challenge yourself!
 Summary
 Chapter 4: Information Retrieval
 Objective
 Different IR algorithms you will learn
 What interesting things can you do?
 Information retrieval using tf-idf
 Measures of similarity
 Generating a PDF from a histogram
 Minkowski family
 L1 family
 Intersection family
 Inner Product family
 Fidelity family or squared-chord family
 Squared L2 family
 Shannon's Entropy family
 Similarity of asymmetric binary attributes
 Some example usages of distance metrics
 Finding similar cookies using asymmetric binary similarity measures
 Grouping/clustering color images based on Canberra distance
 Summary
 Chapter 5: Collaborative Filtering
 Objective
 Different classification algorithms you will learn
 Vocabulary of collaborative filtering
 Baseline predictors
 Basis of User-User collaborative filtering
 Implementing basic user-user collaborative filtering using F#
 Code walkthrough
 Variations of gap calculations and similarity measures
 Item-item collaborative filtering
 Top-N recommendations
 Evaluating recommendations
 Prediction accuracy
 Confusion matrix (decision support)
 Ranking accuracy metrics
 Prediction-rating correlation
 Working with real movie review data (MovieLens)
 Summary
 Chapter 6: Sentiment Analysis
 Objective
 What you will learn
 A baseline algorithm for SA using SentiWordNet lexicons
 Handling negations
 Identifying praise or criticism with sentiment orientation
 Pointwise Mutual Information
 Using SO-PMI for sentiment analysis
 Summary
 Chapter 7: Anomaly Detection
 Objective
 Different classification algorithms
 Some cool things you will do
 The different types of anomalies
 Detecting point anomalies using IQR (Interquartile Range)
 Detecting point anomalies using Grubbs' test
 Grubbs' test for multivariate data using Mahalanobis distance
 Code walkthrough
 Chi-squared statistic to determine anomalies
 Detecting anomalies using density estimation
 Strategy to convert a collective anomaly to a point anomaly problem
 Dealing with categorical data in collective anomalies
 Summary
 Index.
(source: Nielsen Book Data)
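The Nearest Neighbour (kNN) classifier that opens Chapter 1 of the book listed above fits in a few lines; the book itself works in F#, so this Python version is only a language-independent sketch of the idea, and the toy data points and Euclidean metric are illustrative assumptions:

```python
import math
from collections import Counter

def euclidean(a, b):
    # Straight-line distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_classify(train, query, k=3, distance=euclidean):
    # train: list of (feature_vector, label) pairs.
    # Classify by majority vote among the k nearest training points.
    neighbours = sorted(train, key=lambda item: distance(item[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

train = [
    ((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
    ((5.0, 5.0), "B"), ((4.8, 5.2), "B"), ((5.1, 4.9), "B"),
]
print(knn_classify(train, (1.1, 0.9), k=3))  # prints A
```

Swapping in a different `distance` function (the book's Chapter 1 discusses several distance metrics) changes which neighbours count as "near" without touching the voting logic.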
 Cham : Springer, [2020]
 Description
 Book — 1 online resource : illustrations. Digital: text file, PDF.
 Summary

 A guide to the NeurIPS 2018 competitions / Ralf Herbrich, Sergio Escalera
 Pommerman & NeurIPS 2018 / Cinjon Resnick, Chao Gao, Görög Márton, Takayuki Osogami, Liang Pang, Toshihiro Takahashi
 The AI Driving Olympics at NeurIPS 2018 / Julian Zilly, Jacopo Tani, Breandan Considine, Bhairav Mehta, Andrea F. Daniele, Manfred Diaz et al.
 Artificial intelligence for prosthetics : challenge solutions / Łukasz Kidziński, Carmichael Ong, Sharada Prasanna Mohanty, Jennifer Hicks, Sean Carroll, Bo Zhou et al.
 Adversarial vision challenge / Wieland Brendel, Jonas Rauber, Alexey Kurakin, Nicolas Papernot, Behar Veliqi, Sharada P. Mohanty et al.
 The inclusive images competition / James Atwood, Yoni Halpern, Pallavi Baljekar, Eric Breck, D. Sculley, Pavel Ostyakov et al.
 The second conversational intelligence challenge (ConvAI2) / Emily Dinan, Varvara Logacheva, Valentin Malykh, Alexander Miller, Kurt Shuster, Jack Urbanek et al.
 AutoML @ NeurIPS 2018 challenge : design and results / Hugo Jair Escalante, Wei-Wei Tu, Isabelle Guyon, Daniel L. Silver, Evelyne Viegas, Yuqiang Chen et al.
 The tracking machine learning challenge : accuracy phase / Sabrina Amrouche, Laurent Basara, Paolo Calafiura, Victor Estrade, Steven Farrell, Diogo R. Ferreira et al.
 Efficient and robust learning on elaborated gaits with curriculum learning / Bo Zhou, Hongsheng Zeng, Fan Wang, Rongzhong Lian, Hao Tian
 ConvAI2 dataset of non-goal-oriented human-to-bot dialogues / Varvara Logacheva, Valentin Malykh, Aleksey Litinsky, Mikhail Burtsev
 Lost in conversation : a conversational agent based on the transformer and transfer learning / Sergey Golovanov, Alexander Tselousov, Rauf Kurbanov, Sergey I. Nikolenko
 Automatically optimized gradient boosting trees for classifying large volume high cardinality data streams under concept drift / Jobin Wilson, Amit Kumar Meher, Bivin Vinodkumar Bindu, Santanu Chaudhury, Brejesh Lall, Manoj Sharma et al.
(source: Nielsen Book Data)
8. Mit Optimismus in die Zukunft schauen : künstliche Intelligenz, Chancen und Rahmenbedingungen [Looking to the future with optimism : artificial intelligence, opportunities and framework conditions] [2018]
 1st edition - Berlin : B & S Siebenhaar Verlag, 2018
 Description
 Book — 144 pages : color illustrations ; 23 cm
 Summary

 Foreword
 Greeting from the Federal Minister / Peter Altmaier
 Before the conference
 Artificial intelligence : topic and context
 The dreams of the past are becoming the future / Klaus Siebenhaar
 Artificial intelligence at Amazon : research in the service of the customer / Ralf Herbrich
 Prologue : status, perspectives, and framework conditions
 Machine learning : development without limits? / Bernhard Schölkopf
 Shaping artificial intelligence in society's interest : approaches to an innovation-friendly governance strategy / Wolfgang Schulz
 Forum I : Human, machine : innovation and social responsibility in the age of big data and artificial intelligence
 Humans must find a new self-understanding / Joachim M. Buhmann
 Social cohesion must be preserved / Jens Zimmermann
 Panel discussion / With Andreas Boes, Joachim M. Buhmann, Lea Helmers; moderated by Joana Breidenbach
 AI : the conference in pictures
 Progress and trust : artificial intelligence in everyday life, lived experience, and the public sphere
 The algorithmic society : how we put artificial intelligence at our service / Jörg Dräger
 We want to help shape progress / Gottfried Ludewig
 Panel discussion / With Christian Bauckhage, Jörg Dräger, Martin Hirsch, Dirk Kretzschmar, Gottfried Ludewig; moderated by Joachim Bühler
 "Smart" regulation : framework conditions for AI in Germany and Europe
 For a European AI model / Andreas Boes, Elisabeth Vogl
 Panel discussion / With Aljoscha Burchardt, Lorena Jaume-Palasí, Martina Mara; moderated by Wolfgang Schulz
 Epilogue
 Olimpia's legacy : art, artificial intelligence, and creativity / Klaus Siebenhaar
 AI : the conference in pictures
 Appendix
 About the authors, editors, and discussion partners
 Organizers and partners
 Photo credits, imprint
 Online
 SAL3 (off-campus storage), Stacks : HM851 .M58 2018 : Available