1.
 Hagemann, Paul Lyonel, author.
 Cambridge : Cambridge University Press, 2023.
 Description
 Book — 1 online resource (57 pages)
 Summary

Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative models. This Element provides a unified framework to handle these approaches via Markov chains. The authors consider stochastic normalizing flows as a pair of Markov chains fulfilling some properties, and show how many state-of-the-art models for data generation fit into this framework. Indeed, numerical simulations show that including stochastic layers improves the expressivity of the network and allows for generating multimodal distributions from unimodal ones. The Markov chains point of view enables the coupling of both deterministic layers, as invertible neural networks, and stochastic layers, as Metropolis-Hastings layers, Langevin layers, variational autoencoders and diffusion normalizing flows, in a mathematically sound way. The authors' framework establishes a useful mathematical tool to combine the various approaches.
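The stochastic layers this summary refers to can be illustrated with a minimal sketch (not from the book): a Metropolis-Hastings transition used as one Markov-chain layer, which carries unimodal (standard normal) samples toward an illustrative bimodal target. The target density, step size, and all names here are invented for the example.

```python
import math
import random

def log_target(x):
    # Unnormalized log-density of an illustrative bimodal target:
    # a mixture of two unit-variance Gaussian modes at -2 and +2.
    return math.log(math.exp(-0.5 * (x - 2.0) ** 2)
                    + math.exp(-0.5 * (x + 2.0) ** 2))

def mh_layer(x, step=1.0):
    """One Metropolis-Hastings transition: propose a Gaussian move and
    accept it with probability min(1, target density ratio)."""
    proposal = x + random.gauss(0.0, step)
    if math.log(random.random()) < log_target(proposal) - log_target(x):
        return proposal  # accepted
    return x             # rejected: the chain stays put

# Push unimodal (standard normal) samples through repeated stochastic
# layers; the empirical distribution spreads over both modes.
random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(2000)]
for _ in range(50):
    samples = [mh_layer(x) for x in samples]
left = sum(1 for x in samples if x < 0)
print(left, len(samples) - left)
```

Because the Metropolis-Hastings kernel leaves the target invariant, stacking such layers (possibly between deterministic invertible maps) is what makes the combined chain mathematically sound in the sense the summary describes.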
2. Markov Processes. Volume II [2022]
 Dynkin, E. B. (Evgeniĭ Borisovich), 1924–2014.
 Berlin, Germany : Springer-Verlag, 2022.
 Description
 Book — 1 online resource (284 pages)
 Summary

 Cover
 Title page
 Copyright page
 Contents
 Chapter Twelve
 Excessive, superharmonic and harmonic functions
 1. Excessive functions for transition functions
 2. Excessive functions for Markov processes
 3. Asymptotic behavior of excessive functions along trajectories of a process
 4. Superharmonic functions
 5. Harmonic functions
 Chapter Thirteen
 Harmonic and superharmonic functions associated with strong Feller processes. Probabilistic solution of certain equations
 1. Some properties of strong Feller processes
 2. The Dirichlet problem. Regular points of the boundary
 3. Harmonic and superharmonic functions associated with diffusion processes
 4. Solutions of the equation ztf Vf= g
 5. Parts of a diffusion process and Green's functions
 Chapter Fourteen
 The multidimensional Wiener process and its transformations
 1. Harmonic and superharmonic functions related to the Wiener process
 2. The mapping 'l'
 3. Additive functionals and Green's functions
 4. Brownian motion with killing measure μ and speed measure ν
 5. q-subprocesses
 Chapter Fifteen
 Continuous strong Markov processes on a closed interval
 1. General properties of one-dimensional continuous strong Markov processes
 2. Characteristics of regular processes
 3. Computation of the characteristic and infinitesimal operators
 4. Superharmonic and harmonic functions connected with regular one-dimensional processes
 Chapter Sixteen
 Continuous strong Markov processes on an open interval
 1. Harmonic functions and behavior of trajectories
 2. S-functions and character of the motion along a trajectory
 3. Infinitesimal operators
 Chapter Seventeen
 Construction of one-dimensional continuous strong Markov processes
 1. Transformations of the state space. Canonical coordinate
 2. Construction of regular continuous strong Markov processes on an open interval
 3. Construction of regular continuous strong Markov processes on a closed interval
 4. Computation of the harmonic functions and resolvents for regular processes
 Appendix
 1. Measurable spaces and measurable transformations
 2. Measures and integrals
 3. Probability spaces. Conditional probabilities and mathematical expectations
 4. Martingales
 5. Topological measurable spaces
 6. Some theorems on partial differential equations
 7. Measures and countably additive set functions on the line and corresponding point functions
 8. Convex functions
 Historical-bibliographical note
 Bibliography
 Index
 List of symbols
 Pirjol, Dan, author.
 Cham : Springer, [2022]
 Description
 Book — 1 online resource (ix, 132 pages) : illustrations (chiefly color).
 Summary

 1. Introduction to stochastic growth processes
 A. Growth processes in economics, biology and ecology
 B. Stochastic growth with multiplicative noise
 C. Stochastic growth with Markovian dependence
 2. Stochastic growth processes with exponential growth rates
 D. Exponential growth model driven by a geometric Brownian motion
 E. Exponential growth model with binomial tree growth rates
 F. Exp-Ornstein-Uhlenbeck model
 3. Lyapunov exponents of the exponential stochastic growth processes
 G. Numerical illustration: the bank account in the Black-Derman-Toy model
 H. Lyapunov exponents and their analyticity
 I. Lattice gas analogy
 J. Recursion relation for the moments
 4. One-dimensional lattice gas models with linear attractive interaction
 K. Lattice gases with mutual exclusion
 L. Lattice gas with universal interaction: mean-field theory
 M. Kac potentials and the Lebowitz-Penrose theory
 N. One-dimensional lattice gas with linear attractive interaction: exact results
 5. Lattice gas with exponential attractive interactions
 O. Connection to the Black-Karasinski model
 P. Exact result for the Lyapunov exponents
 Q. Limiting case: Kac-Helfand lattice gas and the van der Waals theory
 6. Applications
 R. Monte Carlo simulation of stochastic volatility models
 S. Asymptotic bond pricing in the Black-Derman-Toy model
 (source: Nielsen Book Data)
 Bobrowski, Adam, author.
 Cambridge, United Kingdom ; New York, NY : Cambridge University Press, 2021
 Description
 Book — 1 online resource
 Summary

 A nontechnical introduction
 1. A guided tour through the land of operator semigroups
 2. Generators versus intensity matrices
 3. Boundary theory: core results
 4. Boundary theory continued
 5. The dual perspective
 Solutions and hints to selected exercises
 Commonly used notations
 References
 Index.
 (source: Nielsen Book Data)
5. Approximate quantum Markov chains [2018]
 Sutter, David, author.
 Cham, Switzerland : Springer, [2018]
 Description
 Book — 1 online resource. Digital: text file; PDF.
 Summary

 Introduction. Classical Markov chains. Quantum Markov chains. Outline. Preliminaries. Notation. Schatten norms. Functions on Hermitian operators. Quantum channels. Entropy measures. Background and further reading. Tools for noncommuting operators. Pinching. Complex interpolation theory. Background and further reading. Multivariate trace inequalities. Motivation. Multivariate Araki-Lieb-Thirring inequality. Multivariate Golden-Thompson inequality. Multivariate logarithmic trace inequality. Background and further reading. Approximate quantum Markov chains. Quantum Markov chains. Sufficient criterion for approximate recoverability. Necessary criterion for approximate recoverability. Strengthened entropy inequalities. Background and further reading. A. A large conditional mutual information does not imply bad recovery. B. Example showing the optimality of the Lmax-term. C. Solutions to exercises. References. Index.
 (source: Nielsen Book Data)
6. Markov chains [2018]
 Douc, Randal, author.
 Cham, Switzerland : Springer, 2018.
 Description
 Book — 1 online resource (xviii, 757 pages) : illustrations (some color). Digital: text file; PDF.
 Summary

 Part I Foundations. Markov Chains: Basic Definitions. Examples of Markov Chains. Stopping Times and the Strong Markov Property. Martingales, Harmonic Functions and Poisson-Dirichlet Problems. Ergodic Theory for Markov Chains. Part II Irreducible Chains: Basics. Atomic Chains. Markov Chains on a Discrete State Space. Convergence of Atomic Markov Chains. Small Sets, Irreducibility and Aperiodicity. Transience, Recurrence and Harris Recurrence. Splitting Construction and Invariant Measures. Feller and T-kernels. Part III Irreducible Chains: Advanced Topics. Rates of Convergence for Atomic Markov Chains. Geometric Recurrence and Regularity. Geometric Rates of Convergence. (f, r)-recurrence and Regularity. Subgeometric Rates of Convergence. Uniform and V-geometric Ergodicity by Operator Methods. Coupling for Irreducible Kernels. Part IV Selected Topics. Convergence in the Wasserstein Distance. Central Limit Theorems. Spectral Theory. Concentration Inequalities. Appendices. A Notations. B Topology, Measure, and Probability. C Weak Convergence. D Total and V-total Variation Distances. E Martingales. F Mixing Coefficients. G Solutions to Selected Exercises.
 (source: Nielsen Book Data)
 Privault, Nicolas, author.
 Second edition.  Singapore : Springer, 2018.
 Description
 Book — 1 online resource (xvii, 372 pages) : illustrations. Digital: text file; PDF.
 Summary

 Probability Background
 Gambling Problems
 Random Walks
 DiscreteTime Markov Chains
 First Step Analysis
 Classification of States
 LongRun Behavior of Markov Chains
 Branching Processes
 ContinuousTime Markov Chains
 DiscreteTime Martingales
 Spatial Poisson Processes
 Reliability Theory.
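The "Gambling Problems" and "First Step Analysis" chapters listed above concern a classic computation that can be sketched directly: in the gambler's ruin, conditioning on the first step gives the recursion r(i) = p·r(i+1) + (1-p)·r(i-1) with r(0) = 1, r(N) = 0, whose closed-form solution is implemented below. The function name and numbers are illustrative, not taken from the book.

```python
def ruin_probability(p, N, i):
    """Probability of hitting 0 before N, starting from fortune i, in the
    classic gambler's ruin (win one unit w.p. p, lose one w.p. 1-p).
    Solved by first-step analysis:
        r(i) = p*r(i+1) + (1-p)*r(i-1),  r(0) = 1,  r(N) = 0."""
    q = 1.0 - p
    if p == q:
        return 1.0 - i / N                  # fair game: linear in i
    rho = q / p
    return (rho ** i - rho ** N) / (1.0 - rho ** N)

# Fair coin, target fortune 10, starting at 3: ruin probability 1 - 3/10.
print(ruin_probability(0.5, 10, 3))  # -> 0.7
```

The boundary cases check out: starting at 0 the formula returns 1, and starting at N it returns 0, matching the absorbing states of the chain.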
 Silʹvestrov, D. S. (Dmitriĭ Sergeevich)
 Cham : Springer, [2017]
 Description
 Book — 1 online resource.
 Summary

 Laurent Asymptotic Expansions. Asymptotic Expansions for Moments of Hitting Times for Nonlinearly Perturbed Semi-Markov Processes. Asymptotic Expansions for Stationary Distributions of Nonlinearly Perturbed Semi-Markov Processes. Nonlinearly Perturbed Birth-Death-Type Semi-Markov Processes. Examples and Survey of Applied Perturbed Stochastic Models. A. Methodological and Bibliographical Remarks.
 (source: Nielsen Book Data)
 Rudnicki, Ryszard, author.
 Cham, Switzerland : Springer, [2017]
 Description
 Book — 1 online resource.
 Summary

 1 Biological Models. 2 Markov Processes. 3 Operator Semigroups. 4 Stochastic Semigroups. 5 Asymptotic Properties of Stochastic Semigroups: General Results. 6 Asymptotic Properties of Stochastic Semigroups: Applications.
 (source: Nielsen Book Data)
10. Markov processes [2015]
 Kirkwood, James R., author.
 Boca Raton, FL : CRC Press, [2015]
 Description
 Book — xii, 327 pages : illustrations ; 24 cm.
 Summary

 Review of Probability: Short History. Review of Basic Probability Definitions. Some Common Probability Distributions. Properties of a Probability Distribution. Properties of the Expected Value. Expected Value of a Random Variable with Common Distributions. Generating Functions. Moment Generating Functions. Exercises.
 Discrete-Time, Finite-State Markov Chains: Introduction. Notation. Transition Matrices. Directed Graphs: Examples of Markov Chains. Random Walk with Reflecting Boundaries. Gambler's Ruin. Ehrenfest Model. Central Problem of Markov Chains. Condition to Ensure a Unique Equilibrium State. Finding the Equilibrium State. Transient and Recurrent States. Indicator Functions. Perron-Frobenius Theorem. Absorbing Markov Chains. Mean First Passage Time. Mean Recurrence Time and the Equilibrium State. Fundamental Matrix for Regular Markov Chains. Dividing a Markov Chain into Equivalence Classes. Periodic Markov Chains. Reducible Markov Chains. Summary. Exercises.
 Discrete-Time, Infinite-State Markov Chains: Renewal Processes. Delayed Renewal Processes. Equilibrium State for Countable Markov Chains. Physical Interpretation of the Equilibrium State. Null Recurrent versus Positive Recurrent States. Difference Equations. Branching Processes. Random Walk in. Exercises.
 Exponential Distribution and Poisson Process: Continuous Random Variables. Cumulative Distribution Function (Continuous Case). Exponential Distribution. o(h) Functions. Exponential Distribution as a Model for Arrivals. Memoryless Random Variables. Poisson Process. Poisson Processes with Occurrences of Two Types. Exercises.
 Continuous-Time Markov Chains: Introduction. Generators of Continuous Markov Chains: The Kolmogorov Forward and Backward Equations. Connection Between the Steady State of a Continuous Markov Chain and the Steady State of the Embedded Matrix. Explosions. Birth and Birth-Death Processes. Birth and Death Processes. Queuing Models. Detailed Balance Equations. Exercises.
 Reversible Markov Chains: Random Walks on Weighted Graphs. Discrete-Time Birth-Death Process as a Reversible Markov Chain. Continuous-Time Reversible Markov Chains. Exercises.
 Bibliography.
 (source: Nielsen Book Data)
Science Library (Li and Ma)
Stacks: QA274.7 .K58 2015 (status: Unknown)
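The "central problem" named in the table of contents above, finding the equilibrium state of a finite chain, can be sketched in a few lines: iterate pi <- pi P until the distribution stops changing. The transition matrix here is an invented three-state example, not one from the book.

```python
def stationary(P, iters=500):
    """Approximate the equilibrium distribution pi satisfying pi = pi P
    by power iteration, for a small row-stochastic matrix P given as
    nested lists. Assumes the chain has a unique equilibrium state."""
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Illustrative 3-state birth-death chain (numbers are made up).
P = [[0.50, 0.50, 0.00],
     [0.25, 0.50, 0.25],
     [0.00, 0.50, 0.50]]
pi = stationary(P)
print([round(p, 3) for p in pi])  # -> [0.25, 0.5, 0.25]
```

For this chain the answer can be checked by detailed balance: pi(0)·0.5 = pi(1)·0.25 and pi(1)·0.25 = pi(2)·0.5 give pi = (1/4, 1/2, 1/4).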
 Owen, Art B., author.
 Stanford, Calif. : Department of Statistics, Stanford University, November 2015.
 Description
 Book — 12 pages ; 28 cm.
Special Collections
University Archives: 260511 (in-library use; request via Aeon)
12. Markov chains and dependability theory [2014]
 Rubino, Gerardo, 1955-, author.
 Cambridge ; New York : Cambridge University Press, 2014.
 Description
 Book — viii, 278 pages : illustrations ; 26 cm
 Summary

 1. Introduction
 2. Discrete time Markov chains
 3. Continuous time Markov chains
 4. State aggregation of Markov chains
 5. Sojourn times in subsets of states
 6. Occupation times
 7. Performability
 8. Stationary detection
 9. Simulation of dependability models
 10. Bounding techniques.
 (source: Nielsen Book Data)
Science Library (Li and Ma)
Stacks: QA274.7 .R83 2014 (status: Unknown)
13. Examples in Markov decision processes [2013]
 Piunovskiy, A. B.
 London : Imperial College Press, c2013.
 Description
 Book — xiii, 293 p. : ill. ; 24 cm.
 Summary

 Finite Horizon Models
 Infinite Horizon Models, Expected Total Loss and Discounted Loss
 Long Run Average Loss.
 (source: Nielsen Book Data)
Science Library (Li and Ma)
Stacks: QA274.7 .P58 2013 (status: Unknown)
14. Latent Markov models for longitudinal data [2013]
 Bartolucci, Francesco, author.
 Boca Raton, FL : CRC Press, Taylor & Francis Group, [2013]
 Description
 Book — xix, 234 pages ; 24 cm.
 Summary

 Overview on Latent Markov Modeling Introduction Literature review on latent Markov models Alternative approaches Example datasets
 Background on Latent Variable and Markov Chain Models Introduction Latent variable models Expectation-Maximization algorithm Standard errors Latent class model Selection of the number of latent classes Applications Markov chain model for longitudinal data Applications
 Basic Latent Markov Model Introduction Univariate formulation Multivariate formulation Model identifiability Maximum likelihood estimation Selection of the number of latent states Applications
 Constrained Latent Markov Models Introduction Constraints on the measurement model Constraints on the latent model Maximum likelihood estimation Model selection and hypothesis testing Applications
 Including Individual Covariates and Relaxing Basic Model Assumptions Introduction Notation Covariates in the measurement model Covariates in the latent model Interpretation of the resulting models Maximum likelihood estimation Observed information matrix, identifiability, and standard errors Relaxing local independence Higher order extensions Applications
 Including Random Effects and Extension to Multilevel Data Introduction Random-effects formulation Maximum likelihood estimation Multilevel formulation Application to the student math achievement dataset
 Advanced Topics about Latent Markov Modeling Introduction Dealing with continuous response variables Dealing with missing responses Additional computational issues Decoding and forecasting Selection of the number of latent states
 Bayesian Latent Markov Models Introduction Prior distributions Bayesian inference via reversible jump Alternative sampling Application to the labor market dataset
 Appendix: Software List of Main Symbols Bibliography Index.
 (source: Nielsen Book Data)
Science Library (Li and Ma)
Stacks: QA274.7 .B375 2013 (status: Unknown)
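The maximum likelihood machinery that the latent Markov record above refers to rests on the forward recursion for the likelihood of an observation sequence given a latent (hidden) Markov chain. A minimal sketch, with an invented two-state, two-symbol model (none of the numbers or names come from the book):

```python
def forward_likelihood(pi, A, B, obs):
    """Likelihood of an observation sequence under a latent Markov model
    via the forward recursion:
        alpha_1(j) = pi[j] * B[j][obs[0]]
        alpha_t(j) = (sum_i alpha_{t-1}(i) * A[i][j]) * B[j][obs[t]]."""
    n = len(pi)
    alpha = [pi[j] * B[j][obs[0]] for j in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)

# Illustrative 2-state, 2-symbol model.
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]  # latent-state transition probabilities
B = [[0.9, 0.1], [0.2, 0.8]]  # emission probabilities per state
print(forward_likelihood(pi, A, B, [0, 1, 0]))
```

The same alpha quantities are what an Expectation-Maximization fit reuses in its E-step, which is why the recursion is central to estimation in these models.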
 Sericola, Bruno.
 Hoboken, NJ : Wiley ; London : ISTE, Ltd., c2013.
 Description
 Book — 1 online resource.
 Summary

 Preface ix
 Chapter 1. Discrete-Time Markov Chains 1 1.1. Definitions and properties 1 1.2. Strong Markov property 5 1.3. Recurrent and transient states 8 1.4. State classification 12 1.5. Visits to a state 14 1.6. State space decomposition 18 1.7. Irreducible and recurrent Markov chains 22 1.8. Aperiodic Markov chains 30 1.9. Convergence to equilibrium 34 1.10. Ergodic theorem 41 1.11. First passage times and number of visits 53 1.12. Finite Markov chains 68 1.13. Absorbing Markov chains 70 1.14. Examples 76 1.15. Bibliographical notes 87
 Chapter 2. Continuous-Time Markov Chains 89 2.1. Definitions and properties 92 2.2. Transition functions and infinitesimal generator 93 2.3. Kolmogorov's backward equation 108 2.4. Kolmogorov's forward equation 114 2.5. Existence and uniqueness of the solutions 127 2.6. Recurrent and transient states 130 2.7. State classification 137 2.8. Explosion 141 2.9. Irreducible and recurrent Markov chains 148 2.10. Convergence to equilibrium 162 2.11. Ergodic theorem 166 2.12. First passage times 172 2.13. Absorbing Markov chains 184 2.14. Bibliographical notes 190
 Chapter 3. Birth-and-Death Processes 191 3.1. Discrete-time birth-and-death processes 191 3.2. Absorbing discrete-time birth-and-death processes 200 3.3. Periodic discrete-time birth-and-death processes 208 3.4. Continuous-time pure birth processes 209 3.5. Continuous-time birth-and-death processes 213 3.6. Absorbing continuous-time birth-and-death processes 228 3.7. Bibliographical notes 233
 Chapter 4. Uniformization 235 4.1. Introduction 235 4.2. Banach spaces and algebra 237 4.3. Infinite matrices and vectors 243 4.4. Poisson process 249 4.5. Uniformizable Markov chains 263 4.6. First passage time to a subset of states 273 4.7. Finite Markov chains 275 4.8. Transient regime 276 4.9. Bibliographical notes 286
 Chapter 5. Queues 287 5.1. The M/M/1 queue 288 5.2. The M/M/c queue 315 5.3. The M/M/∞ queue 318 5.4. Phase-type distributions 323 5.5. Markovian arrival processes 326 5.6. Batch Markovian arrival process 342 5.7. Block-structured Markov chains 352 5.8. Applications 370 5.9. Bibliographical notes 380
 Appendix 1 Basic Results 381 Bibliography 387 Index 395.
 (source: Nielsen Book Data)
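The uniformization technique named in Chapter 4 of the record above computes the transient distribution of a continuous-time chain by mixing the powers of a uniformized discrete-time matrix with Poisson weights: with P = I + Q/Λ for Λ at least the largest exit rate, p(t) = Σ_k e^{-Λt}(Λt)^k/k! · p(0)Pᵏ. A minimal sketch with an invented two-state generator (all rates and names are illustrative):

```python
import math

def uniformized_transient(Q, p0, t, lam=None, K=60):
    """Transient distribution p(t) of a CTMC with generator Q (nested
    lists) via uniformization: P = I + Q/lam, then a Poisson-weighted
    sum of p0 @ P^k truncated at K terms. Requires lam >= max exit rate."""
    n = len(Q)
    if lam is None:
        lam = max(-Q[i][i] for i in range(n))
    # Uniformized DTMC transition matrix P = I + Q/lam (row stochastic).
    P = [[(1.0 if i == j else 0.0) + Q[i][j] / lam for j in range(n)]
         for i in range(n)]
    v = list(p0)        # holds p0 @ P^k, updated each iteration
    pt = [0.0] * n
    for k in range(K + 1):
        w = math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)
        for j in range(n):
            pt[j] += w * v[j]
        v = [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pt

# Two-state chain: rate a from state 0 to 1, rate b back (made-up values).
a, b = 2.0, 1.0
Q = [[-a, a], [b, -b]]
pt = uniformized_transient(Q, [1.0, 0.0], t=5.0)
# The exact limit is (b/(a+b), a/(a+b)) = (1/3, 2/3); at t = 5 we are close.
print([round(p, 3) for p in pt])
```

The truncation level K controls the error through the Poisson tail beyond K jumps, which is the usual accuracy knob for this method.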
 Sericola, Bruno, author.
 London : ISTE Ltd ; Hoboken, NJ : John Wiley & Sons, Inc., 2013.
 Description
 Book — xi, 397 pages : illustrations ; 24 cm.
 Summary

 Preface ix
 Chapter 1. Discrete-Time Markov Chains 1 1.1. Definitions and properties 1 1.2. Strong Markov property 5 1.3. Recurrent and transient states 8 1.4. State classification 12 1.5. Visits to a state 14 1.6. State space decomposition 18 1.7. Irreducible and recurrent Markov chains 22 1.8. Aperiodic Markov chains 30 1.9. Convergence to equilibrium 34 1.10. Ergodic theorem 41 1.11. First passage times and number of visits 53 1.12. Finite Markov chains 68 1.13. Absorbing Markov chains 70 1.14. Examples 76 1.15. Bibliographical notes 87
 Chapter 2. Continuous-Time Markov Chains 89 2.1. Definitions and properties 92 2.2. Transition functions and infinitesimal generator 93 2.3. Kolmogorov's backward equation 108 2.4. Kolmogorov's forward equation 114 2.5. Existence and uniqueness of the solutions 127 2.6. Recurrent and transient states 130 2.7. State classification 137 2.8. Explosion 141 2.9. Irreducible and recurrent Markov chains 148 2.10. Convergence to equilibrium 162 2.11. Ergodic theorem 166 2.12. First passage times 172 2.13. Absorbing Markov chains 184 2.14. Bibliographical notes 190
 Chapter 3. Birth-and-Death Processes 191 3.1. Discrete-time birth-and-death processes 191 3.2. Absorbing discrete-time birth-and-death processes 200 3.3. Periodic discrete-time birth-and-death processes 208 3.4. Continuous-time pure birth processes 209 3.5. Continuous-time birth-and-death processes 213 3.6. Absorbing continuous-time birth-and-death processes 228 3.7. Bibliographical notes 233
 Chapter 4. Uniformization 235 4.1. Introduction 235 4.2. Banach spaces and algebra 237 4.3. Infinite matrices and vectors 243 4.4. Poisson process 249 4.5. Uniformizable Markov chains 263 4.6. First passage time to a subset of states 273 4.7. Finite Markov chains 275 4.8. Transient regime 276 4.9. Bibliographical notes 286
 Chapter 5. Queues 287 5.1. The M/M/1 queue 288 5.2. The M/M/c queue 315 5.3. The M/M/∞ queue 318 5.4. Phase-type distributions 323 5.5. Markovian arrival processes 326 5.6. Batch Markovian arrival process 342 5.7. Block-structured Markov chains 352 5.8. Applications 370 5.9. Bibliographical notes 380
 Appendix 1 Basic Results 381 Bibliography 387 Index 395.
 (source: Nielsen Book Data)
 Online
Science Library (Li and Ma)
Stacks: QA274.7 .S47 2013 (status: Checked out)
 Privault, Nicolas, author.
 Singapore : Springer, 2013.
 Description
 Book — 1 online resource. Digital: text file; PDF.
 Summary

 Introduction
 Probability Background
 Gambling Problems
 Random Walks
 DiscreteTime Markov Chains
 First Step Analysis
 Classification of States
 LongRun Behavior of Markov Chains
 Branching Processes
 ContinuousTime Markov Chains
 DiscreteTime Martingales
 Spatial Poisson Processes
 Reliability Theory.
(source: Nielsen Book Data)
 PrietoRumeau, Tomás.
 London : Imperial College Press ; Singapore ; Hackensack, N.J. : World Scientific [distributor], c2012.
 Description
 Book — xi, 279 p. : ill ; 24 cm.
 Summary

 Introduction
 Controlled Markov Chains
 Basic Optimality Criteria
 Policy Iteration and Approximation Theorems
 Overtaking, Bias, and Variance Optimality
 Sensitive Discount Optimality
 Blackwell Optimality
 Constrained Controlled Markov Chains
 Applications
 ZeroSum Markov Games
 Bias and Overtaking Equilibria for Markov Games.
 (source: Nielsen Book Data)
Science Library (Li and Ma)
Stacks: QA274.7 .P75 2012 (status: Unknown)
 Chen, ZhenQing.
 Princeton : Princeton University Press, c2012.
 Description
 Book — xv, 479 p. : ill. ; 25 cm.
 Summary

 Front Matter (pg. i). Contents (pg. vii). Notation (pg. ix). Preface (pg. xi). Chapter One. Symmetric Markovian Semigroups and Dirichlet Forms (pg. 1). Chapter Two. Basic Properties and Examples of Dirichlet Forms (pg. 37). Chapter Three. Symmetric Hunt Processes and Regular Dirichlet Forms (pg. 92). Chapter Four. Additive Functionals of Symmetric Markov Processes (pg. 130). Chapter Five. Time Changes of Symmetric Markov Processes (pg. 166). Chapter Six. Reflected Dirichlet Spaces (pg. 240). Chapter Seven. Boundary Theory for Symmetric Markov Processes (pg. 300). Appendix A. Essentials of Markov Processes (pg. 391). Appendix B. Solutions to Exercises (pg. 443). Notes (pg. 451). Bibliography (pg. 457). Catalogue of Some Useful Theorems (pg. 467). Index (pg. 473).
 (source: Nielsen Book Data)
Science Library (Li and Ma)
Stacks: QA274.7 .C468 2012 (status: Unknown)
20. Labelled Markov processes [2009]
 Panangaden, P. (Prakash)
 London : Imperial College Press ; Singapore ; Hackensack, NJ : Distributed by World Scientific Pub., c2009.
 Description
 Book — xii, 199 p. : ill. ; 24 cm.
 Summary

 Introduction
 Measure Theory
 Integration
 The Radon-Nikodym Theorem
 A Category of Stochastic Relations
 Probability Theory on Continuous Spaces
 Bisimulation for Labelled Markov Processes
 Metrics for Labelled Markov Processes
 Approximating Labelled Markov Processes.
 (source: Nielsen Book Data)
Science Library (Li and Ma)
Stacks: QA274.7 .P36 2009 (status: Unknown)