1. Learning in energy-efficient neuromorphic computing : algorithm and architecture co-design [2020]
- Zheng, Nan, 1989- author.
- Hoboken, NJ : Wiley-IEEE Press, 2020.
- Description
- Book — 1 online resource (xx, 276 pages)
- Summary
-
- Preface xi
- Acknowledgment xix
- 1 Overview 1
- 1.1 History of Neural Networks 1
- 1.2 Neural Networks in Software 2
- 1.2.1 Artificial Neural Network 2
- 1.2.2 Spiking Neural Network 3
- 1.3 Need for Neuromorphic Hardware 3
- 1.4 Objectives and Outlines of the Book 5
- References 8
- 2 Fundamentals and Learning of Artificial Neural Networks 11
- 2.1 Operational Principles of Artificial Neural Networks 11
- 2.1.1 Inference 11
- 2.1.2 Learning 13
- 2.2 Neural Network Based Machine Learning 16
- 2.2.1 Supervised Learning 17
- 2.2.2 Reinforcement Learning 20
- 2.2.3 Unsupervised Learning 22
- 2.2.4 Case Study: Action-Dependent Heuristic Dynamic Programming 23
- 2.2.4.1 Actor-Critic Networks 24
- 2.2.4.2 On-Line Learning Algorithm 25
- 2.2.4.3 Virtual Update Technique 27
- 2.3 Network Topologies 31
- 2.3.1 Fully Connected Neural Networks 31
- 2.3.2 Convolutional Neural Networks 32
- 2.3.3 Recurrent Neural Networks 35
- 2.4 Dataset and Benchmarks 38
- 2.5 Deep Learning 41
- 2.5.1 Pre-Deep-Learning Era 41
- 2.5.2 The Rise of Deep Learning 41
- 2.5.3 Deep Learning Techniques 42
- 2.5.3.1 Performance-Improving Techniques 42
- 2.5.3.2 Energy-Efficiency-Improving Techniques 46
- 2.5.4 Deep Neural Network Examples 50
- References 53
- 3 Artificial Neural Networks in Hardware 61
- 3.1 Overview 61
- 3.2 General-Purpose Processors 62
- 3.3 Digital Accelerators 63
- 3.3.1 A Digital ASIC Approach 63
- 3.3.1.1 Optimization on Data Movement and Memory Access 63
- 3.3.1.2 Scaling Precision 71
- 3.3.1.3 Leveraging Sparsity 76
- 3.3.2 FPGA-Based Accelerators 80
- 3.4 Analog/Mixed-Signal Accelerators 82
- 3.4.1 Neural Networks in Conventional Integrated Technology 82
- 3.4.1.1 In/Near-Memory Computing 82
- 3.4.1.2 Near-Sensor Computing 85
- 3.4.2 Neural Network Based on Emerging Non-volatile Memory 88
- 3.4.2.1 Crossbar as a Massively Parallel Engine 89
- 3.4.2.2 Learning in a Crossbar 91
- 3.4.3 Optical Accelerator 93
- 3.5 Case Study: An Energy-Efficient Accelerator for Adaptive Dynamic Programming 94
- 3.5.1 Hardware Architecture 95
- 3.5.1.1 On-Chip Memory 95
- 3.5.1.2 Datapath 97
- 3.5.1.3 Controller 99
- 3.5.2 Design Examples 101
- References 108
- 4 Operational Principles and Learning in Spiking Neural Networks 119
- 4.1 Spiking Neural Networks 119
- 4.1.1 Popular Spiking Neuron Models 120
- 4.1.1.1 Hodgkin-Huxley Model 120
- 4.1.1.2 Leaky Integrate-and-Fire Model 121
- 4.1.1.3 Izhikevich Model 121
- 4.1.2 Information Encoding 122
- 4.1.3 Spiking Neuron versus Non-Spiking Neuron 123
- 4.2 Learning in Shallow SNNs 124
- 4.2.1 ReSuMe 124
- 4.2.2 Tempotron 125
- 4.2.3 Spike-Timing-Dependent Plasticity 127
- 4.2.4 Learning Through Modulating Weight-Dependent STDP in Two-Layer Neural Networks 131
- 4.2.4.1 Motivations 131
- 4.2.4.2 Estimating Gradients with Spike Timings 131
- 4.2.4.3 Reinforcement Learning Example 135
- 4.3 Learning in Deep SNNs 146
- 4.3.1 SpikeProp 146
- 4.3.2 Stack of Shallow Networks 147
- 4.3.3 Conversion from ANNs 148
- 4.3.4 Recent Advances in Backpropagation for Deep SNNs 150
- 4.3.5 Learning Through Modulating Weight-Dependent STDP in Multilayer Neural Networks 151
- 4.3.5.1 Motivations 151
- 4.3.5.2 Learning Through Modulating Weight-Dependent STDP 151
- 4.3.5.3 Simulation Results 158
- References 167
- 5 Hardware Implementations of Spiking Neural Networks 173
- 5.1 The Need for Specialized Hardware 173
- 5.1.1 Address-Event Representation 173
- 5.1.2 Event-Driven Computation 174
- 5.1.3 Inference with a Progressive Precision 175
- 5.1.4 Hardware Considerations for Implementing the Weight-Dependent STDP Learning Rule 181
- 5.1.4.1 Centralized Memory Architecture 182
- 5.1.4.2 Distributed Memory Architecture 183
- 5.2 Digital SNNs 186
- 5.2.1 Large-Scale SNN ASICs 186
- 5.2.1.1 SpiNNaker 186
- 5.2.1.2 TrueNorth 187
- 5.2.1.3 Loihi 191
- 5.2.2 Small/Moderate-Scale Digital SNNs 192
- 5.2.2.1 Bottom-Up Approach 192
- 5.2.2.2 Top-Down Approach 193
- 5.2.3 Hardware-Friendly Reinforcement Learning in SNNs 194
- 5.2.4 Hardware-Friendly Supervised Learning in Multilayer SNNs 199
- 5.2.4.1 Hardware Architecture 199
- 5.2.4.2 CMOS Implementation Results 205
- 5.3 Analog/Mixed-Signal SNNs 210
- 5.3.1 Basic Building Blocks 210
- 5.3.2 Large-Scale Analog/Mixed-Signal CMOS SNNs 211
- 5.3.2.1 CAVIAR 211
- 5.3.2.2 BrainScaleS 214
- 5.3.2.3 Neurogrid 215
- 5.3.3 Other Analog/Mixed-Signal CMOS SNN ASICs 216
- 5.3.4 SNNs Based on Emerging Nanotechnologies 216
- 5.3.4.1 Energy-Efficient Solutions 217
- 5.3.4.2 Synaptic Plasticity 218
- 5.3.5 Case Study: Memristor Crossbar Based Learning in SNNs 220
- 5.3.5.1 Motivations 220
- 5.3.5.2 Algorithm Adaptations 222
- 5.3.5.3 Non-idealities 231
- 5.3.5.4 Benchmarks 238
- References 238
- 6 Conclusions 247
- 6.1 Outlooks 247
- 6.1.1 Brain-Inspired Computing 247
- 6.1.2 Emerging Nanotechnologies 249
- 6.1.3 Reliable Computing with Neuromorphic Systems 250
- 6.1.4 Blending of ANNs and SNNs 251
- 6.2 Conclusions 252
- References 253
- A Appendix 257
- A.1 Hopfield Network 257
- A.2 Memory Self-Repair with Hopfield Network 258
- References 266
- Index 269.
- (source: Nielsen Book Data)
- Hoboken : Wiley, 2013.
- Description
- Book — 1 online resource (310 p.)
- Summary
-
- Preface xv
- 1 Application Fields and Fundamental Merits 1 Akira Hirose 1.1 Introduction 1 1.2 Applications of Complex-Valued Neural Networks 2 1.3 What is a complex number? 5 1.4 Complex numbers in feedforward neural networks 8 1.5 Metric in complex domain 12 1.6 Experiments to elucidate the generalization characteristics 16 1.7 Conclusions 26
- 2 Neural System Learning on Complex-Valued Manifolds 33 Simone Fiori 2.1 Introduction 34 2.2 Learning Averages over the Lie Group of Unitary Matrices 35 2.3 Riemannian-Gradient-Based Learning on the Complex Matrix-Hypersphere 41 2.4 Complex ICA Applied to Telecommunications 49 2.5 Conclusion 53
- 3 N-Dimensional Vector Neuron and Its Application to the N-Bit Parity Problem 59 Tohru Nitta 3.1 Introduction 59 3.2 Neuron Models with High-Dimensional Parameters 60 3.3 N-Dimensional Vector Neuron 65 3.4 Discussion 69 3.5 Conclusion 70
- 4 Learning Algorithms in Complex-Valued Neural Networks using Wirtinger Calculus 75 Md. Faijul Amin and Kazuyuki Murase 4.1 Introduction 76 4.2 Derivatives in Wirtinger Calculus 78 4.3 Complex Gradient 80 4.4 Learning Algorithms for Feedforward CVNNs 82 4.5 Learning Algorithms for Recurrent CVNNs 91 4.6 Conclusion 99
- 5 Quaternionic Neural Networks for Associative Memories 103 Teijiro Isokawa, Haruhiko Nishimura, and Nobuyuki Matsui 5.1 Introduction 104 5.2 Quaternionic Algebra 105 5.3 Stability of Quaternionic Neural Networks 108 5.4 Learning Schemes for Embedding Patterns 124 5.5 Conclusion 128
- 6 Models of Recurrent Clifford Neural Networks and Their Dynamics 133 Yasuaki Kuroe 6.1 Introduction 134 6.2 Clifford Algebra 134 6.3 Hopfield-Type Neural Networks and Their Energy Functions 137 6.4 Models of Hopfield-Type Clifford Neural Networks 139 6.5 Definition of Energy Functions 140 6.6 Existence Conditions of Energy Functions 142 6.7 Conclusion 149
- 7 Meta-cognitive Complex-valued Relaxation Network and its Sequential Learning Algorithm 153 Ramasamy Savitha, Sundaram Suresh, and Narasimhan Sundararajan 7.1 Meta-cognition in Machine Learning 154 7.2 Meta-cognition in Complex-valued Neural Networks 156 7.3 Meta-cognitive Fully Complex-valued Relaxation Network 164 7.4 Performance Evaluation of McFCRN: Synthetic Complex-valued Function Approximation Problem 171 7.5 Performance Evaluation of McFCRN: Real-valued Classification Problems 172 7.6 Conclusion 178
- 8 Multilayer Feedforward Neural Network with Multi-Valued Neurons for Brain-Computer Interfacing 185 Nikolay V. Manyakov, Igor Aizenberg, Nikolay Chumerin, and Marc M. Van Hulle 8.1 Brain-Computer Interface (BCI) 185 8.2 BCI Based on Steady-State Visual Evoked Potentials 188 8.3 EEG Signal Preprocessing 192 8.4 Decoding Based on MLMVN for Phase-Coded SSVEP BCI 196 8.5 System Validation 201 8.6 Discussion 203
- 9 Complex-Valued B-Spline Neural Networks for Modeling and Inverse of Wiener Systems 209 Xia Hong, Sheng Chen and Chris J. Harris 9.1 Introduction 210 9.2 Identification and Inverse of Complex-Valued Wiener Systems 211 9.3 Application to Digital Predistorter Design 222 9.4 Conclusions 229
- 10 Quaternionic Fuzzy Neural Network for View-invariant Color Face Image Recognition 235 Wai Kit Wong, Gin Chong Lee, Chu Kiong Loo, Way Soong Lim, and Raymond Lock 10.1 Introduction 236 10.2 Face Recognition System 238 10.3 Quaternion-Based View-invariant Color Face Image Recognition 244 10.4 Enrollment Stage and Recognition Stage for Quaternion-Based Color Face Image Correlator 255 10.5 Max-Product Fuzzy Neural Network Classifier 260 10.6 Experimental Results 266 10.7 Conclusion and Future Research Directions 274 References 274 Index 279.
- (source: Nielsen Book Data)
- Italian Workshop on Neural Nets (22nd : 2012 : Vietri sul Mare, Italy)
- Berlin ; New York : Springer, c2013.
- Description
- Book — 1 online resource (444 p.)
- Summary
-
- Algorithms.- Signal Processing.- Applications.- Special Session on "Smart Grids: new frontiers and challenges".- Special Session on "Computational Intelligence in Emotional or Affective Systems".
- (source: Nielsen Book Data)
- New York : Nova Science Publishers, Inc., [2012]
- Description
- Book — 1 online resource
- Summary
-
- Preface
- Intelligent Market: A Brain-Computer Interface for Analyzing Investment Behavior & Market Stability
- Neural-Based Image Segmentation Architecture with Execution on a GPU
- Learning for Combining Fingerprint Matchers: A Case Study FVC-onGoing
- Comparison of EEG Montages for Diagnosis of Alzheimer's Disease using Spectral Features & Support Vector Machines
- Portfolio Optimization with Dimension Reduction Techniques: A Comprehensive Simulation Study
- Design & Training of Neural Architectures using Extreme Learning Machines
- Systematic Comparisons of Single- & Multiple-Hidden-Layer Neural Networks
- Learning Multiple ICA Modules for Distributed & Factorial Neural Representations
- Ontologies & Neural Associative Networks
- Index.
- (source: Nielsen Book Data)
- Apolloni, Bruno.
- Berlin, Heidelberg : Springer Berlin Heidelberg : Imprint : Springer, 2013.
- Description
- Book — 1 online resource (XII, 464 pages 169 illustrations)
- Summary
-
- Algorithms.- Signal Processing.- Applications.- Special Session on "Smart Grids: new frontiers and challenges".- Special Session on "Computational Intelligence in Emotional or Affective Systems".
- (source: Nielsen Book Data)
- Kattan, Ali.
- New York : Novinka/Nova Science Publishers, Inc., [2011]
- Description
- Book — 1 online resource.
- Summary
-
- Introduction
- Feed-Forward Neural Networks
- FFANN Software Simulation
- FFANN Training Concept
- Trajectory-Driven Training Paradigm
- Evolutionary-Based Training Paradigm
- FFANN Simulation Utilizing Graphic-Processing Units
- Conclusion
- Index.
- (source: Nielsen Book Data)
7. Artificial neural networks [2011]
- Hauppauge, N.Y. : Nova Science Publishers, c2011.
- Description
- Book — 1 online resource
- Summary
-
- Preface
- Artificial Neural Network Modeling of Water & Wastewater Treatment Processes
- Recent Advances & Challenges in the Application of Artificial Neural Networks (ANN) in Neurological Sciences: An Overview
- Different Types of Applications Performed with Different Types of Neural Networks
- Application of Artificial Neural Networks in Wire Electro-Discharge Machining (WEDM)
- Artificial Neural Network Training & Software Implementation Techniques
- Artificial Neural Networks in Small-Signal & Noise Modeling of Microwave Transistors
- Parameter Extraction of Advanced Semiconductor Detectors with Artificial Neural Network
- Processing of Impedance Data Records using Artificial Neural Networks
- Identification of Patients Based on Spectral Analysis of Heart Rate Variability using Artificial Neural Networks
- Modelling the Ultrafiltration of Protein Solutions by Artificial Neural Networks
- Artificial Neural Networks in the Optimization of Microemulsion Liquid Chromatography Retention & Resolution
- Index.
- (source: Nielsen Book Data)
- Baragona, Roberto.
- Heidelberg ; New York : Springer, ©2011.
- Description
- Book — 1 online resource (xi, 276 pages) Digital: text file.PDF.
- Summary
-
- Bio-inspired Optimization Methods
- Topics Organization
- Evolutionary Computation
- Evolutionary Computation Methods
- Properties of Genetic Algorithms
- Evolving Regression Models
- Identification
- Parameter Estimation
- Independent Component Analysis
- Time Series Linear and Nonlinear Models
- Models of Time Series
- Autoregressive Moving Average Models
- Nonlinear Models
- Design of Experiments
- Experiments and Design of Experiments.
- The Evolutionary Design of Experiments
- The Evolutionary Model-Based Experimental Design: The Statistical Models in the Evolution
- Outliers
- Outliers in Independent Data
- Outliers in Time Series
- Genetic Algorithms for Multiple Outlier Detection
- Cluster Analysis
- Partitioning Problem
- Genetic Clustering Algorithms
- Fuzzy Partition
- Multivariate Mixture Models Estimation by Evolutionary Computing
- Genetic Algorithms in Classification and Regression Trees Models
- Clusters of Time Series and Directional Data
- Multiobjective Genetic Clustering
- References.
9. Focus on artificial neural networks [2011]
- New York : Nova Science Publishers, ©2011.
- Description
- Book — 1 online resource (xiv, 410 pages) : illustrations Digital: data file.
- Summary
-
- Preface
- Application of Artificial Neural Networks (ANNs) in Development of Pharmaceutical Microemulsions
- Investigations of Application of Artificial Neural Network for Flow Shop Scheduling Problems
- Artificial Neural Networks in Environmental Sciences & Chemical Engineering
- Establishing Productivity Indices for Wheat in the Argentine Pampas by an Artificial Neural Network Approach
- Design of Artificial Neural Network Predictors in Mechanical Systems Problems
- Massive-Training Artificial Neural Networks for Supervised Enhancement/Suppression of Lesions/Patterns in Medical Images
- An Inverse Neu
- Artificial Neural Networks: Definition, Properties & Misuses
- Evidences of New Biophysical Properties of Microtubules
- Forecasting Stream Temperature Using Adaptive Neuron-Fuzzy Logic & Artificial Neural Network Models
- Neural Network Applications in Modern Induction Machine Control Systems
- Wavelet Neural Networks: A Recent Strategy for Processing Complex Signals: Applications to Chemistry
- Robustness Verification of Artificial Neural Network Predictors in a Purpose-Built Data Compression Scheme
- Intelligent Inverse Kinematics Solution for Serial Manipulators Passing through Singular Configurations with Performance Prediction Network
- Using Artificial Neural Networks for Continuously Decreasing Time Series Data Forecasting
- Application of Artificial Neural Networks in Enzyme Technology
- Development of an ANN Model for Runoff Prediction
- Artificial Neural Networks Concept: Tools to Simulate, Predict & Control Processes
- Index.
- (source: Nielsen Book Data)
- Coolen, A. C. C. (Anthony C. C.), 1960-
- Oxford : Oxford University Press, 2005.
- Description
- Book — 1 online resource (xvi, 569 pages) : illustrations
- Summary
-
- pt. I Introduction to neural networks
- 1. General introduction
- 2. Layered networks
- 3. Recurrent networks with binary neurons
- 4. Notes and suggestions for further reading
- pt. II Advanced neural networks
- 5. Competitive unsupervised learning processes
- 6. Bayesian techniques in supervised learning
- 7. Gaussian processes
- 8. Support vector machines for binary classification
- 9. Notes and suggestions for further reading
- pt. III Information theory and neural networks
- 10. Measuring information
- 11. Identification of entropy as an information measure
- 12. Building blocks of Shannon's information theory
- 13. Information theory and statistical inference
- 14. Applications to neural networks
- 15. Notes and suggestions for further reading
- pt. IV Macroscopic analysis of dynamics
- 16. Network operation : macroscopic dynamics
- 17. Dynamics of online learning in binary perceptrons
- 18. Dynamics of online gradient descent learning
- 19. Notes and suggestions for further reading
- pt. V Equilibrium statistical mechanics of neural networks
- 20. Basics of equilibrium statistical mechanics
- 21. Network operation : equilibrium analysis
- 22. Gardner theory of task realizability
- 23. Notes and suggestions for further reading
- App. A Probability theory in a nutshell
- App. B Conditions for the central limit theorem to apply
- App. C Some simple summation identities
- App. D Gaussian integrals and probability distributions
- App. E Matrix identities
- App. F δ-distribution
- App. G Inequalities based on convexity
- App. H Metrics for parametrized probability distributions.
- (source: Nielsen Book Data)
- Berlin ; London : Springer, 2007.
- Description
- Book — 1 online resource (xxiii, 605 pages) : illustrations Digital: text file.PDF.
- Summary
-
- Optimum Tracking in Dynamic Environments.- Explicit Memory Schemes for Evolutionary Algorithms in Dynamic Environments.- Particle Swarm Optimization in Dynamic Environments.- Evolution Strategies in Dynamic Environments.- Orthogonal Dynamic Hill Climbing Algorithm: ODHC.- Genetic Algorithms with Self-Organizing Behaviour in Dynamic Environments.- Learning and Anticipation in Online Dynamic Optimization.- Evolutionary Online Data Mining: An Investigation in a Dynamic Environment.- Adaptive Business Intelligence: Three Case Studies.- Evolutionary Algorithms for Combinatorial Problems in the Uncertain Environment of the Wireless Sensor Networks.- Approximation of Fitness Functions.- Individual-based Management of Meta-models for Evolutionary Optimization with Application to Three-Dimensional Blade Optimization.- Evolutionary Shape Optimization Using Gaussian Processes.- A Study of Techniques to Improve the Efficiency of a Multi-Objective Particle Swarm Optimizer.- An Evolutionary Multi-objective Adaptive Meta-modeling Procedure Using Artificial Neural Networks.- Surrogate Model-Based Optimization Framework: A Case Study in Aerospace Design.- Handling Noisy Fitness Functions.- Hierarchical Evolutionary Algorithms and Noise Compensation via Adaptation.- Evolving Multi Rover Systems in Dynamic and Noisy Environments.- A Memetic Algorithm Using a Trust-Region Derivative-Free Optimization with Quadratic Modelling for Optimization of Expensive and Noisy Black-box Functions.- Genetic Algorithm to Optimize Fitness Function with Sampling Error and its Application to Financial Optimization Problem.- Search for Robust Solutions.- Single/Multi-objective Inverse Robust Evolutionary Design Methodology in the Presence of Uncertainty.- Evolving the Tradeoffs between Pareto-Optimality and Robustness in Multi-Objective Evolutionary Algorithms.- Evolutionary Robust Design of Analog Filters Using Genetic Programming.- Robust Salting Route Optimization Using Evolutionary Algorithms.- An Evolutionary Approach For Robust Layout Synthesis of MEMS.- A Hybrid Approach Based on Evolutionary Strategies and Interval Arithmetic to Perform Robust Designs.- An Evolutionary Approach for Assessing the Degree of Robustness of Solutions to Multi-Objective Models.- Deterministic Robust Optimal Design Based on Standard Crowding Genetic Algorithm.
- (source: Nielsen Book Data)
- Ng, G. W. (Gee Wah), 1964-
- Hackensack, NJ : World Scientific, ©2009.
- Description
- Book — 1 online resource (xii, 371 pages) : illustrations (some color) Digital: data file.
- Summary
-
- Brain: Center of Attraction
- Neurons and Synapses: Key to Intelligence
- Cortex Architecture: Building Block of Intelligence
- Many Faces of Memory: Investigating the Human Multiple Memory Systems
- Learn Like a Human: How Does Learning Take Place in Our Brain?
- How Emotions Give Rise to Cognition
- Laminar Computing: How, What and Why?
- Probabilistic Computing: The Bayesian Mind
- Thinking Machine: Higher Theories of Brain and Commonsense Knowledge Generation
- Modeling the Entire Brain Architecture: Biologically Inspired Cognitive Architecture
- Are We There? What Can Computers Do Today and Tomorrow?
- Brain: A Forest Not Totally Explored. What Are the Issues?
- Understanding the Brain to Build Intelligent Systems: Design Principles
- Conclusions: Theory of the Mind, His and Mine.
- (source: Nielsen Book Data)
13. Quantum neural computation [2010]
- Ivancevic, Vladimir G.
- Dordrecht ; New York : Springer, ©2010.
- Description
- Book — 1 online resource (xv, 929 pages) : illustrations Digital: data file.
- Summary
-
- 1 Introduction 1.1 Neurodynamics 1.2 Quantum Computation 1.3 Discrete Quantum Computers 1.4 Topological Quantum Computers 1.5 Computation at the Edge of Chaos and Quantum Neural Networks 1.6 Adaptive Path Integral: An 1-Dimensional QNN 1.6.1 Computational Partition Function 1.6.2 From Thermodynamics to Quantum Field Theory 1.6.3 1-Dimensional QNNs 1.7 Brain Topology vs. Small-World Topology 1.8 Quantum Brain and Mind 1.8.1 Connectionism, Control Theory and Brain Theory 1.8.2 Neocortical Biophysics 1.8.3 Quantum Neurodynamics 1.8.4 Bi-Stable Perception and Consciousness 1.9 Notational Conventions
- 2 Brain and Classical Neural Networks 2.1 Human Brain 2.1.1 Basics of Brain Physiology 2.1.2 Modern 3D Brain Imaging 2.2 Biological versus Artificial Neural Networks 2.2.1 Common Discrete ANNs 2.2.2 Common Continuous ANNs 2.3 Synchronization in Neurodynamics 2.3.1 Phase Synchronization in Coupled Chaotic Oscillators 2.3.2 Oscillatory Phase Neurodynamics 2.3.3 Kuramoto Synchronization Model 2.3.4 Lyapunov Chaotic Synchronization 2.4 Spike Neural Networks and Wavelet Resonance 2.4.1 Ensemble Neuron Model 2.4.2 Wavelet Neurodynamics 2.4.3 Wavelets of Epileptic Spikes 2.5 Human Motor Control and Learning 2.5.1 Motor Control 2.5.2 Human Memory 2.5.3 Human Learning 2.5.4 Spinal Musculo-Skeletal Control 2.5.5 Cerebellum and Muscular Synergy
- 3 Quantum Theory Basics 3.1 Basics of Non-Relativistic Quantum Mechanics 3.1.1 Soft Introduction to Quantum Mechanics 3.1.2 Quantum States and Operators 3.1.3 The Three Standard Quantum Pictures 3.1.4 Dirac's Probability Amplitude and Perturbation 3.1.5 State-Space for n Non-Relativistic Quantum Particles 3.2 Introduction to Quantum Fields 3.2.1 Amplitude, Relativistic Invariance and Causality 3.2.2 Gauge Theories 3.2.3 Free and Interacting Field Theories 3.2.4 Dirac's Quantum Electrodynamics (QED) 3.2.5 Abelian Higgs Model 3.2.6 Topological Quantum Computation 3.3 The Feynman Path Integral 3.3.1 The Action-Amplitude Formalism 3.3.2 Correlation Functions and Generating Functional 3.3.3 Quantization of the Electromagnetic Field 3.3.4 Wavelet-Based QFT 3.4 The Path-Integral TQFT 3.4.1 Schwarz-Type and Witten-Type Theories 3.4.2 Hodge Decomposition Theorem 3.4.3 Hodge Decomposition and Chern-Simons Theory 3.5 Non-Abelian Gauge Theories 3.5.1 Introduction to Non-Abelian Theories 3.5.2 Yang-Mills Theory 3.5.3 Quantization of Yang-Mills theory 3.5.4 Basics of Conformal Field Theory (CFT)
- 4 Spatio-Temporal Chaos, Solitons and NLS 4.1 Reaction-Diffusion Processes and Ricci Flow 4.1.1 Bio-Reaction-Diffusion Systems 4.1.2 Reactive Neurodynamics 4.1.3 Dissipative Evolution Under the Ricci Flow 4.2 Turbulence and Chaos in PDEs 4.3 Quantum Chaos and Its Control 4.3.1 Quantum Chaos vs. Classical Chaos 4.3.2 Optimal Control of Quantum Chaos 4.4 Solitons 4.4.1 Short History of Solitons 4.4.2 Lie-Poisson Bracket 4.4.3 Solitons and Muscular Contraction 4.5 Dispersive Wave Equations and Stability of Solitons 4.5.1 KdV Solitons 4.5.2 The Inverse Scattering Approach 4.6 Nonlinear Schrödinger Equation (NLS) 4.6.1 Cubic NLS 4.6.2 Nonlinear Wave and Schrödinger Equations 4.6.3 Physical NLS-Derivation 4.6.4 A Compact Attractor for High-Dimensional NLS 4.6.5 Finite-Difference Scheme for NLS 4.6.6 Method of Lines for NLS
- 5 Quantum Brain and Cognition 5.1 Biochemistry of Microtubules 5.2 Kink Soliton Model of MT-Dynamics 5.3 Macro- and Microscopic Neurodynamical Self-Similarity 5.3.1 Open Liouville Equation 5.4 Dissipative Quantum Brain Model 5.5 QED Brain Model 5.6.
- (source: Nielsen Book Data)
14. Neural networks and pattern recognition [1998]
- San Diego, Calif. : Academic Press, ©1998.
- Description
- Book — 1 online resource (xvi, 351 pages) : illustrations
- Summary
-
- Preface. Contributors. J.L. Johnson, H. Ranganath, G. Kuntimad, and H.J. Caulfield, Pulse-Coupled Neural Networks. H. Li and J. Wang, A Neural Network Model for Optical Flow Computation. F. Unal and N. Tepedelenlioglu, Temporal Pattern Matching Using an Artificial Neural Network. J. Dayhoff, P. Palmadesso, F. Richards, and D.-T. Lin, Patterns of Dynamic Activity and Timing in Neural Network Processing. J. Ghosh, H.-J. Chang, and K. Liano, A Macroscopic Model of Oscillation in Ensembles of Inhibitory and Excitatory Neurons. P. Tito, B. Horne, C.L. Giles, and P. Collingwood, Finite State Machines and Recurrent Neural Networks--Automata and Dynamical Systems Approaches. R. Anderson, Biased Random-Walk Learning: A Neurobiological Correlate to Trial-and-Error. A. Nigrin, Using SONNET 1 to Segment Continuous Sequences of Items. K. Venkatesh, A. Pandya, and S. Hsu, On the Use of High Level Petri Nets in the Modeling of Biological Neural Networks. J. Principe, S. Celebi, B. de Vries, and J. Harris, Locally Recurrent Networks: The Gamma Operator, Properties, and Extensions.
- Preface. Contributors. J.L. Johnson, H. Ranganath, G. Kuntimad, and H.J. Caulfield, Pulse-Coupled Neural Networks: Introduction. Basic Model. Multiple Pulses. Multiple Receptive Field Inputs. Time Evolution of Two Cells. Space to Time. Linking Waves and Time Scales. Groups. Invariances. Segmentation. Adaptation. Time to Space. Implementations. Integration into Systems. Concluding Remarks. References.
- H. Li and J. Wang, A Neural Network Model for Optical Flow Computation: Introduction. Theoretical Background. Discussion on the Reformulation. Choosing Regularization Parameters. A Recurrent Neural Network Model. Experiments. Comparison to Other Work. Summary and Discussion. References.
- F. Unal and N. Tepedelenlioglu, Temporal Pattern Matching Using an Artificial Neural Network: Introduction. Solving Optimization Problems Using the Hopfield Network. Dynamic Time Warping Using Hopfield Network. Computer Simulation Results. Conclusions. References.
- J. Dayhoff, P. Palmadesso, F. Richards, and D.-T. Lin, Patterns of Dynamic Activity and Timing in Neural Network Processing: Introduction. Dynamic Networks. Chaotic Attractors and Attractor Locking. Developing Multiple Attractors. Attractor Basins and Dynamic Binary Networks. Time Delay Mechanisms and Attractor Training. Timing of Action Potentials in Impulse Trains. Discussion. Acknowledgments. References.
- J. Ghosh, H.-J. Chang, and K. Liano, A Macroscopic Model of Oscillation in Ensembles of Inhibitory and Excitatory Neurons: Introduction. A Macroscopic Model for Cell Assemblies. Interactions Between Two Neural Groups. Stability of Equilibrium States. Oscillation Frequency Estimation. Experimental Validation. Conclusion. Appendix. References.
- P. Tito, B. Horne, C.L. Giles, and P. Collingwood, Finite State Machines and Recurrent Neural Networks--Automata and Dynamical Systems Approaches: Introduction. State Machines. Dynamical Systems. Recurrent Neural Network. RNN as a State Machine. RNN as a Collection of Dynamical Systems. RNN with Two State Neurons. Experiments--Learning Loops of FSM. Discussion. References.
- R. Anderson, Biased Random-Walk Learning: A Neurobiological Correlate to Trial-and-Error: Introduction. Hebb's Rule. Theoretical Learning Rules. Biological Evidence. Conclusions. Acknowledgments. References and Bibliography.
- A. Nigrin, Using SONNET 1 to Segment Continuous Sequences of Items: Introduction. Learning Isolated and Embedded Spatial Patterns. Storing Items with Decreasing Activity. The LTM Invariance Principle. Using Rehearsal to Process Arbitrarily Long Lists. Implementing the LTM Invariance Principle with an On-Center Off-Surround Circuit. Resetting Items Once They can be Classified. Properties of a Classifying System. Simulations. Discussion.
- K. Venkatesh, A. Pandya, and S. Hsu, On the Use of High Level Petri Nets in the Modeling of Biological Neural Networks: Introduction. Fundamentals of PNs. Modeling of Biological Neural Systems with High Level PNs. New/Modified Elements Added to HPNs to Model BNNs. Example of a BNN: The Olfactory Bulb. Conclusions. References.
- J. Principe, S. Celebi, B. de Vries, and J. Harris, Locally Recurrent Networks: The Gamma Operator, Properties, and Extensions: Introduction. Linear Finite Dimensional Memory Structures. The Gamma Neural Network. Applications of the Gamma Memory. Interpretations of the Gamma Memory. Laguerre and Gamma II Memories. Analog VLSI Implementations of the Gamma Filter. Conclusions. References.
- (source: Nielsen Book Data)
15. Neural nets and chaotic carriers [2010]
- Whittle, Peter, 1927-
- 2nd ed. - London : Imperial College Press ; Hackensack, NJ : Distributed by World Scientific Pub., ©2010.
- Description
- Book — 1 online resource (xii, 230 pages) : illustrations
- Summary
-
- Opening and Themes: Introduction and Aspirations--Optimal Statistical Procedures--Linear Links and Nonlinear Knots: The Basic Neural Net--Bifurcations and Chaos
- Associative and Storage Memories: What is a Memory? The Hamming and Hopfield Nets--Compound and "Spurious" Traces--Preserving Plasticity: A Bayesian Approach--The Key Task: The Fixing of Fading Data, Conclusions I--Performance of the Probability-Maximising Algorithm--Other Memories - Other Considerations
- Oscillatory Operation and the Biological Model: Neuron Models and Neural Masses--Freeman Oscillators - Solo and in Concert--Associative Memories Incorporating the Freeman Oscillator--Olfactory Comparisons, Conclusions II--Transmission Delays--.
- (source: Nielsen Book Data)
16. Neural nets and chaotic carriers [2010]
- Whittle, Peter, 1927-
- 2nd ed. - London : Imperial College Press ; Hackensack, NJ : Distributed by World Scientific Pub., ©2010.
- Description
- Book — 1 online resource (xii, 230 pages) : illustrations
- Summary
-
- Opening and Themes: Introduction and Aspirations--Optimal Statistical Procedures--Linear Links and Nonlinear Knots: The Basic Neural Net--Bifurcations and Chaos
- Associative and Storage Memories: What is a Memory? The Hamming and Hopfield Nets--Compound and "Spurious" Traces--Preserving Plasticity: A Bayesian Approach--The Key Task: The Fixing of Fading Data, Conclusions I--Performance of the Probability-Maximising Algorithm--Other Memories - Other Considerations
- Oscillatory Operation and the Biological Model: Neuron Models and Neural Masses--Freeman Oscillators - Solo and in Concert--Associative Memories Incorporating the Freeman Oscillator--Olfactory Comparisons, Conclusions II--Transmission Delays--.
- (source: Nielsen Book Data)
- Cirrincione, Giansalvo, 1959-
- Hoboken, N.J. : Wiley, ©2010.
- Description
- Book — 1 online resource (xviii, 243 pages, [12] pages of plates) : illustrations (some color).
- Summary
-
- Foreword. Preface.
- 1 The Total Least Squares Problems. 1.1 Introduction. 1.2 Some TLS Applications. 1.3 Preliminaries. 1.4 Ordinary Least Squares Problems. 1.5 Basic TLS Problem. 1.6 Multidimensional TLS Problem. 1.7 Nongeneric Unidimensional TLS Problem. 1.8 Mixed OLS-TLS Problem. 1.9 Algebraic Comparisons Between TLS and OLS. 1.10 Statistical Properties and Validity. 1.11 Basic Data Least Squares Problem. 1.12 The Partial TLS Algorithm. 1.13 Iterative Computation Methods. 1.14 Rayleigh Quotient Minimization Non Neural and Neural Methods.
- 2 The MCA EXIN Neuron. 2.1 The Rayleigh Quotient. 2.2 The Minor Component Analysis. 2.3 The MCA EXIN Linear Neuron. 2.4 The Rayleigh Quotient Gradient Flows. 2.5 The MCA EXIN ODE Stability Analysis. 2.6 Dynamics of the MCA Neurons. 2.7 Fluctuations (Dynamic Stability) and Learning Rate. 2.8 Numerical Considerations. 2.9 TLS Hyperplane Fitting. 2.10 Simulations for the MCA EXIN Neuron. 2.11 Conclusions.
- 3 Variants of the MCA EXIN Neuron. 3.1 High-Order MCA Neurons. 3.2 The Robust MCA EXIN Nonlinear Neuron (NMCA EXIN). 3.3 Extensions of the Neural MCA.
- 4 Introduction to the TLS EXIN Neuron. 4.1 From MCA EXIN to TLS EXIN. 4.2 Deterministic Proof and Batch Mode. 4.3 Acceleration Techniques. 4.4 Comparison with TLS GAO. 4.5 A TLS Application: Adaptive IIR Filtering. 4.6 Numerical Considerations. 4.7 The TLS Cost Landscape: Geometric Approach. 4.8 First Considerations on the TLS Stability Analysis.
- 5 Generalization of Linear Regression Problems. 5.1 Introduction. 5.2 The Generalized Total Least Squares (GeTLS EXIN) Approach. 5.3 The GeTLS Stability Analysis. 5.4 Neural Nongeneric Unidimensional TLS. 5.5 Scheduling. 5.6 The Accelerated MCA EXIN Neuron (MCA EXIN+). 5.7 Further Considerations. 5.8 Simulations for the GeTLS EXIN Neuron.
- 6 The GeMCA EXIN Theory. 6.1 The GeMCA Approach. 6.2 Analysis of Matrix K . 6.3 Analysis of the Derivative of the Eigensystem of GeTLS EXIN. 6.4 Rank One Analysis Around the TLS Solution. 6.5 The GeMCA Spectra. 6.6 Qualitative Analysis of the Critical Points of the GeMCA EXIN Error Function. 6.7 Conclusion. References. Index.
- (source: Nielsen Book Data)
- Cirrincione, Giansalvo, 1959-
- Hoboken, N.J. : Wiley, ©2010.
- Description
- Book — 1 online resource (xviii, 243 pages, [12] pages of plates) : illustrations (some color).
- Summary
-
- Foreword. Preface.
- 1 The Total Least Squares Problems. 1.1 Introduction. 1.2 Some TLS Applications. 1.3 Preliminaries. 1.4 Ordinary Least Squares Problems. 1.5 Basic TLS Problem. 1.6 Multidimensional TLS Problem. 1.7 Nongeneric Unidimensional TLS Problem. 1.8 Mixed OLS-TLS Problem. 1.9 Algebraic Comparisons Between TLS and OLS. 1.10 Statistical Properties and Validity. 1.11 Basic Data Least Squares Problem. 1.12 The Partial TLS Algorithm. 1.13 Iterative Computation Methods. 1.14 Rayleigh Quotient Minimization Non Neural and Neural Methods.
- 2 The MCA EXIN Neuron. 2.1 The Rayleigh Quotient. 2.2 The Minor Component Analysis. 2.3 The MCA EXIN Linear Neuron. 2.4 The Rayleigh Quotient Gradient Flows. 2.5 The MCA EXIN ODE Stability Analysis. 2.6 Dynamics of the MCA Neurons. 2.7 Fluctuations (Dynamic Stability) and Learning Rate. 2.8 Numerical Considerations. 2.9 TLS Hyperplane Fitting. 2.10 Simulations for the MCA EXIN Neuron. 2.11 Conclusions.
- 3 Variants of the MCA EXIN Neuron. 3.1 High-Order MCA Neurons. 3.2 The Robust MCA EXIN Nonlinear Neuron (NMCA EXIN). 3.3 Extensions of the Neural MCA.
- 4 Introduction to the TLS EXIN Neuron. 4.1 From MCA EXIN to TLS EXIN. 4.2 Deterministic Proof and Batch Mode. 4.3 Acceleration Techniques. 4.4 Comparison with TLS GAO. 4.5 A TLS Application: Adaptive IIR Filtering. 4.6 Numerical Considerations. 4.7 The TLS Cost Landscape: Geometric Approach. 4.8 First Considerations on the TLS Stability Analysis.
- 5 Generalization of Linear Regression Problems. 5.1 Introduction. 5.2 The Generalized Total Least Squares (GeTLS EXIN) Approach. 5.3 The GeTLS Stability Analysis. 5.4 Neural Nongeneric Unidimensional TLS. 5.5 Scheduling. 5.6 The Accelerated MCA EXIN Neuron (MCA EXIN+). 5.7 Further Considerations. 5.8 Simulations for the GeTLS EXIN Neuron.
- 6 The GeMCA EXIN Theory. 6.1 The GeMCA Approach. 6.2 Analysis of Matrix K . 6.3 Analysis of the Derivative of the Eigensystem of GeTLS EXIN. 6.4 Rank One Analysis Around the TLS Solution. 6.5 The GeMCA Spectra. 6.6 Qualitative Analysis of the Critical Points of the GeMCA EXIN Error Function. 6.7 Conclusion. References. Index.
- (source: Nielsen Book Data)
- Tang, Huajin.
- Berlin : Springer, ©2007.
- Description
- Book — 1 online resource (xiii, 299 pages) : illustrations Digital: text file.PDF.
- Summary
-
- Feedforward Neural Networks and Training Methods.- New Dynamical Optimal Learning for Linear Multilayer FNN.- Fundamentals of Dynamic Systems.- Various Computational Models and Applications.- Convergence Analysis of Discrete Time RNNs for Linear Variational Inequality Problem.- Parameter Settings of Hopfield Networks Applied to Traveling Salesman Problems.- Competitive Model for Combinatorial Optimization Problems.- Competitive Neural Networks for Image Segmentation.- Columnar Competitive Model for Solving Multi-Traveling Salesman Problem.- Improving Local Minima of Columnar Competitive Model for TSPs.- A New Algorithm for Finding the Shortest Paths Using PCNN.- Qualitative Analysis for Neural Networks with LT Transfer Functions.- Analysis of Cyclic Dynamics for Networks of Linear Threshold Neurons.- LT Network Dynamics and Analog Associative Memory.- Output Convergence Analysis for Delayed RNN with Time Varying Inputs.- Background Neural Networks with Uniform Firing Rate and Background Input.
- (source: Nielsen Book Data)
- Singapore : World Scientific Pub. Co., 2013.
- Description
- Book — 1 online resource (xii, 549 pages) : illustrations
- Summary
-
- Cellular Nonlinear Networks, Nonlinear Circuits and Cellular Automata: Genealogy of Chua's Circuit (Peter Kennedy)
- Impasse Points, Mutators, and Other Chua Creations (Hyongsuk Kim)
- Chua's Lagrangian Circuit Elements (Orla Feely)
- From CNN Dynamics to Cellular Wave Computers (Tamas Roska)
- Contributions of CNN to Bio-Robotics and Brain Science (P Arena and L Patane)
- From Radio-amateurs' Electronics to Toroidal Chaos (Otto E Rossler and Christophe Letellier)
- Analyzing the Dynamics of Excitatory Neural Networks by Synaptic Cellular Automata (V Nekorkin, A Dmitrichev, D Kasatkin and V Afraimovich)
- Dynamical Systems Perspective of Wolfram's Cellular Automata (M Courbage and B Kaminki)
- The Genesis of Chua's Circuit: Connecting Science, Art and Creativity (F Bertacchini, E Bilotta, G Laria and P Pantano)
- Nonlinear Electronics Laboratory (NOEL): A Reminiscence (Chai Wah Wu)
- Bursting in Cellular Automata and Cardiac Arrhythmias (Gil Bub, Alvin Shrier and Leon Glass)
- Local Activity Principle: The Cause of Complexity and Symmetry Breaking (Klaus Mainzer)
- Explorations in the Forest of Bifurcation Trees: Route from Chua's circuit to Chua's Memristive Oscillator (Lukasz Czerwinski and Maciej J Ogorzalek)
- Chua's Nonlinear Dynamics Perspective Cellular Automata (Giovanni E Pazienza)
- Application of CNN to Brainlike Computing (Bertram E Shi)
- Ideal Turbulence Phenomenon and Transmission Line with Chua's Diode (E Yu Romanenko and A N Sharkovsky)
- Dynamical Systems and Chaos: Connectivity of Julia Sets for Singularly Perturbed Rational Maps (Robert L Devaney and Elizabeth D Russell)
- Structural Transformations and Stability of Dynamical Networks (L A Bunimovich and B Z Webb)
- Chua's time (Arturo Buscarino, Luigi Fortuna and Mattia Frasca)
- Chaotic Neural Networks and Beyond (K Aihara, T Yamada and M Oku)
- Chaotic Neocortical Dynamics (Walter J Freeman)
- Nonlinear Dynamics of a Class of Piecewise Linear Systems (M Lakshmanan and K Murali)
- Chaotic Mathematical Circuitry (R Lozi)
- Chua's Equation was Proved to be Chaotic in Two Years, Lorenz Equation in Thirty Six Years (Bharathwaj Muthuswamy)
- Toward a Quantitative Formulation of Emergence (G Nicolis)
- Controlled Synchronization of Chaotic Oscillators with Huygens' Coupling (J Pena-Ramirez, R H B Fey and H Nijmeijer)
- Using Time-Delay Feedback for Control and Synchronization of Dynamical Systems (K Pyragas, V Pyragas and T Pyragiene)
- Models of Catastrophic Events and Suggestions to Foretell Them (Yves Pomeau and Martine Le Berre)
- Synchronization Propensity in Networks of Dynamical Systems (Stefano Fasan and Sergio Rinaldi)
- Further Progress in Partial Control of Chaotic Systems (Miguel Sanjuan)
- Phase and Complete Synchronizations in Time-Delay Systems (D V Senthilkumar, M Manju Shrii and J Kurths)
- Symbolic Dynamics and Spiral Structures due to the Saddle-Focus Bifurcations (Andrey Shilnikov, Leonid Shilnikov and Roberto Barrio)
- Dynamics of Periodically Forced Mass Point on Constrained Surface with Changing Curvature (Yoshisuke Ueda)
- Solitons for Describing 3-D Physical Reality: The Current Frontier (Paul J Werbos)
- Thermal Solitons in 1D and 2D Anharmonic Lattices - Solectrons and the Organization of Non-Linear Fluctuations in Long-Living Dynamical Structures (M G Velarde, W Ebeling, A P Chetverikov)
- Global Optimizations by Intermittent Diffusion (Shui-Nee Chow, Tzi-Sheng Yang and Hao-Min Zhou)
- Memristors: How We Found the Missing Memristor (R Stanley Williams)
- Aftermath of Finding the Memristor (R Stanley Williams)
- The Singing Arc: The Oldest Memristor? (Jean-Marc Ginoux and Bruno Rossetto)
- Two Centuries of Memristors (Themistoklis Prodromakis)
- State Equations for Active Circuits with Memristors (Martin Hasler)
- Analytical Analysis of Memristive Networks (Torsten Schmidt, Willi Neudeck, Ute Feldmann and Ronald Tetzlaff)
- Hardware Memristor Emulators (Andrew L Fitch, Herbert H C Iu and Chi K Tse)
- Leon Chua's Memristor (Guanrong Chen).
- (source: Nielsen Book Data)