- Downing, Keith L.
- Cambridge, MA : The MIT Press, 2023
- Description
- Book — 1 online resource (280 pages).
- Summary
-
An insightful investigation into the mechanisms underlying the predictive functions of neural networks--and their ability to chart a new path for AI. Prediction is a cognitive advantage like few others, inherently linked to our ability to survive and thrive. Our brains are awash in signals that embody prediction. Can we extend this capability more explicitly into synthetic neural networks to improve the function of AI and enhance its place in our world? Gradient Expectations is a bold effort by Keith L. Downing to map the origins and anatomy of natural and artificial neural networks to explore how, when designed as predictive modules, their components might serve as the basis for the simulated evolution of advanced neural network systems. Downing delves into the known neural architecture of the mammalian brain to illuminate the structure of predictive networks and determine more precisely how the ability to predict might have evolved from more primitive neural circuits. He then surveys past and present computational neural models that leverage predictive mechanisms with biological plausibility, identifying elements, such as gradients, that natural and artificial networks share. Behind well-founded predictions lie gradients, Downing finds, but of a different scope than those that belong to today's deep learning. Digging into the connections between predictions and gradients, and their manifestation in the brain and neural networks, is one compelling example of how Downing enriches both our understanding of such relationships and their role in strengthening AI tools. Synthesizing critical research in neuroscience, cognitive science, and connectionism, Gradient Expectations offers unique depth and breadth of perspective on predictive neural-network models, including a grasp of predictive neural circuits that enables the integration of computational models of prediction with evolutionary algorithms.
- Aggarwal, Charu C., author.
- Second edition - Cham : Springer, 2023
- Description
- Book — 1 online resource (490 pages) : illustrations (black and white, and color)
- Summary
-
- An Introduction to Neural Networks
- The Backpropagation Algorithm
- Machine Learning with Shallow Neural Networks
- Deep Learning: Principles and Training Algorithms
- Teaching a Deep Neural Network to Generalize
- Radial Basis Function Networks
- Restricted Boltzmann Machines
- Recurrent Neural Networks
- Convolutional Neural Networks
- Graph Neural Networks
- Deep Reinforcement Learning
- Advanced Topics in Deep Learning
- Kaddoura, Sanaa, 1986-
- Cham : Springer, 2023
- Description
- Book — 1 online resource (91 p.).
- Summary
-
- Intro
- Preface
- Acknowledgments
- Contents
- Chapter 1: Overview of GAN Structure
- 1.1 Introduction
- 1.2 Generative Models
- 1.3 GANs
- Overview of GAN Structure
- The Discriminator
- The Generator
- Training the GAN
- Loss Function
- GANs Weaknesses
- References
- Chapter 2: Your First GAN
- 2.1 Preparing the Environment
- Hardware Requirements
- Software Requirements
- Importing Required Modules and Libraries
- Prepare and Preprocess the Dataset
- 2.2 Implementing the Generator
- 2.3 Implementing the Discriminator
- 2.4 Training Stage
- Model Construction
- Loss Function
- Plot Generated Data Samples
- Training GAN
- Common Challenges While Implementing GANs
- References
- Chapter 3: Real-World Applications
- 3.1 Human Faces Generation
- Data Collection and Preparation
- Model Design
- The Generator Model
- The Discriminator Model
- Training
- Evaluation and Refinement
- Deployment
- 3.2 Deep Fake
- Data Collection and Preparation
- Model Design
- Training
- 3.3 Image-to-Image Translation
- Data Collection and Preparation
- Model Design
- The Generator Model
- The Discriminator Model
- The Adversarial Network
- Training
- 3.4 Text to Image
- Module Requirements
- Dataset
- Data Preprocessing
- Model Design
- Generator Model
- Discriminator Model
- Adversarial Model
- Training Stage
- Evaluation and Refinement
- 3.5 CycleGAN
- Dataset
- Model Design
- Generator Model
- Discriminator Model
- Training Stage
- 3.6 Enhancing Image Resolution
- Dataset
- Model Design
- Generator Model
- Discriminator Model
- Training Stage
- 3.7 Semantic Image Inpainting
- Dataset
- Model Design
- Generator Model
- Discriminator Model
- Training
- 3.8 Text to Speech
- Dataset
- Data Preprocessing
- Model Design
- Generator Model
- Discriminator Model
- Training
- References
- Chapter 4: Conclusion
- Knaup, Julian, author.
- Wiesbaden, Germany : Springer Vieweg, 2022.
- Description
- Book — 1 online resource (xii, 77 pages) : illustrations.
- Summary
-
- 1 Introduction
- 2 Preliminaries
- 3 Scientific State of the Art
- 4 Approach
- 5 Evaluation
- 6 Conclusion and Outlook.
- Ben Abdallah, Abderazek, author.
- Cham : Springer, [2022]
- Description
- Book — 1 online resource (xxi, 244 pages) : illustrations (chiefly color)
- Summary
-
- 1 Introduction to Neuromorphic Computing Systems
- 2 Neuromorphic System Design Fundamentals
- 3 Learning in Neuromorphic Systems
- 4 Emerging Memory Devices for Neuromorphic Systems
- 5 Communication Networks for Neuromorphic Systems
- 6 Fault-Tolerant Neuromorphic System Design
- 7 Reconfigurable Neuromorphic Computing System
- 8 Case Study: Real Hardware-Software Design of 3D-NoC-based Neuromorphic System
- 9 Survey of Neuromorphic Systems.
- (source: Nielsen Book Data)
6. Learning in energy-efficient neuromorphic computing : algorithm and architecture co-design [2019]
- Zheng, Nan, 1989- author.
- Hoboken, New Jersey : Wiley-IEEE Press, [2019] [Piscataway, New Jersey] : IEEE Xplore, [2019]
- Description
- Book — 1 online resource (296 pages).
- Summary
-
- Preface
- Acknowledgment
- 1 Overview
- 1.1 History of Neural Networks
- 1.2 Neural Networks in Software
- 1.2.1 Artificial Neural Network
- 1.2.2 Spiking Neural Network
- 1.3 Need for Neuromorphic Hardware
- 1.4 Objectives and Outlines of the Book
- References
- 2 Fundamentals and Learning of Artificial Neural Networks
- 2.1 Operational Principles of Artificial Neural Networks
- 2.1.1 Inference
- 2.1.2 Learning
- 2.2 Neural Network Based Machine Learning
- 2.2.1 Supervised Learning
- 2.2.2 Reinforcement Learning
- 2.2.3 Unsupervised Learning
- 2.2.4 Case Study: Action-Dependent Heuristic Dynamic Programming
- 2.2.4.1 Actor-Critic Networks
- 2.2.4.2 On-Line Learning Algorithm
- 2.2.4.3 Virtual Update Technique
- 2.3 Network Topologies
- 2.3.1 Fully Connected Neural Networks
- 2.3.2 Convolutional Neural Networks
- 2.3.3 Recurrent Neural Networks
- 2.4 Dataset and Benchmarks
- 2.5 Deep Learning
- 2.5.1 Pre-Deep-Learning Era
- 2.5.2 The Rise of Deep Learning
- 2.5.3 Deep Learning Techniques
- 2.5.3.1 Performance-Improving Techniques
- 2.5.3.2 Energy-Efficiency-Improving Techniques
- 2.5.4 Deep Neural Network Examples
- References
- 3 Artificial Neural Networks in Hardware
- 3.1 Overview
- 3.2 General-Purpose Processors
- 3.3 Digital Accelerators
- 3.3.1 A Digital ASIC Approach
- 3.3.1.1 Optimization on Data Movement and Memory Access
- 3.3.1.2 Scaling Precision
- 3.3.1.3 Leveraging Sparsity
- 3.3.2 FPGA-Based Accelerators
- 3.4 Analog/Mixed-Signal Accelerators
- 3.4.1 Neural Networks in Conventional Integrated Technology
- 3.4.1.1 In/Near-Memory Computing
- 3.4.1.2 Near-Sensor Computing
- 3.4.2 Neural Network Based on Emerging Non-volatile Memory
- 3.4.2.1 Crossbar as a Massively Parallel Engine
- 3.4.2.2 Learning in a Crossbar
- 3.4.3 Optical Accelerator
- 3.5 Case Study: An Energy-Efficient Accelerator for Adaptive Dynamic Programming
- 3.5.1 Hardware Architecture
- 3.5.1.1 On-Chip Memory
- 3.5.1.2 Datapath
- 3.5.1.3 Controller
- 3.5.2 Design Examples
- References
- 4 Operational Principles and Learning in Spiking Neural Networks
- 4.1 Spiking Neural Networks
- 4.1.1 Popular Spiking Neuron Models
- 4.1.1.1 Hodgkin-Huxley Model
- 4.1.1.2 Leaky Integrate-and-Fire Model
- 4.1.1.3 Izhikevich Model
- 4.1.2 Information Encoding
- 4.1.3 Spiking Neuron versus Non-Spiking Neuron
- 4.2 Learning in Shallow SNNs
- 4.2.1 ReSuMe
- 4.2.2 Tempotron
- 4.2.3 Spike-Timing-Dependent Plasticity
- 4.2.4 Learning Through Modulating Weight-Dependent STDP in Two-Layer Neural Networks
- 4.2.4.1 Motivations
- 4.2.4.2 Estimating Gradients with Spike Timings
- 4.2.4.3 Reinforcement Learning Example
- 4.3 Learning in Deep SNNs
- 4.3.1 SpikeProp
- 4.3.2 Stack of Shallow Networks
- 4.3.3 Conversion from ANNs
- 4.3.4 Recent Advances in Backpropagation for Deep SNNs
- 4.3.5 Learning Through Modulating Weight-Dependent STDP in Multilayer Neural Networks
- 4.3.5.1 Motivations
- 4.3.5.2 Learning Through Modulating Weight-Dependent STDP
- 4.3.5.3 Simulation Results
- References
- 5 Hardware Implementations of Spiking Neural Networks
- 5.1 The Need for Specialized Hardware
- 5.1.1 Address-Event Representation
- 5.1.2 Event-Driven Computation
- 5.1.3 Inference with a Progressive Precision
- 5.1.4 Hardware Considerations for Implementing the Weight-Dependent STDP Learning Rule
- 5.1.4.1 Centralized Memory Architecture
- 5.1.4.2 Distributed Memory Architecture
- 5.2 Digital SNNs
- 5.2.1 Large-Scale SNN ASICs
- 5.2.1.1 SpiNNaker
- 5.2.1.2 TrueNorth
- 5.2.1.3 Loihi
- 5.2.2 Small/Moderate-Scale Digital SNNs
- 5.2.2.1 Bottom-Up Approach
- 5.2.2.2 Top-Down Approach
- 5.2.3 Hardware-Friendly Reinforcement Learning in SNNs
- 5.2.4 Hardware-Friendly Supervised Learning in Multilayer SNNs
- 5.2.4.1 Hardware Architecture
- 5.2.4.2 CMOS Implementation Results
- 5.3 Analog/Mixed-Signal SNNs
- 5.3.1 Basic Building Blocks
- 5.3.2 Large-Scale Analog/Mixed-Signal CMOS SNNs
- 5.3.2.1 CAVIAR
- 5.3.2.2 BrainScaleS
- 5.3.2.3 Neurogrid
- 5.3.3 Other Analog/Mixed-Signal CMOS SNN ASICs
- 5.3.4 SNNs Based on Emerging Nanotechnologies
- 5.3.4.1 Energy-Efficient Solutions
- 5.3.4.2 Synaptic Plasticity
- 5.3.5 Case Study: Memristor Crossbar Based Learning in SNNs
- 5.3.5.1 Motivations
- 5.3.5.2 Algorithm Adaptations
- 5.3.5.3 Non-idealities
- 5.3.5.4 Benchmarks
- References
- 6 Conclusions
- 6.1 Outlooks
- 6.1.1 Brain-Inspired Computing
- 6.1.2 Emerging Nanotechnologies
- 6.1.3 Reliable Computing with Neuromorphic Systems
- 6.1.4 Blending of ANNs and SNNs
- 6.2 Conclusions
- References
- A Appendix
- A.1 Hopfield Network
- A.2 Memory Self-Repair with Hopfield Network
- References
- Index.
- (source: Nielsen Book Data)
- Graupe, Daniel author.
- 4th edition. - Hackensack, NJ : World Scientific Publishing Co. Pte. Ltd., [2019]
- Description
- Book — 1 online resource.
- Summary
-
- Introduction and role of artificial neural networks
- Fundamentals of biological neural networks
- Basic principles of ANNs and their structures
- The perceptron
- The madaline
- Back propagation
- Hopfield networks
- Counter propagation
- Adaptive resonance theory
- The cognitron and neocognitron
- Statistical training
- Recurrent (time cycling) back propagation networks
- Deep learning neural networks : principles and scope
- Deep learning convolutional neural network
- LAMSTAR neural networks
- Performance of DLNN : comparative case studies.
(source: Nielsen Book Data)
- Caicedo Bravo, Eduardo Francisco, author.
- Edición digital - Cali, Colombia : Universidad del Valle, Programa Editorial, [2017]
- Description
- Book — 1 online resource; digital text file, PDF.
9. Growing adaptive machines : combining development and learning in artificial neural networks [2014]
- Heidelberg : Springer, 2014.
- Description
- Book — 1 online resource (vii, 261 pages) : illustrations (some color); digital text file, PDF.
- Summary
-
- Artificial neurogenesis: An introduction and selective review
- A Brief Introduction to Probabilistic Machine Learning and its Relation to Neuroscience
- Evolving culture versus local minima
- Learning sparse features with an auto-associator
- HyperNEAT: the first five years
- Using the GReaNs (Genetic Regulatory evolving artificial Networks) platform for signal processing, animat control, and artificial multicellular development
- Constructing complex systems via activity-driven unsupervised Hebbian self-organization
- Neuro-centric and holocentric approaches to the evolution of developmental neural networks
- Artificial evolution of plastic neural networks: A few key concepts.
- Graupe, Daniel.
- 3rd ed. - Singapore ; Hackensack, N.J. : World Scientific Pub. Co., c2013.
- Description
- Book — xviii, 364 p. : ill. (some col.)
- Summary
-
- Introduction and Role of Artificial Neural Networks
- Fundamentals of Biological Neural Networks
- Basic Principles of ANNs and Their Early Structures
- The Perceptron
- The Madaline
- Back Propagation
- Hopfield Networks
- Counter Propagation
- Large Scale Memory Storage and Retrieval (LAMSTAR) Network
- Adaptive Resonance Theory
- The Cognitron and the Neocognitron
- Statistical Training
- Recurrent (Time Cycling) Back Propagation Networks.
- (source: Nielsen Book Data)
- Slavova, Angela.
- Dordrecht ; London : Springer, 2011.
- Description
- Book — 1 online resource (x, 220 pages) : illustrations
- Summary
-
- Preface.
- 1: Basic theory about CNNs
- 1.1. Introduction to the CNN paradigm
- 1.2. Main types of CNN equations
- 1.3. Theorems and results on CNN stability
- 1.4. Examples
- 2: Dynamics of nonlinear and delay CNNs
- 2.1. Nonlinear CNNs
- 2.2. CNN with delay
- 2.3. Examples
- 3: Hysteresis and chaos in CNNs
- 3.1. CNNs with hysteresis in the feedback system
- 3.2. Nonlinear CNNs with hysteresis in the output dynamics
- 3.3. Feedback and hysteresis
- 3.4. Control of chaotic CNNs
- 4: CNN modelling in biology, physics and ecology
- 4.1. Modelling PDEs via CNNs
- 4.2. CNN model of Sine-Gordon equation
- 4.3. CNN model of FitzHugh-Nagumo equation
- 4.4. CNN model of Fisher's equation
- 4.5. CNN model of Brusselator equation
- 4.6. CNN model of Toda Lattice equation
- 4.7. Lotka-Volterra equation and its CNN model
- 5: Appendix A: Topological degree method
- 6: Appendix B: Hysteresis and its models
- 7: Appendix C: Describing function method and its application for analysis of Cellular Neural Networks. References. Index.
- (source: Nielsen Book Data)
12. Handbook of neural engineering [2007]
- Hoboken, N.J. : Wiley-Interscience, c2007.
- Description
- Book — xvi, 662 p. : ill. ; 26 cm.
- Summary
-
- PREFACE. CONTRIBUTORS. PART I: NEURAL SIGNAL AND IMAGE PROCESSING AND MODELING.
- CHAPTER 1: OPTIMAL SIGNAL PROCESSING FOR BRAIN-MACHINE INTERFACES (Justin C. Sanchez and Jose C. Principe).
- CHAPTER 2: MODULATION OF ELECTROPHYSIOLOGICAL ACTIVITY IN NEURAL NETWORKS: TOWARD A BIOARTIFICIAL LIVING SYSTEM (Laura Bonzano, Alessandro Vato, Michela Chiappalone, and Sergio Martinoia).
- CHAPTER 3: ESTIMATION OF POSTERIOR PROBABILITIES WITH NEURAL NETWORKS: APPLICATION TO MICROCALCIFICATION DETECTION IN BREAST CANCER DIAGNOSIS (Juan Ignacio Arribas, Jesus Cid-Sueiro, and Carlos Alberola-Lopez).
- CHAPTER 4: IDENTIFICATION OF CENTRAL AUDITORY PROCESSING DISORDERS BY BINAURALLY EVOKED BRAINSTEM RESPONSES (Daniel J. Strauss, Wolfgang Delb, and Peter K. Plinkert).
- CHAPTER 5: FUNCTIONAL CHARACTERIZATION OF ADAPTIVE VISUAL ENCODING (Nicholas A. Lesica and Garrett B. Stanley).
- CHAPTER 6: DECONVOLUTION OF OVERLAPPING AUDITORY BRAINSTEM RESPONSES OBTAINED AT HIGH STIMULUS RATES (O. Ozdamar, R. E. Delgado, E. Yavuz, and N. Acikgoz).
- CHAPTER 7: AUTONOMIC CARDIAC MODULATION AT SINOATRIAL AND ATRIOVENTRICULAR NODES: OBSERVATIONS AND MODELS (S. Ward, R. Shouldice, C. Heneghan, P. Nolan, and G. McDarby).
- CHAPTER 8: NEURAL NETWORKS AND TIME-FREQUENCY ANALYSIS OF SURFACE ELECTROMYOGRAPHIC SIGNALS FOR MUSCLE CEREBRAL CONTROL (Bruno Azzerboni, Maurizio Ipsale, Fabio La Foresta, and Francesco Carlo Morabito).
- CHAPTER 9: MULTIRESOLUTION FRACTAL ANALYSIS OF MEDICAL IMAGES (Khan M. Iftekharuddin and Carlos Parra).
- CHAPTER 10: METHODS FOR NEURAL-NETWORK-BASED SEGMENTATION OF MAGNETIC RESONANCE IMAGES (Lia Morra, Silvia Delsanto, and Fabrizio Lamberti).
- CHAPTER 11: HIGH-RESOLUTION EEG AND ESTIMATION OF CORTICAL ACTIVITY FOR BRAIN-COMPUTER INTERFACE APPLICATIONS (F. Cincotti, M. Mattiocco, D. Mattia, F. Babiloni, and L. Astolfi)
- CHAPTER 12: ESTIMATION OF CORTICAL SOURCES RELATED TO SHORT-TERM MEMORY IN HUMANS WITH HIGHRESOLUTION EEG RECORDINGS AND STATISTICAL PROBABILITY MAPPING (L. Astolfi, D. Mattia, F. Babiloni, and F. Cincotti).
- CHAPTER 13: EXPLORING SEMANTIC MEMORY AREAS BY FUNCTIONAL MRI (G. Rizzo, P. Vitali, G. Baselli, M. Tettamanti, P. Scifo, S. Cerutti, D. Perani, and F. Fazio). PART II: NEURO-NANOTECHNOLOGY: ARTIFICIAL IMPLANTS AND NEURAL PROTHESES.
- CHAPTER 14: RESTORATION OF MOVEMENT BY IMPLANTABLE NEURAL MOTOR PROSTHESES (Thomas Sinkjar and Dejan B. Popovic).
- CHAPTER 15: HYBRID OLFACTORY BIOSENSOR USING MULTICHANNEL ELECTROANTENNOGRAM: DESIGN AND APPLICATION (John R. Hetling, Andrew J. Myrick, Kye-Chung Park, and Thomas C. Baker).
- CHAPTER 16: RECONFIGURABLE RETINA-LIKE PREPROCESSING PLATFORM FOR CORTICAL VISUAL NEUROPROSTHESES (Samuel Romero, Francisco J. Pelayo, Christian A. Morillas, Antonio Martínez, and Eduardo Fernández).
- CHAPTER 17: BIOMIMETIC INTEGRATION OF NEURAL AND ACOUSTIC SIGNAL PROCESSING (Rolf Muller and Herbert Peremans).
- CHAPTER 18: RETINAL IMAGE AND PHOSPHENE IMAGE: AN ANALOGY (Luke E. Hallum, Spencer C. Chen, Gregg J. Suaning, and Nigel H. Lovell).
- CHAPTER 19: BRAIN-IMPLANTABLE BIOMIMETIC ELECTRONICS AS NEURAL PROSTHESES TO RESTORE LOST COGNITIVE FUNCTION (Theodore W. Berger, Ashish Ahuja, Spiros H. Courellis, Gopal Erinjippurath, Ghassan Gholmieh, John J. Granacki, Min Chi Hsaio, Jeff LaCoss, Vasilis Z. Marmarelis, Patrick Nasiatka, Vijay Srinivasan, Dong Song, Armand R. Tanguay, Jr., and Jack Wills).
- CHAPTER 20: ADVANCES IN RETINAL NEUROPROSTHETICS (Nigel H. Lovell, Luke E. Hallum, Spencer C. Chen, Socrates Dokos, Philip Byrnes-Preston, Rylie Green, Laura Poole-Warren, Torsten Lehmann, and Gregg J. Suaning).
- CHAPTER 21: TOWARDS A CULTURED NEURAL PROBE: PATTERNING OF NETWORKS AND THEIR ELECTRICAL ACTIVITY (W. L. C. Rutten, T. G. Ruardij, E. Marani, and B. H. Roelofsen).
- CHAPTER 22: SPIKE SUPERPOSITION RESOLUTION IN MULTICHANNEL EXTRACELLULAR NEURAL RECORDINGS: A NOVEL APPROACH (Karim Oweiss and David Anderson).
- CHAPTER 23: TOWARD A BUTTON-SIZED 1024-SITE WIRELESS CORTICAL MICROSTIMULATING ARRAY (Maysam Ghovanloo and Khalil Najafi).
- CHAPTER 24: PRACTICAL CONSIDERATIONS IN RETINAL NEUROPROSTHESIS DESIGN (Gregg J. Suaning, Luke E. Hallum, Spencer Chen, Philip Preston, Socrates Dokos, and Nigel H. Lovell). PART III: NEUROROBOTICS AND NEURAL REHABILATION ENGINEERING.
- CHAPTER 25: INTERFACING NEURAL AND ARTIFICIAL SYSTEMS: FROM NEUROENGINEERING TO NEUROROBOTICS (P. Dario, C. Laschi, A. Menciassi, E. Guglielmelli, M. C. Carrozza, and S. Micera).
- CHAPTER 26: NEUROCONTROLLER FOR ROBOT ARMS BASED ON BIOLOGICALLY INSPIRED VISUOMOTOR COORDINATION NEURAL MODELS (E. Guglielmelli, G. Asuni, F. Leoni, A. Starita, and P. Dario).
- CHAPTER 27: MUSCLE SYNERGIES FOR MOTOR CONTROL (Andrea d'Avella and Matthew Tresch).
- CHAPTER 28: ROBOTS WITH NEURAL BUILDING BLOCKS (Henrik Hautop Lund and Jacob Nielsen)
- CHAPTER 29: DECODING SENSORY STIMULI FROM POPULATIONS OF NEURONS: METHODS FOR LONG-TERM LONGITUDINAL STUDIES (Guglielmo Foffani, Banu Tutunculer, Steven C. Leiser, and Karen A. Moxon)
- CHAPTER 30: MODEL OF MAMMALIAN VISUAL SYSTEM WITH OPTICAL LOGIC CELLS (J. A. Martín-Pereda and A. González Marcos).
- CHAPTER 31: CNS REORGANIZATION DURING SENSORY-SUPPORTED TREADMILL TRAINING (I. Cikajlo, Z. Matjacic, and T. Bajd).
- CHAPTER 32: INDEPENDENT COMPONENT ANALYSIS OF SURFACE EMG FOR DETECTION OF SINGLE MOTONEURONS FIRING IN VOLUNTARY ISOMETRIC CONTRACTION (Gonzalo A. García, Ryuhei Okuno, and Kenzo Akazawa).
- CHAPTER 33: RECENT ADVANCES IN COMPOSITE AEP/EEG INDICES FOR ESTIMATING HYPNOTIC DEPTH DURING GENERAL ANESTHESIA? (Erik Weber Jensen, Pablo Martinez, Hector Litvan, Hugo Vereecke, Bernardo Rodriguez, and Michel M. R. F. Struys).
- CHAPTER 34: ENG RECORDING AMPLIFIER CONFIGURATIONS FOR TRIPOLAR CUFF ELECTRODES (I. F. Triantis, A. Demosthenous, M. S. Rahal, and N. Donaldson).
- CHAPTER 35: CABLE EQUATION MODEL FOR MYELINATED NERVE FIBER (P. D. Einziger, L. M. Livshitz, and J. Mizrahi).
- CHAPTER 36: BAYESIAN NETWORKS FOR MODELING CORTICAL INTEGRATION Paul Sajda, Kyungim Baek and Leif Finkel).
- CHAPTER 37: NORMAL AND ABNORMAL AUDITORY INFORMATION PROCESSING REVEALED BY NONSTATIONARY SIGNAL ANALYSIS OF EEG (Ben H. Jansen, Anant Hegde, Jacob Ruben, and Nashaat N. Boutros).
- CHAPTER 38: PROBING OSCILLATORY VISUAL DYNAMICS AT THE PERCEPTUAL LEVEL (H. Fotowat, H. Ogmen, H. E. Bedell, and B. G. Breitmeyer).
- CHAPTER 39: NONLINEAR APPROACHES TO LEARNING AND MEMORY (Klaus Lehnertz).
- CHAPTER 40: SINGLE-TRIAL ANALYSIS OF EEG FOR ENABLING COGNITIVE USER INTERFACES (Adam D. Gerson, Lucas C. Parra, and Paul Sajda). INDEX. ABOUT THE EDITOR.
- (source: Nielsen Book Data)
SAL3 (off-campus storage) | Status
---|---
Stacks: QA76.87 .H3623 2007 | Available
13. Neural networks theory [2007]
- Galushkin, A. I. (Aleksandr Ivanovich)
- Berlin ; New York : Springer, 2007.
- Description
- Book — xx, 396 p. : ill. ; 24 cm.
- Summary
-
- Section 1. Neural Network Structure
- Transfer from logical basis of Boolean elements "And, Or, Not" to the threshold logical basis
- Qualitative characteristics of neural networks architectures
- Optimization of cross connection multi-layer neural networks structure
- Continual neural networks
- Section 2. Optimal Models of Neural Networks
- Investigation of neural network input signals characteristics
- Design of neural network optimal models
- Analysis of the open-loop neural networks
- Development of multivariable functions extremum search algorithms
- Section 3. Adaptive Neural Networks
- Neural network adjustment algorithms
- Adjustment of continuum neural networks
- Selection of initial conditions during neural network adjustment. Typical neural network input signals
- Analysis of closed-loop multi-layer neural networks
- Synthesis of multi-layer neural networks with flexible structure
- Informative features selection in multi-layer neural networks
- Section 4. Neural Networks Reliability and Diagnostics
- Neural networks reliability
- Neural networks diagnostics
- Methods of problem solutions in the neural network logical basis.
- (source: Nielsen Book Data)
SAL3 (off-campus storage) | Status
---|---
Stacks: QA76.87 .G39 2007 | Available
14. Neural networks theory [electronic resource] [2007]
- Galushkin, A. I. (Aleksandr Ivanovich)
- Berlin ; New York : Springer, 2007.
- Description
- Book — xx, 396 p. : ill.
15. Principles of artificial neural networks [2007]
- Graupe, Daniel.
- 2nd ed. - New Jersey : World Scientific, c2007.
- Description
- Book — xv, 303 p. : ill. ; 26 cm.
- Summary
-
- Introduction and Role of Artificial Neural Networks
- Fundamentals of Biological Neural Networks
- Basic Principles of ANNs and Their Early Structures
- The Perceptron
- The Madaline
- Back Propagation
- Hopfield Networks
- Counter Propagation
- Adaptive Resonance Theory
- The Cognitron and the Neocognitron
- Statistical Training
- Recurrent (Time Cycling) Back Propagation Networks
- Large Scale Memory Storage and Retrieval (LAMSTAR) Network.
- (source: Nielsen Book Data)
Engineering Library (Terman) | Status
---|---
Stacks: QA76.87 .G77 2007 | Unknown
16. Neural networks in a softcomputing framework [2006]
- Du, K.-L.
- London : Springer, c2006.
- Description
- Book — l, 566 p. : ill. ; 25 cm.
- Summary
-
- Introduction
- Fundamentals of Machine Learning and Softcomputing
- Multilayer Perceptrons
- Hopfield Networks and Boltzmann Machines
- Competitive Learning and Clustering
- Radial Basis Function Networks
- Principal Component Analysis Networks
- Fuzzy Logic and Neuro-fuzzy Systems
- Evolutionary Algorithms and Evolving Neural Networks
- Discussion and Outlook
- Appendix: Mathematical Preliminaries.
- (source: Nielsen Book Data)
SAL3 (off-campus storage) | Status
---|---
Stacks: QA76.87 .D8 2006 | Available
17. Artificial neural networks : an introduction [2005]
- Priddy, Kevin L.
- Bellingham, Wash. : SPIE Press, c2005.
- Description
- Book — ix, 165 p. : ill. (some col.) ; 26 cm.
- Summary
-
This tutorial text provides the reader with an understanding of artificial neural networks (ANNs), and their application, beginning with the biological systems which inspired them, through the learning methods that have been developed, and the data collection processes, to the many ways ANNs are being used today. The material is presented with a minimum of math (although the mathematical details are included in the appendices for interested readers), and with a maximum of hands-on experience. All specialized terms are included in a glossary. The result is a highly readable text that will teach the engineer the guiding principles necessary to use and apply artificial neural networks.
(source: Nielsen Book Data)
Engineering Library (Terman) | Status
---|---
Stacks: QA76.87 .P736 2005 | Unknown
- Dordrecht : Springer, c2005.
- Description
- Book — viii, 406 p. : ill.
- Summary
-
- Pre-WIRN workshop on Computational Intelligence Methods for Bioinformatics and Biostatistics (CIBB): 1 ProGenGrid: A Grid Framework for Bioinformatics, G. Aloisio et al.
- 2 A preliminary investigation on connecting genotype to oral cancer development through XCS, F. Baronti et al.
- 3 Mass Spectrometry Data Analysis for Early Detection of Inherited Breast Cancer, F. Baudi, M. Cannataro et al
- 4 Feature Selection combined with random subspace ensemble for gene expression based diagnosis of malignancies, A. Bertoni et al.
- 5 Pruning the Nodule Candidate Set in Postero Anterior Chest Radiographs, P. Campadelli, E. Casiraghi
- 6 Protein Structure Assembly from Knowledge of beta-sheet Motifs and Secondary Structure, A. Ceroni et al.
- 7 Analysis of Oligonucleotide Microarray Images using a fuzzy sets Approach in HLA Typing, G.B. Ferrara et al.
- 8 Combinatorial and Machine Learning Approaches in Clustering Microarray Data, S. Pozzi, I. Zoppis
- 9 Gene expression data modelling and validation of gene selection methods, F. Ruffino
- 10 Mining Yeast Gene Microarray Data with Latent Variable Models, A. Staiano et al.
- 11 Recent Applications of Neural Networks in Bioinformatics, M.J. Wood, J.D. Hirst
- 12 An Algorithm for Reducing the Number of Support Vectors, D. Anguita et al.
- Pre-WIRN workshop on Computational Intelligence on Hardware: Algorithms, Implementations and Applications (CIHAIA): 13 Genetic Design of linear block error-correcting codes, A. Barbieri et al.
- 14 Neural hardware based on kernel methods for industrial and scientific applications, A. Boni et al.
- 15 Statistical Learning for Parton Identification, D. Cauz et al.
- 16 Time-Varying Signals Classification Using a Liquid State Machine, A. Chella, R. Rizzo
- 17 FPGA Based Statistical Data Mining Processor, E. Pasero et al.
- 18 Neural Classification of HEP Experimental Data, S. Vitabile et al.
- WIRN Regular Sessions- Architectures and Algorithms
- Models
- Applications.
- (source: Nielsen Book Data)
- Fyfe, Colin.
- New York : Springer, 2005.
- Description
- Book — xviii, 383 p. : ill. ; 24 cm.
- Summary
-
- Introduction * Part I: Single Stream Networks * Background
- The Negative Feedback Network
- Peer-Inhibitory Neurons
- Multiple Cause Data
- Exploratory Data Analysis
- Topology Preserving Maps
- Maximum Likelihood Hebbian Learning
- Part II: Dual Stream Networks * Two Neural Networks for Canonical Correlation Analysis
- Alternative Derivations of CCA Networks
- Kernel and Nonlinear Correlations
- Exploratory Correlation Analysis
- Multicollinearity and Partial Least Squares
- Twinned Principal curves
- The Future
- Appendices: A. Negative Feedback Artificial Neural Networks * B. Previous Factor Analysis Models * C. Related Models for ICA * D. Previous Dual Stream Approaches * E. Data Sets * References * Index.
- (source: Nielsen Book Data)
SAL3 (off-campus storage) | Status
---|---
Stacks: QA76.87 .F898 2005 | Available
- Fyfe, Colin.
- New York : Springer, 2005.
- Description
- Book — xviii, 383 p. : ill.