Information theory, as applied to neural networks, encompasses the parameters and concepts pertinent to memory and information storage, information-theoretic cost functions, and neurocybernetics and self-organization. Existing studies cover the entropic and cybernetic aspects of neural information only sparsely. Information-Theoretic Aspects of Neural Networks cohesively explores this burgeoning discipline, covering topics such as:

- Shannon information and information dynamics
- neural complexity as an information-processing system
- memory and information storage in the interconnected neural web
- extremum (maximum and minimum) information entropy
- neural network training
- non-conventional, statistical distance measures for neural network optimization
- symmetric and asymmetric characteristics of information-theoretic error metrics
- algorithmic-complexity-based representation of neural information-theoretic parameters
- genetic algorithms versus neural information
- dynamics of neurocybernetics viewed in the information-theoretic plane
- nonlinear, information-theoretic transfer functions of the neural cellular units
- statistical mechanics, neural networks, and information theory
- the semiotic framework of neural information processing and neural information flow
- fuzzy information and neural networks
- neural dynamics conceived through fuzzy information parameters
- neural information flow dynamics
- the informatics of neural stochastic resonance

Information-Theoretic Aspects of Neural Networks is an exceptional resource for engineers, scientists, and computer scientists working in the field of artificial neural networks, as well as for biologists applying the concepts of communication theory and protocols to the functioning of the brain. The book explores new avenues in the field and establishes a common platform for analyzing both the biological neural complex and artificial neural networks.