Abstract
Structure and Structuring
1 Introduction
Science and information
Man as control loop
Information, complexity and typical sequences
Concepts of information
Information, its technical dimension and the meaning of a message
Information as a central concept
2 Basic considerations
2.1 Formal derivation of information
2.1.1 Unit and reference scale
2.1.2 Information and the unit element
2.2 Application of the information measure (Shannon's information)
2.2.1 Summary
2.3 The law of Weber and Fechner
2.4 Information of discrete random variables
3 Historic development of information theory
3.1 Development of information transmission
3.1.1 Samuel F. B. Morse 1837
3.1.2 Thomas Edison 1874
3.1.3 Nyquist 1924
3.1.4 Optimal number of characters of the alphabet used for the coding
3.2 Development of information functions
3.2.1 Hartley 1928
3.2.2 Dennis Gabor 1946
3.2.3 Shannon 1948
3.2.3.1 Validity of the postulates for Shannon's information
3.2.3.2 Shannon's information (another possibility of a derivation)
3.2.3.3 Properties of Shannon's information, entropy
3.2.3.4 Shannon's entropy or Shannon's information
3.2.3.5 The Kraft inequality
Kraft's inequality
Proof of Kraft's inequality
3.2.3.6 Limits of the optimal length of codewords
3.2.3.6.1 Shannon's coding theorem
3.2.3.6.2 A sequence of n symbols (elements)
3.2.3.6.3 Application of the previous results
3.2.3.7 Information and utility (coding, portfolio analysis)
4 The concept of entropy in physics
The laws of thermodynamics
4.1 Macroscopic entropy
4.1.1 Sadi Carnot 1824
4.1.2 Clausius's entropy 1850
4.1.3 Increase of entropy in a closed system
4.1.4 Prigogine's entropy
4.1.5 Entropy balance equation
4.1.6 Gibbs's free energy and the quality of the energy
4.1.7 Considerations on the macroscopic entropy
4.1.7.1 Irreversible transformations
4.1.7.2 Perpetuum mobile and transfer of heat
4.2 Statistical entropy
4.2.1 Boltzmann's entropy
4.2.2 Derivation of Boltzmann's entropy
4.2.2.1 Variation, permutation and the formula of Stirling
4.2.2.2 Special case: Two states
4.2.2.3 Example: Lottery
4.2.3 The Boltzmann factor
4.2.4 Maximum entropy in equilibrium
4.2.5 Statistical interpretation of entropy
4.2.6 Examples regarding statistical entropy
4.2.6.1 Energy and fluctuation
4.2.6.2 Quantized oscillator
4.2.7 Brillouin-Schroedinger negentropy
4.2.7.1 Brillouin: Precise definition of information
4.2.7.2 Negentropy as a generalization of Carnot's principle
Maxwell's demon
4.2.8 Information measures of Hartley and Boltzmann
4.2.8.1 Examples
4.2.9 Shannon's entropy
4.3 Dynamic entropy
4.3.1 Eddington and the arrow of time
4.3.2 Kolmogorov's entropy
4.3.3 Renyi's entropy
5 Extension of Shannon's information
5.1 Renyi's information 1960
5.1.1 Properties of Renyi's entropy
5.1.2 Limits in the interval 0 ≤ α < ∞
5.1.3 Nonnegativity for discrete events
5.1.4 Additivity and a connection to Minkowski's norm
5.1.5 The meaning of Sα(A) for α → 1
5.1.6 Graphical presentations of Renyi's information
5.2 Another generalized entropy (logical expansion)
5.3 Gain of information via conditional probabilities
5.4 Other entropy or information measures
5.4.1 Daroczy's entropy
5.4.2 Quadratic entropy
5.4.3 R-norm entropy
6 Generalized entropy measures
6.1 The corresponding measures of divergence
6.2 Weighted entropies and expectation values of entropies
7 Information functions and gaussian distributions
7.1 Renyi's information of a gaussian distributed random variable
7.1.1 Renyi's α-information
7.1.2 Renyi's G-divergence
7.2 Shannon's information
8 Shannon's information of discrete probability distributions
8.1 Continuous and discrete random variables
8.1.1 Summary
8.2 Shannon's information of a gaussian distribution
8.3 Shannon's information as the possible gain of information in an observation
8.4 Limits of the information, limitations of the resolution
8.4.1 The resolution or the precision of the measurements
8.4.2 The uncertainty relation of the Fourier transformation
8.5 Maximization of the entropy of a continuous random variable
9 Information functions for gaussian distributions part II
9.1 Kullback's information
9.1.1 G1 for gaussian distribution densities
9.2 Kullback's divergence
9.2.1 Jensen's inequality for G1
9.3 Kolmogorov's information
9.4 Transformation of the coordinate system and the effects on the information
9.4.1 Sα-information
9.4.2 G-divergence
9.4.3 S-information
9.4.3.1 Example
9.4.4 Discrimination information
9.4.5 Kolmogorov's information
9.4.6 Prerequisites for the transformations
9.5 Transformation, discrete and continuous measures of entropy
9.6 Summary of the information functions
10 Bounds of the variance
10.1 Cramer-Rao bound
10.1.1 Fisher's information for gaussian distribution densities
10.1.2 Fisher's information and Kullback's information
10.1.3 Fisher's information and the metric tensor
10.1.4 Fisher's information and the stochastic observability
10.1.4.1 Fisher's information and the Matrix-Riccati equation
10.1.5 Fisher's information and maximum likelihood estimation
10.1.6 Fisher's information and weighted least-squares estimation
10.1.7 The availability of the Cramer-Rao bound
10.1.8 Efficiency, asymptotic efficiency, consistency, bias
10.1.8.1 Unbiased estimator
10.1.8.2 Consistency
10.1.8.3 Efficiency
10.1.9 Summary
10.2 Chapman-Robbins bound
10.2.1 Cramer-Rao bound versus Chapman-Robbins bound
10.3 Bhattacharyya bound
Remark
Remark
10.3.1 Bhattacharyya bound and Cramer-Rao bound
10.3.2 Bhattacharyya's bound for gaussian distribution densities
10.4 Barankin bound
10.5 Other bounds
Fraser-Guttman bound
Kiefer bound
Extended Fraser-Guttman bound
10.6 Summary
10.7 Biased estimator
10.7.1 Biased estimator versus unbiased estimator
11 Ambiguity function
11.1 The ambiguity function and Kullback's information
11.2 Connection between ambiguity function and Fisher's information
11.3 Maximum likelihood estimation and the ambiguity function
11.3.1 Maximum likelihood estimation = minimum Kullback estimation = maximum ambiguity estimation = minimum variance estimation
11.3.2 Maximum likelihood estimation
11.3.2.1 Application: Discriminator (demodulation)
11.4 The ML estimation is asymptotically efficient
11.5 Transition to the Akaike information criterion
12 Akaike's information criterion
12.1 Akaike's information criterion and regression
12.1.1 Least-squares regression
12.1.2 Application of the results to the ambiguity function
12.2 BIC, SC or HQ
13 Channel information
13.1 Redundancy
13.1.1 Knowledge, redundancy, utility
13.2 Rate of transmission and equivocation
13.3 Hadamard's inequality and Gibbs's second theorem
13.4 Kolmogorov's information
13.5 Kullback's divergence
13.6 An example of a transmission
13.7 Communication channel and information processing
13.7.1 Semantic, syntactic and pragmatic information
13.7.2 Information, first-time occurrence, confirmation
13.8 Shannon's bound
13.9 Example of the channel capacity
14 'Deterministic' and stochastic information
14.1 Information in state space models
14.2 The observation equation
14.3 Transmission faster than light
14.4 Information about state space variables
15 Maximum entropy estimation
15.1 The difference between maximum entropy and minimum variance
15.2 The difference from bootstrap or resampling methods
15.3 A maximum entropy example
15.4 Maximum entropy: The method
15.4.1 Maximum Shannon entropy
15.4.2 Minimum Kullback-Leibler distance
15.5 Maximum entropy and minimum discrimination information
15.6 Generation of generalized entropy measures
15.6.1 Example: Gaussian distribution and Shannon's information
16 Concluding remarks
16.1 Information, entropy and self-organization
16.2 Complexity theory
16.3 Data reduction
16.4 Cryptology
16.5 Concluding considerations
16.5.1 Information, entropy and probability
16.6 Information
A.1 Inequality for Kullback's information
A.2 The log-sum inequality
A.3 Generalized entropy, divergence and distance measures
A.3.1 Entropy measures
A.3.2 Generalized measures of distance
A.3.3 Generalized measures of the directed divergence
A.3.4 Generalized measures of divergence
A.3.4.1 Information radius and the J-divergence
A.3.4.2 Generalization of the R-divergence
A.3.4.3 Generalization of the J-divergence
A.4 A short introduction to probability theory
A.4.1 Axiomatic definition of probability
A.4.1.1 Events, elementary events, sample space
A.4.1.2 Classes of subsets, fields
A.4.1.3 Axiomatic definition of probability according to Kolmogorov
Probability space
A.4.1.4 Random variables
A.4.1.5 Probability distribution
A.4.1.6 Probability space, sample space, realization space
A.4.1.7 Probability distribution and distribution density function
A.4.1.8 Probability distribution density function (PDF)
A.5 The regularity conditions
A.6 State space description