Extension of de Bruijn's identity to dependent non-Gaussian noise channels

Published online by Cambridge University Press: 21 June 2016

Nayereh Bagheri Khoolenjani
Affiliation: University of Isfahan

Mohammad Hossein Alamatsaz
Affiliation: University of Isfahan

Postal address: Department of Statistics, University of Isfahan, Isfahan, 81746-73441, Iran.

Abstract

De Bruijn's identity relates two important concepts in information theory: Fisher information and differential entropy. Unlike the common practice in the literature, in this paper we consider general additive non-Gaussian noise channels in which, more realistically, the input signal and the additive noise are not independently distributed. It is shown that, for a general dependent signal and noise, the first derivative of the differential entropy is directly related to the conditional mean estimate of the input. Special versions of the result are then derived, via the Gaussian and Farlie–Gumbel–Morgenstern copulas, for the case of additive normally distributed noise. The known result for independent Gaussian noise channels is recovered as a special case. Illustrative examples are also provided.
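For background, recall the classical form of de Bruijn's identity for the independent Gaussian noise channel $Y_t = X + \sqrt{t}\,Z$, where $Z \sim N(0,1)$ is independent of the input $X$ (a standard statement going back to Stam (1959)):

\[
\frac{\partial}{\partial t}\, h(Y_t) \;=\; \tfrac{1}{2}\, J(Y_t),
\]

where $h(\cdot)$ denotes differential entropy and $J(\cdot)$ Fisher information. The paper drops the independence assumption and models the signal–noise dependence through copulas; for instance, the Farlie–Gumbel–Morgenstern family, in its standard bivariate form, is

\[
C_\theta(u,v) \;=\; uv\,\bigl[1 + \theta(1-u)(1-v)\bigr], \qquad \theta \in [-1,1],
\]

which, by Sklar's theorem, ties together the marginal distribution functions of the signal and the noise.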

Type: Research Papers
Copyright: © Applied Probability Trust 2016
