
On the symmetrized \(s\)-divergence. (English) Zbl 1443.60029

Summary: In this study, we work with the relative divergence of type \(s\), \(s\in{\mathbb{R}}\), which includes the Kullback-Leibler divergence and the Hellinger and \(\chi^2\) distances as particular cases. We study the symmetrized divergences in additive and multiplicative forms. Some basic properties, such as symmetry, monotonicity and log-convexity, are established. An important result from convexity theory is also proved.
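For context, the relative divergence of type \(s\) is commonly taken to be the Csiszár-type functional (the normalization below follows the usual convention in the literature; the author's exact convention may differ slightly)
\[
\mathcal{K}_s(p\|q) = \frac{1}{s(s-1)}\left(\sum_i p_i^{\,s} q_i^{\,1-s} - 1\right), \qquad s \neq 0, 1,
\]
extended by continuity to \(\mathcal{K}_1(p\|q)=\sum_i p_i\log(p_i/q_i)\) (the Kullback-Leibler divergence) and \(\mathcal{K}_0(p\|q)=\mathcal{K}_1(q\|p)\); the cases \(s=1/2\) and \(s=2\) give \(4\bigl(1-\sum_i\sqrt{p_i q_i}\bigr)\) (four times the squared Hellinger distance) and \(\tfrac{1}{2}\chi^2(p\|q)\), respectively. Under this convention, the additive and multiplicative symmetrizations mentioned in the summary presumably take the forms \(\mathcal{K}_s(p\|q)+\mathcal{K}_s(q\|p)\) and \(\mathcal{K}_s(p\|q)\,\mathcal{K}_s(q\|p)\).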

MSC:

60E15 Inequalities; stochastic orderings
