
Title

Comparison of Some Entropy Measures for Non-Central Fisher Probability Distribution

Mazeat Koreny 1,*, Mohamed Bisher Zeina 2, Shaza Zubeadah 3

1  Department of Mathematical Statistics, Faculty of Science, University of Aleppo, Aleppo, Syria
    (mazeat.koreny@gmail.com)

2  Department of Mathematical Statistics, Faculty of Science, University of Aleppo, Aleppo, Syria
    (bisher.zeina@gmail.com)

3  Department of Basic Science, Faculty of Informatics Engineering, University of Aleppo, Aleppo, Syria
    (szubeadah@gmail.com)


DOI: https://doi.org/10.54216/GJMSA.050102

Received: December 27, 2022; Accepted: March 25, 2023

Abstract:

In this paper, several entropy measures of the noncentral Fisher distribution are derived, including the Shannon, Rényi, Sharma–Mittal, Havrda–Charvát, Arimoto and Tsallis entropies. These entropies are compared with respect to the distribution's shift parameter, degrees of freedom, shape parameter and truncation parameter, with the entropy showing the smaller relative loss judged the better one. Significant differences were found for all of the studied parameters except the shift parameter, and among the mentioned entropies the Rényi entropy proved to be the best for the noncentral Fisher distribution.
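The comparison described above can be illustrated with a minimal Python sketch (not the authors' code). It numerically estimates the Shannon and Rényi entropies of a noncentral F distribution by quadrature over its density, and computes the relative loss of Shannon entropy under right-truncation. The parameter values (`dfn=5`, `dfd=10`, `nc=2`, truncation point `t=5`, Rényi order `alpha=2`) are arbitrary examples, and the relative-loss definition `(H - H_t) / H` is an assumption following the usual criterion in entropy-comparison studies.

```python
# Illustrative sketch: entropies of a noncentral F distribution via quadrature.
import numpy as np
from scipy import stats, integrate

def shannon_entropy(pdf, a, b):
    """H = -integral of f(x) ln f(x) over [a, b]."""
    integrand = lambda x: -pdf(x) * np.log(pdf(x)) if pdf(x) > 0 else 0.0
    val, _ = integrate.quad(integrand, a, b, limit=200)
    return val

def renyi_entropy(pdf, alpha, a, b):
    """H_alpha = log(integral of f(x)^alpha) / (1 - alpha), alpha != 1."""
    val, _ = integrate.quad(lambda x: pdf(x) ** alpha, a, b, limit=200)
    return np.log(val) / (1.0 - alpha)

dist = stats.ncf(dfn=5, dfd=10, nc=2.0)   # noncentral F, example parameters
upper = 200.0                             # effective upper support bound

H = shannon_entropy(dist.pdf, 0.0, upper)
H2 = renyi_entropy(dist.pdf, 2.0, 0.0, upper)

# Relative loss of Shannon entropy when the distribution is truncated to
# [0, t]: the truncated pdf is the original pdf renormalised by CDF(t).
t = 5.0
Z = dist.cdf(t)
H_t = shannon_entropy(lambda x: dist.pdf(x) / Z, 0.0, t)
rel_loss = (H - H_t) / H
print(f"Shannon H = {H:.4f}, Renyi H_2 = {H2:.4f}, relative loss = {rel_loss:.4f}")
```

Since the Rényi entropy is nonincreasing in its order, the order-2 value is bounded above by the Shannon entropy; the same quadrature pattern extends to the other measures by changing the integrand.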

Keywords:

Goodness of Fit; Entropy; Noncentral Distributions; Relative Loss; Truncated Distribution; Shift Parameter.

References:

[1]

"The Second Law of Thermodynamics," in Generalized Thermodynamics, Fundamental Theories of Physics, vol. 124, Springer, 2004.

[2]

E. A. Rietman, J. Platig, J. A. Tuszynski and G. Lakka Klement, "Thermodynamic measures of cancer: Gibbs free energy and entropy of protein–protein interactions," Journal of Biological Physics, vol. 42, no. 3, p. 339, 2016.

[3]

K. Held and N. J. Mauser, "Physics behind the minimum of relative entropy measures for correlations," The European Physical Journal B, vol. 86, no. 7, p. 328, 2013.

[4]

T. Durt, "Competing Definitions of Information Versus Entropy in Physics," Foundations of Science, vol. 16, no. 4, p. 315, 2011.

[5]

G. Lindblad, "Entropy, information and quantum measurements," Communications in Mathematical Physics, vol. 33, no. 4, p. 305, 1973.

[6]

J. Aczél and Z. Daróczy, On Measures of Information and Their Characterizations, New York: Academic Press, 1975.

[7]

A. M. Awad and A. J. Alawneh, "Application of Entropy to a Life-Time Model," IMA Journal of Mathematical Control & Information, vol. 4, pp. 143-147, 1987.

[8]

D. Ellerman, "Logical information theory: new logical foundations for information theory," Logic Journal of the IGPL, vol. 25, no. 5, p. 806, 2017.

[9]

E. T. Jaynes, "Information Theory and Statistical Mechanics," Physical Review, vol. 106, no. 4, pp. 620-630, 1957.

[10]

D. W. Robinson, "Entropy and Uncertainty," Entropy, vol. 10, no. 4, pp. 493-506, 2008.

[11]

Karmeshu and N. R. Pal, "Uncertainty, Entropy and Maximum Entropy Principle — An Overview," in Entropy Measures, Maximum Entropy Principle and Emerging Applications, Berlin, Heidelberg, 2003, pp. 1-53.

[12]

A. Mandilara, E. Karpov and N. J. Cerf, "Uncertainty, entropy and non-Gaussianity for mixed states," Proc. SPIE, vol. 7727, 2010.

[13]

C. E. Shannon, "A mathematical theory of communication," The Bell System Technical Journal, vol. 27, pp. 379-423, 1948.

[14]

A. Rényi, "On measures of entropy and information," in Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, pp. 547-561, 1961.

[15]

J. Havrda and F. Charvát, "Quantification method of classification processes: concept of structural a-entropy," Kybernetika, vol. 3, no. 1, pp. 30-35, 1967.

[16]

S. Arimoto, "Information-theoretical considerations on estimation problems," Information and Control, vol. 19, no. 3, pp. 181-194, 1971.

[17]

B. D. Sharma and D. P. Mittal, "New non-additive measures of relative information," Journal of Combinatorics, Information and System Sciences, vol. 2, pp. 122-133, 1977.

[18]

C. Tsallis, "Possible generalization of Boltzmann-Gibbs statistics," Journal of Statistical Physics, vol. 52, pp. 479-487, 1988.

[19]

S. Dey, S. Maiti and M. Ahmad, "Comparison of different entropy measures," Pakistan Journal of Statistics, vol. 32, no. 2, pp. 97-108, 2016.

[20]

A. A. Al-Babtain, I. Elbatal, C. Chesneau and M. Elgarhy, "Estimation of different types of entropies for the Kumaraswamy distribution," PLOS ONE, vol. 16, no. 3, 2021.

[21]

M. Ijaz, S. Naji Al-Aziz et al., "Comparison of Different Entropy Measures Using the …," Information Sciences Letters, vol. 10, no. 3, pp. 553-559, 2021.

[22]

A. M. Awad and A. J. Alawneh, "Application of Entropy to a Life-Time Model," IMA Journal of Mathematical Control & Information, vol. 4, pp. 143-147, 1987.

[23]

C. Walck, Hand-book on Statistical Distributions for Experimentalists, University of Stockholm, 2007.

[24]

K. Krishnamoorthy, Handbook of Statistical Distributions with Applications, Boca Raton: CRC Press, 2016. ISBN 978-1-4987-4150-7.

[25]

M. K. and G. S. B. , "A Brief Review on Different Measures of Entropy," International Journal on Emerging Technologies, vol. 10, no. 2b, pp. 31-38, 2019.

[26]

A. B. and Z. I., "Recent Advances in Entropy: A New Class of …," Journal of Multidisciplinary Engineering Science and Technology, vol. 7, no. 11, 2020.


Cite this Article as:
MLA: Mazeat Koreny, Mohamed Bisher Zeina, Shaza Zubeadah. "Comparison of Some Entropy Measures for Non-Central Fisher Probability Distribution." Galoitica: Journal of Mathematical Structures and Applications, Vol. 5, No. 1, 2023, pp. 17-35 (DOI: https://doi.org/10.54216/GJMSA.050102)
APA: Mazeat Koreny, Mohamed Bisher Zeina, Shaza Zubeadah. (2023). Comparison of Some Entropy Measures for Non-Central Fisher Probability Distribution. Galoitica: Journal of Mathematical Structures and Applications, 5(1), 17-35 (DOI: https://doi.org/10.54216/GJMSA.050102)
Chicago: Mazeat Koreny, Mohamed Bisher Zeina, Shaza Zubeadah. "Comparison of Some Entropy Measures for Non-Central Fisher Probability Distribution." Galoitica: Journal of Mathematical Structures and Applications 5, no. 1 (2023): 17-35 (DOI: https://doi.org/10.54216/GJMSA.050102)
Harvard: Mazeat Koreny, Mohamed Bisher Zeina, Shaza Zubeadah. (2023). Comparison of Some Entropy Measures for Non-Central Fisher Probability Distribution. Galoitica: Journal of Mathematical Structures and Applications, 5(1), 17-35 (DOI: https://doi.org/10.54216/GJMSA.050102)
Vancouver: Mazeat Koreny, Mohamed Bisher Zeina, Shaza Zubeadah. Comparison of Some Entropy Measures for Non-Central Fisher Probability Distribution. Galoitica: Journal of Mathematical Structures and Applications. 2023; 5(1): 17-35 (DOI: https://doi.org/10.54216/GJMSA.050102)
IEEE: Mazeat Koreny, Mohamed Bisher Zeina, Shaza Zubeadah, "Comparison of Some Entropy Measures for Non-Central Fisher Probability Distribution," Galoitica: Journal of Mathematical Structures and Applications, vol. 5, no. 1, pp. 17-35, 2023 (DOI: https://doi.org/10.54216/GJMSA.050102)