Testing Exponentiality Based on the Lin-Wong Divergence on the Residual Lifetime Data

Authors

1 Department of Statistics, School of Mathematical Sciences, Ferdowsi University of Mashhad, Mashhad, Iran.

2 Department of Statistics, School of Mathematical Sciences and Statistics, University of Birjand, Birjand, Iran.

DOI: 10.29252/jirss.18.2.39

Abstract

Testing exponentiality has long been an important problem in statistical inference. The present article is based on a modified measure of distance between two distributions. The proposed measure is similar to the Kullback-Leibler divergence and is related to the Lin-Wong (LW) divergence applied to residual lifetime data. The modified measure developed here yields a consistent test statistic for the hypothesis of exponentiality against several alternatives. First, we construct a test statistic for the LW divergence by estimating the density function with methods similar to Vasicek's and Correa's techniques. The critical values of the test are then computed by Monte Carlo simulation. We also compare the power of the proposed test with that of competing exponentiality tests, and show that the proposed test performs better than the others when the alternative has an increasing hazard rate. Finally, the use of the proposed test is illustrated with two examples.
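The abstract outlines a three-step recipe: estimate the data density from spacings of order statistics in the spirit of Vasicek's estimator, plug the estimate into the Lin-Wong divergence D_LW(f, g) = E_f[log(2f(X)/(f(X)+g(X)))] against an exponential density whose mean is replaced by the sample mean, and obtain critical values by Monte Carlo simulation under the null. The Python sketch below is only a minimal illustration of that general recipe, not the article's exact statistic; the window choice m ≈ √n, the boundary handling, and the function names are assumptions introduced here.

import numpy as np

def lw_exponentiality_statistic(x, m=None):
    """Plug-in estimate of the Lin-Wong divergence between the unknown data
    density f and an exponential density g with the mean replaced by the
    sample mean; f is estimated from Vasicek-type spacings of order
    statistics. Illustrative sketch only, not the article's exact statistic."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(round(np.sqrt(n))))        # heuristic window size (assumption)
    idx = np.arange(n)
    lo = np.clip(idx - m, 0, n - 1)               # Vasicek-style boundary handling:
    hi = np.clip(idx + m, 0, n - 1)               # reuse x_(1) and x_(n) at the edges
    spacings = np.maximum(x[hi] - x[lo], 1e-12)   # guard against ties
    f_hat = 2.0 * m / (n * spacings)              # f(x_(i)) ~ 2m / (n (x_(i+m) - x_(i-m)))
    g_hat = np.exp(-x / x.mean()) / x.mean()      # exponential null density, rate = 1/mean
    return np.mean(np.log(2.0 * f_hat / (f_hat + g_hat)))

def monte_carlo_critical_value(n, alpha=0.05, reps=20000, m=None, seed=0):
    """Upper critical value of the statistic under the exponential null,
    estimated by simulation; the statistic is scale-invariant, so the
    simulated rate is irrelevant."""
    rng = np.random.default_rng(seed)
    stats = np.array([lw_exponentiality_statistic(rng.exponential(size=n), m)
                      for _ in range(reps)])
    return np.quantile(stats, 1.0 - alpha)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    sample = rng.weibull(2.0, size=30)            # increasing hazard-rate alternative
    t = lw_exponentiality_statistic(sample)
    c = monte_carlo_critical_value(n=30, alpha=0.05)
    print(f"statistic = {t:.4f}, 5% critical value = {c:.4f}, reject = {t > c}")

Because the Lin-Wong divergence is nonnegative and vanishes only when the two densities coincide, the sketch rejects exponentiality for large values of the statistic; scale invariance of the statistic makes the simulated critical values free of the unknown exponential rate.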

References

Abbasnejad, M., Arghami, N.R., Tavakoli, M. (2012). A goodness of fit test for exponentiality based on Lin-Wong information. JIRSS. 11(2):191-202.
Alizadeh, N.H. (2010). A new estimator of entropy and its application in testing normality. Journal of Statistical Computation and Simulation. 80(4):1151-1162.
Alizadeh, N.H., Arghami, N.R. (2011b). Testing exponentiality based on characterizations of the exponential distribution. Journal of Statistical Computation and Simulation. 81(11):1641-1651.
Anderson, T.W., Darling, D.A. (1954). A test of goodness of fit. Journal of the American Statistical Association. 49:765-769.
Asadi, M., Ebrahimi, N. (2000). Residual entropy and its characterizations in terms of hazard function and mean residual life function. Statistics and Probability Letters. 49(3):263-269.
Baratpour, S., Habibirad, A. (2012). Testing goodness-of-fit for exponential distribution based on cumulative residual entropy. Communications in Statistics - Theory and Methods. 41(8):1387-1396.
Bowman, A.W. (1992). Density based tests for goodness-of-fit. Journal of Statistical Computation and Simulation. 40(1-2):1-13.
Choi, B., Kim, K., Song, S.H. (2004). Goodness-of-fit test for exponentiality based on Kullback-Leibler information. Communications in Statistics - Simulation and Computation. 33(2):525-536.
Conover, W.J. (1999). Practical Nonparametric Statistics. John Wiley & Sons, New York.
Cramér, H. (1928). On the composition of elementary errors. Skandinavisk Aktuarietidskrift. 11:13-74, 141-180.
Di Crescenzo, A., Longobardi, M. (2004). A measure of discrimination between past lifetime distributions. Statistics and Probability Letters. 67(2):173-182.
Ebrahimi, N. (1996). How to measure uncertainty about residual lifetime. Sankhya, Series A. 58(1):48-57.
Ebrahimi, N. (1998). Testing exponentiality of the residual life, based on dynamic Kullback-Leibler information. IEEE Transactions on Reliability. 47(2):197-201.
Ebrahimi, N., Kirmani, S.N.U.A. (1996a). A measure of discrimination between two residual life-time distributions and its applications. Annals of the Institute of Statistical Mathematics. 48(2):257-265.
Ebrahimi, N., Pellerey, F. (1995). New partial ordering of survival functions based on the notion of uncertainty. Journal of Applied Probability. 32(1):202-211.
Gurevich, G., Davidson, A. (2008). Standardized forms of Kullback-Leibler information based statistics for normality and exponentiality. Computer Modelling and New Technologies. 12(1):14-25.
Harris, C.M. (1976). A note on testing for exponentiality. Naval Research Logistics Quarterly. 28(3):169-176.
Henze, N., Meintanis, S.G. (2002b). Goodness-of-fit tests based on a new characterization of the exponential distribution. Communications in Statistics - Theory and Methods. 31(9):1479-1497.
Jager, L., Wellner, J.A. (2007). Goodness-of-fit tests via phi-divergences. The Annals of Statistics. 35(5):2018-2053.
Khalili, M., Habibirad, A., Yousefzadeh, F. (2017). Some properties of Lin-Wong divergence on the past lifetime data. Communications in Statistics - Theory and Methods. 1-13.
Kolmogorov, A.N. (1933). Sulla determinazione empirica di una legge di distribuzione. Giornale dell'Istituto Italiano degli Attuari. 4:83-91.
Koziol, J.A., Byar, D.P. (1975). Percentage points of the asymptotic distributions of one and two sample K-S statistics for truncated or censored data. Technometrics. 17(4):507-510.
Kullback, S., Leibler, R.A. (1951). On information and sufficiency. The Annals of Mathematical Statistics. 22(1):79-86.
Lin, J. (1991). Divergence measures based on the Shannon entropy. IEEE Transactions on Information Theory. 37(1):145-151.
Lin, J., Wong, S.K.M. (1990). A new directed divergence measure and its characterization. International Journal of General Systems. 17(1):73-81.
Meaker, M.K. (1987). Clinical Gerontology: A Guide to Assessment and Intervention. American Journal of Occupational Therapy. 41(4):269-269.
Nair, K.R.M., Rajesh, G. (1998). Characterization of the probability distributions using the residual entropy function. Journal of the Indian Statistical Association. 36:157-166.
Navarro, J., del Aguila, Y., Asadi, M. (2010). Some new results on the cumulative residual entropy. Journal of Statistical Planning and Inference. 140(1):310-322.
Navarro, J., Franco, M., Ruiz, J.M. (1998). Characterization through moments of the residual life and conditional spacings. Sankhya, The Indian Journal of Statistics, Series A. 36-48.
Parzen, E. (1979). Nonparametric statistical data modeling. Journal of the American Statistical Association. 74(365):105-121.
Shannon, C.E. (1948). A mathematical theory of communication. Bell System Technical Journal. 27:379-423.
Shioya, H., Da-Te, T. (1995). A generalization of Lin divergence and the derivation of a new information divergence. Electronics and Communications in Japan (Part III: Fundamental Electronic Science). 78(7):34-40.
Vasicek, O. (1976). A test for normality based on sample entropy. Journal of the Royal Statistical Society, Series B. 38(1):54-59.
Von Mises, R. (1931). Wahrscheinlichkeitsrechnung und ihre Anwendung in der Statistik und theoretischen Physik. F. Deuticke, Leipzig.
Wieczorkowski, R., Grzegorzewski, P. (1999). Entropy estimators - improvements and comparisons. Communications in Statistics - Simulation and Computation. 28:541-567.
Zamanzade, E., Arghami, N.R. (2011). Goodness of fit test based on correcting moments of modified entropy estimator. Journal of Statistical Computation and Simulation. 81:2077-2093.
Volume 18, Issue 2
December 2019
Pages 39-61