1. Ardakani, O. M., Ebrahimi, N., and Soofi, E. S. (2018), Ranking forecasts by stochastic error distance, information and reliability measures. International Statistical Review, 86(3), 442-468. [DOI:10.1111/insr.12250]
2. Ardakani, O. M., Asadi, M., Ebrahimi, N., and Soofi, E. S. (2020), MR plot: A big data tool for distinguishing distributions. Statistical Analysis and Data Mining: The ASA Data Science Journal, 13, 405-418. [DOI:10.1002/sam.11464]
3. Asadi, M. and Zohrevand, Y. (2007), On the dynamic cumulative residual entropy. Journal of Statistical Planning and Inference, 137, 1931-1941. [DOI:10.1016/j.jspi.2006.06.035]
4. Asadi, M., Ebrahimi, N., and Soofi, E. S. (2017), Connections of Gini, Fisher, and Shannon by Bayes risk under proportional hazards. Journal of Applied Probability, 54, 1027-1050. [DOI:10.1017/jpr.2017.51]
5. Asadi, M., Ebrahimi, N., and Soofi, E. S. (2018), Optimal hazard models based on partial information. European Journal of Operational Research, 270(2), 1-11. [DOI:10.1016/j.ejor.2018.04.006]
6. Asadi, M., Ebrahimi, N., and Soofi, E. S. (2019), The alpha-mixture of survival functions. Journal of Applied Probability, 56(4), 1151-1167. [DOI:10.1017/jpr.2019.72]
7. Asadi, M., Ebrahimi, N., Kharazmi, O., and Soofi, E. S. (2019), Mixture models, Bayes Fisher information, and divergence measures. IEEE Transactions on Information Theory, 65, 2316-2321. [DOI:10.1109/TIT.2018.2877608]
8. Asadi, M., Ebrahimi, N., Soofi, E. S., and Zohrevand, Y. (2016), Jensen-Shannon information of the coherent system lifetime. Reliability Engineering and System Safety, 156(C), 244-255. [DOI:10.1016/j.ress.2016.07.015]
9. Bajgiran, A. H., Mardikoraem, M., and Soofi, E. S. (2021), Maximum entropy distributions with quantile information. European Journal of Operational Research, 290(1), 196-209. [DOI:10.1016/j.ejor.2020.07.052]
10. Barlow, R. E., Marshall, A. W., and Proschan, F. (1963), Properties of probability distributions with monotone hazard rate. Annals of Mathematical Statistics, 34, 375-389. [DOI:10.1214/aoms/1177704147]
11. Behboodian, J. (1970), On a mixture of normal distributions. Biometrika, 57(1), 215-217. [DOI:10.1093/biomet/57.1.215]
12. Behboodian, J. (1972), On the distribution of a symmetric statistic from a mixed population. Technometrics, 14, 919-923. [DOI:10.1080/00401706.1972.10488987]
13. Beheshti, N., Racine, J. S., and Soofi, E. S. (2019), Information measures of kernel estimation. Econometric Reviews, 38(1), 47-68. [DOI:10.1080/07474938.2016.1222236]
14. Bercher, J. F. (2011), Escort entropies and divergences and related canonical distribution. Physics Letters A, 375, 2969-2973. [DOI:10.1016/j.physleta.2011.06.057]
15. Bercher, J. F. (2012), A simple probabilistic construction yielding generalized entropies and divergences, escort distributions and q-Gaussians. Physica A: Statistical Mechanics and its Applications, 391(19), 4460-4469. [DOI:10.1016/j.physa.2012.04.024]
16. Bissiri, P. G., Holmes, C. C., and Walker, S. G. (2016), A general framework for updating belief distributions. Journal of the Royal Statistical Society Series B, 78(5), 1103-1130. [DOI:10.1111/rssb.12158]
17. Block, H. W., and Savits, T. H. (1997), Burn-in. Statistical Science, 12, 1-19. [DOI:10.1214/ss/1029963258]
18. Chernoff, H. (1952), A measure of asymptotic efficiency of tests of a hypothesis based on the sum of observations. Annals of Mathematical Statistics, 23, 493-507. [DOI:10.1214/aoms/1177729330]
19. Cover, T. M. and Thomas, J. A. (2006), Elements of Information Theory, 2nd ed. New York: Wiley.
20. Harremoës, P. (2001), Binomial and Poisson distributions as maximum entropy distributions. IEEE Transactions on Information Theory, 47, 2039-2041. [DOI:10.1109/18.930936]
21. Holmes, C. C., and Walker, S. G. (2017), Assigning a value to a power likelihood in a general Bayesian model. Biometrika, 104, 497-503.
22. Ibrahim, J. G., and Chen, M. H. (2000), Power prior distributions for regression models. Statistical Science, 15, 46-60.
23. Ibrahim, J. G., Chen, M. H., and Sinha, D. (2003), On optimality of the power prior. Journal of the American Statistical Association, 98, 204-213. [DOI:10.1198/016214503388619229]
24. Kochar, S., Mukerjee, H., and Samaniego, F. J. (1999), The "signature" of a coherent system and its application to comparisons among systems. Naval Research Logistics, 46(5), 507-523. [DOI:10.1002/(SICI)1520-6750(199908)46:5<507::AID-NAV4>3.0.CO;2-D]
25. Kullback, S. (1959), Information theory and statistics. New York: Wiley (reprinted in 1968 by Dover).
26. Li, Q., and Racine, J. S. (2007), Nonparametric Econometrics: Theory and Practice. New Jersey: Princeton University Press.
27. Lin, J. (1991), Divergence measures based on the Shannon entropy. IEEE Transactions on Information Theory, 37, 145-151. [DOI:10.1109/18.61115]
28. Lindley, D. V. (1956), On a measure of the information provided by an experiment. Annals of Mathematical Statistics, 27, 986-1005. [DOI:10.1214/aoms/1177728069]
29. Lynn, N. J., and Singpurwalla, N. D. (1997), Comment: "Burn-in" makes us feel good. Statistical Science, 12, 13-19.
30. McCulloch, R. E. (1989), Local model influence. Journal of the American Statistical Association, 84, 473-478. [DOI:10.1080/01621459.1989.10478793]
31. McVinish, R., Rousseau, J., and Mengersen, K. (2009), Bayesian goodness of fit testing with mixtures of triangular distributions. Scandinavian Journal of Statistics, 36, 337-354.
32. Navarro, J., and Rychlik, T. (2007), Reliability and expectation bounds for coherent systems with exchangeable components. Journal of Multivariate Analysis, 98(1), 102-113. [DOI:10.1016/j.jmva.2005.09.003]
33. Rao, M., Chen, Y., Vemuri, B. C., and Wang, F. (2004), Cumulative residual entropy: A new measure of information. IEEE Transactions on Information Theory, 50, 1220-1228. [DOI:10.1109/TIT.2004.828057]
34. Samaniego, F. J. (1985), On closure of the IFR class under formation of coherent systems. IEEE Transactions on Reliability, R-34(1), 69-72. [DOI:10.1109/TR.1985.5221935]
35. Samaniego, F. J. (2007), System Signatures and Their Applications in Engineering Reliability. New York: Springer.
36. Shaked, M., and Shanthikumar, J. G. (2007), Stochastic Orders. New York: Springer.
37. Shaked, M., and Suarez-Llorens, A. (2003), On the comparison of reliability experiments based on the convolution order. Journal of the American Statistical Association, 98(463), 693-702. [DOI:10.1198/016214503000000602]
38. Shoja, M., and Soofi, E. S. (2017), Uncertainty, information, and disagreement of economic forecasters. Econometric Reviews, 36(6-9), 796-817. [DOI:10.1080/07474938.2017.1307577]
39. Soofi, E. S., Ebrahimi, N., and Habibullah, M. (1995), Information distinguishability with application to analysis of failure data. Journal of the American Statistical Association, 90, 657-668. [DOI:10.1080/01621459.1995.10476560]
40. Tsallis, C. (1998), Generalized entropy-based criterion for consistent testing. Physical Review E, 58, 1442-1445. [DOI:10.1103/PhysRevE.58.1442]
41. Vakili-Nezhaad, G. R., and Mansoori, G. A. (2004), An application of non-extensive statistical mechanics to nanosystems. Journal of Computational and Theoretical Nanoscience, 1, 233-235. [DOI:10.1166/jctn.2004.021]
42. van Erven, T., and Harremoës, P. (2014), Rényi divergence and Kullback-Leibler divergence. IEEE Transactions on Information Theory, 60, 3797-3820. [DOI:10.1109/TIT.2014.2320500]
43. Walker, S. G. (2016), Bayesian information in an experiment and the Fisher information distance. Statistics and Probability Letters, 112, 5-9. [DOI:10.1016/j.spl.2016.01.014]
44. Wang, W., and Lahiri, K. (2021), Estimating macroeconomic uncertainty and discord using info-metrics. In Innovations in Info-Metrics: A Cross-Disciplinary Perspective on Information and Information Processing, 1-55.
45. Wang, L., and Madiman, M. (2014), Beyond the entropy power inequality, via rearrangements. IEEE Transactions on Information Theory, 60(9), 5116-5137. [DOI:10.1109/TIT.2014.2338852]