Some results on Tsallis entropy measure and k-record values
Vikas Kumar
Department of Applied Sciences, UIET, M. D. University, Rohtak-124001, India
Highlights
• We propose the non-additive Tsallis entropy measure for k-record statistics from some continuous probability models.
• We study this measure for various distributions used in lifetime testing, survival analysis, image processing and signal processing.
• We also define the residual Tsallis entropy of k-record statistics and study some characterization results for it.
Article history: Received 27 October 2015; received in revised form 20 April 2016.

Abstract
Extensive and non-extensive statistical mechanics arise from the additivity and non-additivity, respectively, of the corresponding entropy measures. Non-additive entropy measures are important for many applications. In this article, we consider and study the non-additive Tsallis entropy of k-record statistics from some continuous probability models. Furthermore, we prove a characterization result for the Tsallis entropy of k-record values. Finally, we study the residual Tsallis entropy of k-record statistics. © 2016 Elsevier B.V. All rights reserved.
Keywords: Shannon entropy; k-record values; Residual entropy; Characterization
1. Introduction
Let {X_i, i ≥ 1} be a sequence of independent and identically distributed (i.i.d.) random variables with a common absolutely continuous cumulative distribution function (cdf) F(x) and probability density function (pdf) f(x). The order statistics of a sample are obtained by arranging X_1, X_2, ..., X_n from the smallest to the largest, denoted X_{1:n}, X_{2:n}, ..., X_{n:n}. An observation X_j is called an upper record value if it exceeds all previous observations, that is, if X_j > X_i for every i < j; an analogous definition can be given for lower record values. The upper k-record process is defined in terms of the kth largest value yet seen. Define T_{1,k} = k and, for n ≥ 2,

T_{n,k} = \min\{ j : j > T_{n-1,k},\; X_{j-k+1:j} > X_{T_{n-1,k}-k+1:T_{n-1,k}} \},

and set Y_{n,k} = X_{T_{n,k}-k+1:T_{n,k}}, n ≥ 1, where X_{i:n} denotes the ith order statistic in a sample of size n. The sequence {Y_{n,k}, n ≥ 1} is referred to as the sequence of k-record values; for k = 1 we obviously recover the ordinary record values. The model of k-record values was proposed by Dziubdziela and Kopocinski [1]. The probability density function and the distribution function of the nth upper k-record are given by
f_{n,k}(x) = \frac{k^{n}}{\Gamma(n)} \left[-\log \bar{F}(x)\right]^{n-1} \left[\bar{F}(x)\right]^{k-1} f(x),    (1.1)
and
F_{n,k}(x) = \int_{0}^{-k \log \bar{F}(x)} \frac{u^{n-1} e^{-u}}{\Gamma(n)}\, du,    (1.2)
respectively, where Γ(·) is the complete gamma function; see Arnold et al. [2].
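To make the definition of the k-record process concrete, a minimal Python sketch (illustrative only, not part of the original model) that extracts the upper k-record values Y_{n,k} from an observed sequence:

```python
import numpy as np

def upper_k_records(x, k):
    """Extract the upper k-record values Y_{n,k} from the sequence x.

    At each time j >= k we look at X_{j-k+1:j}, the kth largest of the
    first j observations; a new k-record occurs whenever this value
    strictly exceeds the current k-record.
    """
    records = []
    for j in range(k, len(x) + 1):
        kth_largest = np.sort(x[:j])[-k]          # order statistic X_{j-k+1:j}
        if not records or kth_largest > records[-1]:
            records.append(kth_largest)
    return records

rng = np.random.default_rng(0)
y = upper_k_records(rng.exponential(size=1000), k=3)   # k = 1 gives ordinary records
```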
Records arise naturally in a wide range of problems, including seismology, sporting and athletic events, meteorological analysis, industrial stress testing, hydrology, oil and mining surveys and the characterization of probability distributions; see Refs. [3,2,4]. In reliability theory, order statistics and record values are used for statistical modeling: the (n − k + 1)th order statistic in a sample of size n represents the lifetime of a k-out-of-n system, and therefore the nth k-record value is just the lifetime of a k-out-of-T_{n,k} system. In recent decades, records have found many interesting applications in physics, including random walks, Brownian motion and related interface models; see Refs. [5,6]. One important application of records is the construction of median filters for image and signal processing. Considering the importance of the Tsallis entropy, order statistics and record values, we extend the concept of Tsallis entropy to record values, which can be further used in image and signal processing in physics.

Information is an essential notion in our everyday life, and a quantitative approach to information is fundamental to science and technology. A growing impact is witnessed at the intersection with physics, where informational concepts are making progress, for instance, in neural biophysics, thermodynamics, statistical physics, complexity science, nonlinear physics and the physics of information; see Refs. [7–12]. A very important quantitative approach to information, expressed in a statistical framework, is based on the concept of entropy. The concept of entropy is important in many areas such as physics, probability and statistics, communication theory and economics. The idea of information-theoretic entropy was first introduced by Shannon [13]; it is defined as
H(X) = -\int_{0}^{\infty} f(x) \log f(x)\, dx,    (1.3)
where f(x) is the pdf of the random variable X. It was later realized that the Gibbs and Shannon entropy measures are closely related. In the literature many authors have generalized (1.3) in different ways and studied various properties, including additivity properties and characterization theorems. A well-known parametric extension of the Shannon entropy measure was defined by Havrda and Charvat [14] as

H_{\alpha}(X) = \frac{1}{1-\alpha} \left( \int_{0}^{\infty} f^{\alpha}(x)\, dx - 1 \right), \qquad \alpha \neq 1,\; \alpha > 0.    (1.4)
This entropy is similar to the Shannon entropy except for its non-additive nature: for independent random variables X and Y,

H_{\alpha}(X, Y) = H_{\alpha}(X) + H_{\alpha}(Y) + (1-\alpha) H_{\alpha}(X) H_{\alpha}(Y).
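This pseudo-additivity is easy to verify numerically. A minimal Python sketch for two independent uniform variables (the helper tsallis and all parameter values are illustrative, not from the paper):

```python
import numpy as np
from scipy import integrate

def tsallis(pdf, support, alpha):
    """Tsallis entropy, Eq. (1.4), by numerical integration."""
    val, _ = integrate.quad(lambda x: pdf(x) ** alpha, *support)
    return (val - 1.0) / (1.0 - alpha)

a, b, alpha = 2.0, 3.0, 0.7
Hx = tsallis(lambda x: 1.0 / a, (0.0, a), alpha)           # X ~ U(0, a)
Hy = tsallis(lambda x: 1.0 / b, (0.0, b), alpha)           # Y ~ U(0, b)
Hxy = ((a * b) ** (1.0 - alpha) - 1.0) / (1.0 - alpha)     # joint: uniform on (0,a)x(0,b)
assert np.isclose(Hxy, Hx + Hy + (1.0 - alpha) * Hx * Hy)  # pseudo-additivity
```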
Clearly, as α → 1, (1.4) reduces to (1.3). Although the entropy measure (1.4) was first introduced by Havrda and Charvat [14] in the context of cybernetics, it was Tsallis [15] who exploited its non-extensive features and placed it in a physical setting. Hence, (1.4) is also known as the Tsallis entropy, which is considered a useful measure for describing the thermostatistical properties of a certain class of physical systems that entail long-range interactions, long-term memories and multifractal structures. The Tsallis entropy has developed much closer connections with physics, since it has been postulated and tested as the basis of a non-extensive generalization of statistical mechanics; see Refs. [16,15,17]. It has been used in many physical applications, such as developing the statistical mechanics of large-scale astrophysical systems [18], investigating the thermodynamic properties of stellar self-gravitating systems [19], describing fully developed turbulence [20], image processing [21] and signal processing [22].

Several authors have studied the characterization of the distribution function F(x) based on entropies of order statistics and record values. Raqab and Awad [23,24] obtained a characterization of the generalized Pareto distribution based on the Shannon entropy of k-record statistics. Baratpour et al. [25,26] obtained several characterizations based on the Shannon and Renyi entropies of order statistics and record values. Fashandi and Ahmadi [27] derived characterization results for symmetric distributions based on the Renyi entropy of order statistics, k-record statistics and the FGM family of bivariate distributions. Thapliyal et al. [28] and Kumar [29] established characterizations based on generalized entropy of order statistics and record values. An interesting question is whether we can determine the amount of information contained in a collection of record statistics from a sequence of i.i.d. random variables. In this paper, we extend the Tsallis entropy measure (1.4) to k-record values and obtain some analogous results.

The paper is organized as follows. The Tsallis entropy of k-record values associated with the uniform, exponential, Weibull, Pareto and finite range distributions is presented in Section 2. Section 3 is devoted to a characterization result. The residual Tsallis entropy of k-record values is studied in Section 4.
2. Tsallis entropy of k-record values for specific distributions
We will use the probability integral transformation U = F(X), where U has the standard uniform distribution. This transformation provides the following useful representation of the Tsallis entropy (1.4) of the random variable X:

H_{\alpha}(X) = \frac{1}{1-\alpha} \left( \int_{0}^{1} f^{\alpha-1}\left(F^{-1}(u)\right) du - 1 \right).    (2.1)
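Representation (2.1) is convenient numerically, since the integral runs over the unit interval. A hedged Python sketch (the helper name and the exponential example are ours, not the paper's):

```python
import numpy as np
from scipy import integrate

def tsallis_from_quantile(f_of_Finv, alpha):
    """Tsallis entropy via Eq. (2.1); f_of_Finv(u) = f(F^{-1}(u))."""
    val, _ = integrate.quad(lambda u: f_of_Finv(u) ** (alpha - 1.0), 0.0, 1.0)
    return (val - 1.0) / (1.0 - alpha)

# exponential with rate theta: F^{-1}(u) = -log(1-u)/theta, so f(F^{-1}(u)) = theta*(1-u)
theta, alpha = 2.0, 1.5
H_num = tsallis_from_quantile(lambda u: theta * (1.0 - u), alpha)
H_cf = (theta ** (alpha - 1.0) / alpha - 1.0) / (1.0 - alpha)   # closed form
assert np.isclose(H_num, H_cf)
```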
Next, we prove the following result.
Lemma 2.1. The entropy measure (1.4) of the nth upper k-record value Y_{n,k} can be expressed as

H_{\alpha}(Y_{n,k}) = \frac{1}{1-\alpha} \left[ \frac{k^{n\alpha}\, \Gamma((n-1)\alpha+1)}{[(k-1)\alpha+1]^{(n-1)\alpha+1} (\Gamma(n))^{\alpha}}\, E\left[ f^{\alpha-1}\left\{ F^{-1}\left(1-e^{-V}\right) \right\} \right] - 1 \right],    (2.2)

where V follows a gamma distribution with shape parameter (n-1)\alpha+1 and rate parameter (k-1)\alpha+1, and E denotes the expectation.
Proof. The generalized entropy (1.4) of the nth upper k-record value is defined as

H_{\alpha}(Y_{n,k}) = \frac{1}{1-\alpha} \left( \int_{0}^{\infty} \left(f_{n,k}(x)\right)^{\alpha} dx - 1 \right), \qquad \alpha \neq 1,\; \alpha > 0.
Using (1.1), this can be rewritten as

H_{\alpha}(Y_{n,k}) = \frac{1}{(1-\alpha)(\Gamma(n))^{\alpha}} \left[ k^{n\alpha} \int_{0}^{\infty} \left[-\log \bar{F}(x)\right]^{(n-1)\alpha} \left(\bar{F}(x)\right)^{(k-1)\alpha} f^{\alpha}(x)\, dx - (\Gamma(n))^{\alpha} \right].
Substituting -\log \bar{F}(x) = u, and hence x = F^{-1}(1-e^{-u}), we have

H_{\alpha}(Y_{n,k}) = \frac{1}{(1-\alpha)(\Gamma(n))^{\alpha}} \left[ k^{n\alpha} \int_{0}^{\infty} u^{(n-1)\alpha}\, e^{-u[(k-1)\alpha+1]}\, f^{\alpha-1}\left\{ F^{-1}\left(1-e^{-u}\right) \right\} du - (\Gamma(n))^{\alpha} \right].
It can be rewritten as

H_{\alpha}(Y_{n,k}) = \frac{1}{1-\alpha} \left[ \frac{k^{n\alpha}\, \Gamma((n-1)\alpha+1)}{[(k-1)\alpha+1]^{(n-1)\alpha+1} (\Gamma(n))^{\alpha}}\, E\left[ f^{\alpha-1}\left\{ F^{-1}\left(1-e^{-V}\right) \right\} \right] - 1 \right].

So, the result follows.
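Lemma 2.1 reduces the computation to a single gamma expectation, which invites a Monte Carlo check. A sketch for the exponential distribution, compared against its closed form (the exponential entry of Table 2.1 below); all parameter values are illustrative:

```python
import numpy as np
from scipy.special import gamma as G
from scipy.stats import gamma as gamma_dist

n, k, alpha, theta = 3, 2, 1.5, 2.0
shape, rate = (n - 1) * alpha + 1.0, (k - 1) * alpha + 1.0

# V ~ Gamma(shape, rate) as in Eq. (2.2)
v = gamma_dist.rvs(shape, scale=1.0 / rate, size=10 ** 6,
                   random_state=np.random.default_rng(0))
# exponential(theta): f^{alpha-1}(F^{-1}(1 - e^{-v})) = (theta * e^{-v})^{alpha-1}
expect = np.mean((theta * np.exp(-v)) ** (alpha - 1.0))
H_mc = (k ** (n * alpha) * G(shape) / (rate ** shape * G(n) ** alpha)
        * expect - 1.0) / (1.0 - alpha)

# exact value from the exponential entry of Table 2.1
H_cf = (k ** (n * alpha) * theta ** (alpha - 1.0) * G(shape)
        / (k * alpha) ** shape - G(n) ** alpha) / ((1.0 - alpha) * G(n) ** alpha)
# H_mc and H_cf should agree to Monte Carlo accuracy
```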
Example 2.1. A non-negative random variable X is Weibull distributed if its pdf is

f(x) = \lambda \beta x^{\beta-1} \exp\left(-\lambda x^{\beta}\right), \qquad \lambda, \beta > 0,\; x > 0,

where λ and β are the scale and shape parameters, respectively. The survival function is

\bar{F}(x) = 1 - F(x) = e^{-\lambda x^{\beta}}.
Substituting -\log \bar{F}(x) = u, we observe that x = F^{-1}(1-e^{-u}) = (u/\lambda)^{1/\beta}, and for computing H_{\alpha}(Y_{n,k}) we have

f^{\alpha-1}\left\{ F^{-1}\left(1-e^{-u}\right) \right\} = \left(\beta \lambda^{1/\beta}\right)^{\alpha-1} u^{\frac{(\alpha-1)(\beta-1)}{\beta}}\, e^{-u(\alpha-1)}.
Lemma 2.1 gives

H_{\alpha}(Y_{n,k}) = \frac{1}{(1-\alpha)(\Gamma(n))^{\alpha}} \left[ \frac{k^{n\alpha} \left(\beta \lambda^{1/\beta}\right)^{\alpha-1} \Gamma\left(n\alpha - \frac{\alpha-1}{\beta}\right)}{(k\alpha)^{\,n\alpha - \frac{\alpha-1}{\beta}}} - (\Gamma(n))^{\alpha} \right].    (2.3)
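Expression (2.3) can be cross-checked by integrating (f_{n,k})^α numerically; a short sketch under assumed parameter values:

```python
import numpy as np
from scipy import integrate
from scipy.special import gamma as G

n, k, alpha, lam, beta = 2, 2, 1.5, 1.0, 2.0

sf = lambda x: np.exp(-lam * x ** beta)                          # survival function
f = lambda x: lam * beta * x ** (beta - 1) * sf(x)               # Weibull pdf
f_nk = lambda x: (k ** n / G(n) * (-np.log(sf(x))) ** (n - 1)    # Eq. (1.1)
                  * sf(x) ** (k - 1) * f(x))

H_num = (integrate.quad(lambda x: f_nk(x) ** alpha, 0, np.inf)[0] - 1) / (1 - alpha)

p = n * alpha - (alpha - 1) / beta                               # exponent in Eq. (2.3)
H_cf = (k ** (n * alpha) * (beta * lam ** (1 / beta)) ** (alpha - 1) * G(p)
        / (k * alpha) ** p - G(n) ** alpha) / ((1 - alpha) * G(n) ** alpha)
assert np.isclose(H_num, H_cf)
```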
When k = 1 and α → 1, (2.3) reduces to

\lim_{\alpha \to 1} H_{\alpha}(Y_{n}) = \log \Gamma(n) - \left(n - \frac{1}{\beta}\right) \psi(n) + n - \log \beta - \frac{1}{\beta} \log \lambda,

the Shannon entropy of the nth record value for a Weibull variate. Here ψ, the logarithmic derivative of the gamma function, is known as the psi or digamma function:

\psi(x) = \frac{d}{dx} \log \Gamma(x) = \frac{\Gamma'(x)}{\Gamma(x)}.
The derivatives ψ′, ψ′′, ψ′′′, ... of the digamma function are called polygamma functions in the literature.
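This α → 1 limit can also be verified numerically with scipy's digamma; a sketch with assumed parameters:

```python
import numpy as np
from scipy import integrate
from scipy.special import gamma as G, digamma

n, lam, beta = 3, 1.0, 2.0
sf = lambda x: np.exp(-lam * x ** beta)
f = lambda x: lam * beta * x ** (beta - 1) * sf(x)
f_n = lambda x: (-np.log(sf(x))) ** (n - 1) / G(n) * f(x)      # Eq. (1.1) with k = 1

H_shannon = -integrate.quad(lambda x: f_n(x) * np.log(f_n(x)), 0, np.inf)[0]
H_limit = (np.log(G(n)) - (n - 1 / beta) * digamma(n) + n
           - np.log(beta) - np.log(lam) / beta)
assert np.isclose(H_shannon, H_limit)
```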
For some specific univariate continuous distributions, the expression (2.2) evaluates to the closed forms given in Table 2.1.

Table 2.1. Tsallis entropy H_α(Y_{n,k}) of the nth upper k-record value for some specific distributions.

Uniform, X ~ U(a, b):
H_{\alpha}(Y_{n,k}) = \frac{1}{(1-\alpha)(\Gamma(n))^{\alpha}} \left[ \frac{k^{n\alpha}\, \Gamma((n-1)\alpha+1)}{(b-a)^{\alpha-1}\, ((k-1)\alpha+1)^{(n-1)\alpha+1}} - (\Gamma(n))^{\alpha} \right]

Exponential, X ~ exp(θ):
H_{\alpha}(Y_{n,k}) = \frac{1}{(1-\alpha)(\Gamma(n))^{\alpha}} \left[ \frac{k^{n\alpha}\, \theta^{\alpha-1}\, \Gamma((n-1)\alpha+1)}{(k\alpha)^{(n-1)\alpha+1}} - (\Gamma(n))^{\alpha} \right]

Pareto, X ~ P(a):
H_{\alpha}(Y_{n,k}) = \frac{1}{(1-\alpha)(\Gamma(n))^{\alpha}} \left[ \frac{k^{n\alpha}\, a^{\alpha-1}\, \Gamma((n-1)\alpha+1)}{\left(k\alpha+\frac{\alpha-1}{a}\right)^{(n-1)\alpha+1}} - (\Gamma(n))^{\alpha} \right]

Finite range, X ~ FR(a, b):
H_{\alpha}(Y_{n,k}) = \frac{1}{(1-\alpha)(\Gamma(n))^{\alpha}} \left[ \frac{(ka)^{n\alpha}\, \Gamma((n-1)\alpha+1)}{b^{\alpha-1}\, (ka\alpha-\alpha+1)^{(n-1)\alpha+1}} - (\Gamma(n))^{\alpha} \right]

Rayleigh, X ~ Rayl(λ):
H_{\alpha}(Y_{n,k}) = \frac{1}{(1-\alpha)(\Gamma(n))^{\alpha}} \left[ \frac{k^{n\alpha}\, \left(2\sqrt{\lambda}\right)^{\alpha-1}\, \Gamma\left(n\alpha-\frac{\alpha-1}{2}\right)}{(k\alpha)^{\,n\alpha-\frac{\alpha-1}{2}}} - (\Gamma(n))^{\alpha} \right]
3. Characterization problem
In this section, we show that the distribution function F can be uniquely specified, up to a location shift, by the equality of the Tsallis entropies of k-record values. First, we state the following lemma, due to Goffman and Pedrick [30, pp. 192–193].
Lemma 3.1. A complete orthogonal system for the space L^2(0, ∞) is given by the sequence of Laguerre functions

\varphi_{n}(x) = \frac{1}{n!}\, e^{-x/2} L_{n}(x), \qquad n \geq 0,

where L_n(x) is the Laguerre polynomial, defined as the sum of coefficients of e^{-x} in the nth derivative of x^{n} e^{-x}, that is,

L_{n}(x) = e^{x} \frac{d^{n}}{dx^{n}}\left( x^{n} e^{-x} \right) = \sum_{k=0}^{n} (-1)^{k} \binom{n}{k} \frac{n!}{k!}\, x^{k}.

The completeness of the Laguerre functions in L^2(0, ∞) means that if f ∈ L^2(0, ∞) and \int_{0}^{\infty} f(x)\, e^{-x/2} L_{n}(x)\, dx = 0 for all n ≥ 0, then f is zero almost everywhere.
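The Laguerre system is easy to explore numerically. Note that numpy's Laguerre polynomials are the standard normalized ones, that is, L_n(x)/n! in the notation of the lemma, so φ_n(x) is e^{-x/2} times numpy's nth basis polynomial. A sketch verifying orthonormality on L²(0, ∞):

```python
import numpy as np
from numpy.polynomial import laguerre
from scipy import integrate

def phi(n, x):
    """phi_n(x) = e^{-x/2} L_n(x)/n!, using numpy's normalized Laguerre basis."""
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0
    return np.exp(-x / 2.0) * laguerre.lagval(x, coeffs)

# Gram matrix <phi_m, phi_n> over (0, inf) should be the identity
gram = np.array([[integrate.quad(lambda x: phi(m, x) * phi(n, x), 0, np.inf)[0]
                  for n in range(4)] for m in range(4)])
assert np.allclose(gram, np.eye(4), atol=1e-6)
```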
Theorem 3.1. Let X and Y be two non-negative random variables having common support, and let H_{\alpha}(X_{n,k}) < ∞ and H_{\alpha}(Y_{n,k}) < ∞ be the Tsallis entropies of their upper k-records, respectively. Then F and G belong to the same location family of distributions if, and only if,

H_{\alpha}(X_{n,k}) = H_{\alpha}(Y_{n,k}), \qquad \forall\, n, k \geq 1.
Proof. The necessity part is clear; we need to prove the sufficiency part only. Let H_{\alpha}(X_{n,k}) = H_{\alpha}(Y_{n,k}) for all n, k ≥ 1. We know that

H_{\alpha}(X_{n,k}) = \frac{1}{(1-\alpha)(\Gamma(n))^{\alpha}} \left[ k^{n\alpha} \int_{0}^{\infty} \left[-\log \bar{F}(x)\right]^{(n-1)\alpha} \left(\bar{F}(x)\right)^{(k-1)\alpha} f^{\alpha}(x)\, dx - (\Gamma(n))^{\alpha} \right].    (3.1)
Substituting \{-\log \bar{F}(x)\}^{\alpha} = u, and hence x = F^{-1}\left(1-\exp\left(-u^{1/\alpha}\right)\right), in (3.1) we get

H_{\alpha}(X_{n,k}) = \frac{1}{1-\alpha} \left[ \frac{k^{n\alpha}}{\alpha (\Gamma(n))^{\alpha}} \int_{0}^{\infty} e^{-k u^{1/\alpha}}\, u^{\,n+\frac{1}{\alpha}-2}\, f^{\alpha-1}\left( F^{-1}\left(1-\exp\left(-u^{1/\alpha}\right)\right) \right) du - 1 \right],

H_{\alpha}(Y_{n,k}) = \frac{1}{1-\alpha} \left[ \frac{k^{n\alpha}}{\alpha (\Gamma(n))^{\alpha}} \int_{0}^{\infty} e^{-k u^{1/\alpha}}\, u^{\,n+\frac{1}{\alpha}-2}\, g^{\alpha-1}\left( G^{-1}\left(1-\exp\left(-u^{1/\alpha}\right)\right) \right) du - 1 \right].
If these two expressions coincide for the cdfs F and G, we can conclude that

\int_{0}^{\infty} \left[ f^{\alpha-1}\left( F^{-1}\left(1-e^{-u^{1/\alpha}}\right) \right) - g^{\alpha-1}\left( G^{-1}\left(1-e^{-u^{1/\alpha}}\right) \right) \right] e^{-k u^{1/\alpha}}\, u^{\frac{1}{\alpha}-1}\, u^{n-1}\, du = 0,    (3.2)
for all n ≥ 1. Eq. (3.2) can be converted into a sequence of equations of the form appearing in Lemma 3.1; it can be rewritten as

\int_{0}^{\infty} \left[ f^{\alpha-1}\left( F^{-1}\left(1-e^{-u^{1/\alpha}}\right) \right) - g^{\alpha-1}\left( G^{-1}\left(1-e^{-u^{1/\alpha}}\right) \right) \right] e^{\frac{u}{2} - k u^{1/\alpha}}\, u^{\frac{1}{\alpha}-1}\, e^{-\frac{u}{2}} L_{n}(u)\, du = 0
for all n ≥ 1, where L_n(u) is the Laguerre polynomial given in Lemma 3.1. Applying Lemma 3.1 to the above equation, after some simplification we obtain

f\left( F^{-1}\left(1-\exp\left(-u^{1/\alpha}\right)\right) \right) = g\left( G^{-1}\left(1-\exp\left(-u^{1/\alpha}\right)\right) \right), \qquad \forall\, u \in (0, \infty).

Let 1-\exp\left(-u^{1/\alpha}\right) = v. As \frac{d}{dv} F^{-1}(v) = \frac{1}{f(F^{-1}(v))}, it follows that

(F^{-1})'(v) = (G^{-1})'(v), \qquad \forall\, v \in (0, 1),

and hence

F^{-1}(v) = G^{-1}(v) + c,

where c is a constant. Since the random variables X and Y have common support, the desired result follows.
4. Residual Tsallis entropy of k-record values
In survival analysis and life testing, the current age of the system under consideration is also taken into account. The residual lifetime of a system still operating at time t is X_t = (X - t \mid X > t); Ebrahimi [31] proposed the entropy of the residual lifetime X_t as

H(X; t) = -\int_{t}^{\infty} \frac{f(x)}{\bar{F}(t)} \log \frac{f(x)}{\bar{F}(t)}\, dx, \qquad t > 0.    (4.1)
Similar results for a generalized residual entropy have been derived by Belzunce et al. [32] and Nanda and Paul [33]. The residual Tsallis entropy is given by

H_{\alpha}(X; t) = \frac{1}{1-\alpha} \left( \int_{t}^{\infty} \frac{f^{\alpha}(x)}{\bar{F}^{\alpha}(t)}\, dx - 1 \right), \qquad \alpha > 0,\; \alpha \neq 1.    (4.2)
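A direct numerical sketch of (4.2) follows; for the exponential distribution, memorylessness makes the residual Tsallis entropy independent of t, which provides a simple self-check (all names and parameters are illustrative):

```python
import numpy as np
from scipy import integrate

def residual_tsallis(pdf, sf, t, alpha):
    """Residual Tsallis entropy, Eq. (4.2)."""
    num, _ = integrate.quad(lambda x: pdf(x) ** alpha, t, np.inf)
    return (num / sf(t) ** alpha - 1.0) / (1.0 - alpha)

theta, alpha = 2.0, 0.5
pdf = lambda x: theta * np.exp(-theta * x)
sf = lambda x: np.exp(-theta * x)
# same value for every t, by the memoryless property of the exponential
assert np.isclose(residual_tsallis(pdf, sf, 0.0, alpha),
                  residual_tsallis(pdf, sf, 3.0, alpha))
```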
For more details, refer to Kumar and Taneja [34]. The role of residual entropy as a measure of uncertainty in order statistics and record values has been studied by many researchers; see Zarezadeh and Asadi [35]. Next, we derive the residual Tsallis entropy (4.2) of the upper k-record values. Before the main result we state the following two lemmas, which are easy to prove and whose proofs are hence omitted. First, we need the following notation.

Notation. We write X ∼ Γ_t(n, λ) to indicate that X has a truncated gamma distribution with density function

f(x) = \frac{\lambda^{n} x^{n-1} e^{-\lambda x}}{\Gamma(n; t)}, \qquad x > t > 0,

where \Gamma(n; t), the incomplete gamma function, is defined as \Gamma(n; t) = \int_{t}^{\infty} x^{n-1} e^{-x}\, dx, and n, λ > 0.

Lemma 4.1. Let U^{*}_{n,k} be the nth upper k-record value for a sequence of observations from the uniform distribution on (0, 1). Then

H_{\alpha}(U^{*}_{n,k}; t) = \frac{1}{1-\alpha} \left[ \frac{k^{n\alpha}}{[(k-1)\alpha+1]^{(n-1)\alpha+1}}\, \frac{\Gamma\left[(n-1)\alpha+1;\, -k \log(1-t)\right]}{\Gamma^{\alpha}\left(n;\, -k \log(1-t)\right)} - 1 \right].    (4.3)

Lemma 4.2. Let \bar{U}_{n,k} be the nth upper k-record value for a sequence of observations from the standard exponential distribution. Then

H_{\alpha}(\bar{U}_{n,k}; t) = \frac{1}{1-\alpha} \left[ \frac{k^{n\alpha}}{(k\alpha)^{(n-1)\alpha+1}}\, \frac{\Gamma\left((n-1)\alpha+1;\, kt\right)}{\Gamma^{\alpha}\left(n;\, kt\right)} - 1 \right].    (4.4)
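Both lemmas can be cross-checked by computing H_α(·; t) directly from the k-record density (1.1) and definition (4.2). A generic numerical sketch, applied here to the standard exponential case of Lemma 4.2 (helper names are ours):

```python
import numpy as np
from scipy import integrate
from scipy.special import gamma as G

def residual_tsallis_krecord(pdf, sf, t, n, k, alpha):
    """H_alpha(Y_{n,k}; t) by direct integration of Eqs. (1.1) and (4.2)."""
    f_nk = lambda x: (k ** n / G(n) * (-np.log(sf(x))) ** (n - 1)
                      * sf(x) ** (k - 1) * pdf(x))
    sf_nk_t = integrate.quad(f_nk, t, np.inf)[0]        # survival fn of Y_{n,k} at t
    num = integrate.quad(lambda x: f_nk(x) ** alpha, t, np.inf)[0]
    return (num / sf_nk_t ** alpha - 1.0) / (1.0 - alpha)

# standard exponential, as in Lemma 4.2
H = residual_tsallis_krecord(pdf=lambda x: np.exp(-x), sf=lambda x: np.exp(-x),
                             t=1.0, n=2, k=2, alpha=1.5)
```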
Now we state the main result.

Theorem 4.1. Let {X_i, i ≥ 1} be a sequence of i.i.d. continuous random variables with distribution function F(x), density function f(x) and quantile function F^{-1}(·), and let Y_{n,k}, n, k ≥ 1, denote the nth upper k-record value. Then the residual Tsallis entropy (4.2) of the nth upper k-record value can be expressed as

H_{\alpha}(Y_{n,k}; t) = \frac{1}{1-\alpha} \left[ \frac{k^{n\alpha}}{((k-1)\alpha+1)^{(n-1)\alpha+1}}\, \frac{\Gamma\left((n-1)\alpha+1;\, -k \log \bar{F}(t)\right)}{\Gamma^{\alpha}\left(n;\, -k \log \bar{F}(t)\right)}\, E\left\{ f^{\alpha-1}\left( F^{-1}\left(1-e^{-V_z}\right) \right) \right\} - 1 \right],    (4.5)

where z = -k \log \bar{F}(t), V_z follows a truncated gamma distribution with parameters (n-1)\alpha+1 and (k-1)\alpha+1 truncated at z, and E denotes the expectation.

Proof. The residual Tsallis entropy (4.2) of the nth upper k-record value is defined as
H_{\alpha}(Y_{n,k}; t) = \frac{1}{1-\alpha} \left[ \frac{1}{\bar{F}_{n,k}^{\alpha}(t)} \int_{t}^{\infty} f_{n,k}^{\alpha}(x)\, dx - 1 \right], \qquad \alpha > 0,\; \alpha \neq 1

= \frac{1}{1-\alpha} \left[ \frac{k^{n\alpha} \int_{t}^{\infty} \left[-\log \bar{F}(x)\right]^{(n-1)\alpha} \left(\bar{F}(x)\right)^{(k-1)\alpha} f^{\alpha}(x)\, dx}{\Gamma^{\alpha}\left(n;\, -k \log \bar{F}(t)\right)} - 1 \right],

where f_{n,k} is the pdf of the nth upper k-record value defined in (1.1). Substituting -k \log \bar{F}(x) = u, so that x = F^{-1}\left(1-e^{-u/k}\right), we get

H_{\alpha}(Y_{n,k}; t) = \frac{1}{1-\alpha} \left[ \frac{k^{n\alpha} \int_{-k \log \bar{F}(t)}^{\infty} u^{(n-1)\alpha}\, e^{-u\, \frac{(k-1)\alpha+1}{k}}\, f^{\alpha-1}\left( F^{-1}\left(1-e^{-u/k}\right) \right) du}{k^{(n-1)\alpha+1}\, \Gamma^{\alpha}\left(n;\, -k \log \bar{F}(t)\right)} - 1 \right].
It can be rewritten as

H_{\alpha}(Y_{n,k}; t) = \frac{1}{1-\alpha} \left[ \frac{k^{n\alpha}}{((k-1)\alpha+1)^{(n-1)\alpha+1}}\, \frac{\Gamma\left((n-1)\alpha+1;\, -k \log \bar{F}(t)\right)}{\Gamma^{\alpha}\left(n;\, -k \log \bar{F}(t)\right)}\, E\left\{ f^{\alpha-1}\left( F^{-1}\left(1-e^{-V_z}\right) \right) \right\} - 1 \right].

So, the result follows.
Example 4.1. Let X be a random variable having a Pareto distribution with pdf

f(x) = \frac{\theta \beta^{\theta}}{x^{\theta+1}}, \qquad x \geq \beta > 0,\; \theta > 0.    (4.6)

Substituting -k \log \bar{F}(x) = u, we observe that x = F^{-1}\left(1-e^{-u/k}\right) = \beta e^{u/(k\theta)}, and for computing H_{\alpha}(Y_{n,k}; t) we have

f^{\alpha-1}\left( F^{-1}\left(1-e^{-u/k}\right) \right) = \left(\frac{\theta}{\beta}\right)^{\alpha-1} e^{-\frac{u}{k}(\alpha-1)\left(1+\frac{1}{\theta}\right)}.
Thus, the residual Tsallis entropy (4.2) of the nth upper k-record value for the Pareto distribution is given by

H_{\alpha}(Y_{n,k}; t) = \frac{1}{1-\alpha} \left[ \frac{k^{n\alpha}}{k^{(n-1)\alpha+1}} \left(\frac{\theta}{\beta}\right)^{\alpha-1} \frac{\int_{-k \log \bar{F}(t)}^{\infty} u^{(n-1)\alpha}\, e^{-\frac{u}{k}\, \frac{k\alpha\theta+\alpha-1}{\theta}}\, du}{\Gamma^{\alpha}\left(n;\, -k \log \bar{F}(t)\right)} - 1 \right],

which gives

H_{\alpha}(Y_{n,k}; t) = \frac{1}{1-\alpha} \left[ \frac{(k\theta)^{n\alpha}}{\left[k^{2}\alpha\theta + k(\alpha-1)\right]^{(n-1)\alpha+1}} \left(\frac{1}{\beta}\right)^{\alpha-1} \frac{\Gamma\left((n-1)\alpha+1;\, -k\theta \log \frac{\beta}{t}\right)}{\Gamma^{\alpha}\left(n;\, -k\theta \log \frac{\beta}{t}\right)} - 1 \right].
Remark 4.1. For k = 1, the results reported in this article reduce to the corresponding Tsallis entropy results for ordinary nth record values.
5. Conclusion
Records and order statistics play a crucial role in statistical physics. The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. We have studied the Tsallis entropy and its dynamic (residual) version based on k-record values, and we have discussed the Tsallis entropy of the k-record values associated with various distributions commonly used in reliability modeling. The results studied in this paper can be useful for further exploring information measures based on k-record values.
Acknowledgments
The author is grateful to the Executive editor of the journal and two anonymous referees for their careful reading of the manuscript and constructive comments, which enhanced the presentation of the paper.
References

[1] W. Dziubdziela, B. Kopocinski, Limiting properties of the kth record values, Appl. Math. 15 (1976) 187–190.
[2] B.C. Arnold, N. Balakrishnan, H.N. Nagaraja, Records, Wiley, New York, 1998.
[3] M. Ahsanullah, Record Values-Theory and Applications, University Press of America Inc., New York, 2004.
[4] V. Nevzorov, Records: Mathematical Theory, Translations of Mathematical Monographs, vol. 194, American Mathematical Society, Providence, RI, 2001.
[5] S.N. Majumdar, A. Comtet, Exact maximal height distribution of fluctuating interfaces, Phys. Rev. Lett. 92 (2004) 225501.
[6] G. Schehr, S.N. Majumdar, Universal asymptotic statistics of maximal relative height in one-dimensional solid-on-solid models, Phys. Rev. E 73 (2006) 056103.
[7] G. Deco, An Information-Theoretic Approach to Neural Computing, Springer, Berlin, 1996.
[8] R.B. Frieden, Physics From Fisher Information: A Unification, Cambridge University Press, Cambridge, 1998.
[9] E.T. Jaynes, Information theory and statistical mechanics, Phys. Rev. 106 (1957) 620–630.
[10] S. Kim, Computability of entropy and information in classical Hamiltonian systems, Phys. Lett. A 373 (2009) 1409–1414.
[11] R. Landauer, The physical nature of information, Phys. Lett. A 217 (1996) 188–193.
[12] G. Nicolis, C. Nicolis, Foundations of Complex Systems: Nonlinear Dynamics, Statistical Physics, Information and Prediction, World Scientific, Singapore, 2007.
[13] C.E. Shannon, A mathematical theory of communication, Bell Syst. Tech. J. 27 (1948) 379–432.
[14] J. Havrda, F. Charvat, Quantification method of classification processes: concept of structural α-entropy, Kybernetika 3 (1967) 30–35.
[15] C. Tsallis, Possible generalization of Boltzmann–Gibbs statistics, J. Stat. Phys. 52 (1988) 479–487.
[16] A.S. Parvan, Extensive statistical mechanics based on nonadditive entropy: canonical ensemble, Phys. Lett. A 360 (2006) 26–34.
[17] C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, New York, 2009.
[18] A. Nakamichi, I. Joichi, O. Iguchi, M. Morikawa, Non-extensive galaxy distributions-Tsallis statistical mechanics, Chaos Solitons Fractals 13 (2002) 595–601.
[19] A. Taruya, M. Sakagami, Gravothermal catastrophe and Tsallis' generalized entropy of self-gravitating systems (II): thermodynamic properties of stellar polytrope, Physica A 318 (2003) 387–413.
[20] T. Arimitsu, N. Arimitsu, PDF of velocity fluctuation in turbulence by a statistics based on generalized entropy, Physica A 305 (2002) 218–226.
[21] M. Yu, C. Zhanfang, Z. Hongbiao, Research of automatic medical image segmentation algorithm based on Tsallis entropy and improved PCNN, in: Proceedings of the IEEE International Conference on Mechatronics and Automation (ICMA), 2009, pp. 1004–1008.
[22] S. Tong, A. Bezerianos, J. Paul, Y. Zhu, N. Thakor, Nonextensive entropy measure of EEG following brain injury from cardiac arrest, Physica A 305 (2002) 619–628.
[23] M.Z. Raqab, A.M. Awad, Characterizations of the Pareto and related distributions, Metrika 52 (2000) 63–67.
[24] M.Z. Raqab, A.M. Awad, A note on characterization based on Shannon entropy of record statistics, Statistics 35 (2001) 411–413.
[25] S. Baratpour, J. Ahmadi, N.R. Arghami, Entropy properties of record statistics, Statist. Papers 48 (2007) 197–213.
[26] S. Baratpour, J. Ahmadi, N.R. Arghami, Characterizations based on Renyi entropy of order statistics and record values, J. Statist. Plann. Inference 138 (2008) 2544–2551.
[27] M. Fashandi, J. Ahmadi, Characterizations of symmetric distributions based on Renyi entropy, Statist. Probab. Lett. 82 (2012) 798–804.
[28] R. Thapliyal, H.C. Taneja, V. Kumar, Characterization results based on non-additive entropy of order statistics, Physica A 417 (2015) 297–303.
[29] V. Kumar, Generalized entropy measure in record values and its applications, Statist. Probab. Lett. 106 (2015) 46–51.
[30] C. Goffman, G. Pedrick, First Course in Functional Analysis, Prentice Hall, London, 1965.
[31] N. Ebrahimi, How to measure uncertainty in the residual lifetime distributions, Sankhyā Ser. A 58 (1996) 48–57.
[32] F. Belzunce, J. Navarro, J.M. Ruiz, Y. Aguila, Some results on residual entropy function, Metrika 59 (2004) 147–161.
[33] A.K. Nanda, P. Paul, Some results on generalized residual entropy, Inform. Sci. 176 (2006) 27–47.
[34] V. Kumar, H.C. Taneja, A generalized entropy-based residual lifetime distributions, Int. J. Biomath. 4 (2) (2011) 171–184.
[35] S. Zarezadeh, M. Asadi, Results on residual Renyi entropy of order statistics and record values, Inform. Sci. 180 (2010) 4195–4206.