Let $ B(n, p) $ denote a binomial random variable with parameters $ n $ and $ p $. Janson [1] introduced the following conjecture, suggested by Va\v{s}ek Chvátal.
Conjecture 1 (Chvátal). For any fixed $ n\geq 2 $, as $ m $ ranges over $ \{0, \ldots, n\} $, the probability $ q_m: =P(B(n, m/n)\leq m) $ is the smallest when $ m $ is closest to $ \frac{2n}{3} $.
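Chvátal's conjecture is easy to check numerically. The following sketch (our own illustration, not part of the original argument) computes $ q_m $ for $ n=30 $ and confirms that the minimizer is $ m=20 $, the integer closest to $ \frac{2n}{3} $:

```python
import math

def binom_cdf(n, p, m):
    # P(B(n, p) <= m) for a binomial random variable with parameters n and p
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(m + 1))

n = 30
q = [binom_cdf(n, m / n, m) for m in range(n + 1)]
m_star = min(range(n + 1), key=q.__getitem__)
print(m_star)  # the minimizer, which Chvatal's theorem says is the m closest to 2n/3
```

Replacing $ n $ by any other small value lets one inspect the whole profile of $ q_m $.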
Conjecture 1 has applications in machine learning, such as the analysis of generalization bounds based on relative deviation bounds and unbounded loss functions ([2] and [3]). For the probability of a binomial random variable exceeding its expectation, we refer to Doerr [2], Greenberg and Mohri [3], and Pelekis and Ramon [4]. Janson [1] proved that Conjecture 1 holds for large $ n $. Barabesi et al. [5] and Sun [6] gave an affirmative answer to Conjecture 1 for general $ n\geq 2 $. Hereafter, we refer to Conjecture 1 as Chvátal's theorem.
Motivated by Chvátal's theorem, Li et al. [7] considered the infimum value problem on the probability that a random variable is not more than its expectation, when its distribution is the Poisson distribution, the geometric distribution or the Pascal distribution. Sun et al. [8] investigated, among other things, the corresponding infimum value problem for the Gamma distribution. Hu et al. [9] studied the corresponding problem for some infinitely divisible distributions, including the inverse Gaussian, log-normal, Gumbel and logistic distributions. In this note, we consider the infimum value problem for the Weibull distribution and the Pareto distribution in Sections 2 and 3, respectively. Before presenting the main results, we give two remarks.
Remark 1.1 (ⅰ) In view of Chvátal's theorem, a natural question arises:
For any fixed integer $ n\geq 2 $, what is the minimum value of the probability $ P(B(n, p)\leq np) $ for $ p\in (0, 1] $?
For small fixed $ n $, the solution can be found directly; however, the solution for general $ n\geq 2 $ is still unknown. In fact, it was posed as an open question in the first version of [7] (i.e., [10]).
(ⅱ) Motivated by Chvátal's theorem, Li et al. [7] initiated the study of the infimum value problem on the probability that a random variable is not more than its expectation. On this topic, [8] is the second paper, this note is the third, and [9] is the fourth.
Remark 1.2 Let $ X $ be a random variable with expectation $ EX $, and assume that the distribution of $ X $ involves parameters $ \alpha $ and $ \beta $. The motivation to study $ \inf_{\alpha, \beta}P(X\leq EX) $ is that it immediately yields $ \sup_{\alpha, \beta}P(X > EX) $: if we wish the probability $ P(X > EX) $ to be as large as possible, we should find $ \sup_{\alpha, \beta}P(X > EX) $, or equivalently $ \inf_{\alpha, \beta}P(X\leq EX) $. We hope that our work on this topic may find applications in machine learning, statistics, finance, economics, etc.
Let $ X $ be a Weibull random variable with parameters $ \alpha $ and $ \theta\, (\alpha > 0, \theta > 0) $ and the density function
We know that its expectation $ EX=\theta ^{\frac{1}{\alpha }}\Gamma \left (\frac{1}{\alpha } +1\right), $ where $ \Gamma \left (\frac{1}{\alpha } +1\right) $ is the Gamma function, i.e., $ \Gamma \left (\frac{1}{\alpha } +1\right)=\int_0^{\infty}u^{\frac{1}{\alpha}}e^{-u}du $. For any given real number $ \kappa > 0, $ we have
By the change of variable $ x= \left (\theta t \right)^{\frac{1}{\alpha }} $, we get
which shows that $ P\left (X\leq \kappa EX \right) $ is independent of $ \theta $.
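As a sanity check, one can compare Monte Carlo estimates of $ P(X\leq \kappa EX) $ for two different values of $ \theta $ with the closed form $ 1-e^{-(\kappa \Gamma (\frac{1}{\alpha}+1))^{\alpha}} $. The sketch below assumes the parametrization $ F(x)=1-e^{-x^{\alpha}/\theta} $ ($ x>0 $), which is consistent with $ EX=\theta^{1/\alpha}\Gamma(\frac{1}{\alpha}+1) $; the helper name and sample size are our own choices.

```python
import math
import random

def p_le_kappa_mean(alpha, theta, kappa, n=200_000, seed=1):
    # Monte Carlo estimate of P(X <= kappa*EX), where X = (theta*E)^(1/alpha) with
    # E ~ Exp(1), i.e. a Weibull variable with the assumed CDF F(x) = 1 - exp(-x**alpha/theta)
    rng = random.Random(seed)
    mean = theta ** (1 / alpha) * math.gamma(1 / alpha + 1)
    hits = sum((theta * rng.expovariate(1.0)) ** (1 / alpha) <= kappa * mean
               for _ in range(n))
    return hits / n

alpha, kappa = 2.0, 1.0
closed_form = 1 - math.exp(-(kappa * math.gamma(1 / alpha + 1)) ** alpha)
est1 = p_le_kappa_mean(alpha, 1.0, kappa)
est2 = p_le_kappa_mean(alpha, 7.5, kappa, seed=2)
print(closed_form, est1, est2)  # both estimates agree with the theta-free closed form
```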
Define a function
The main result of this section is
Proposition 2.1 (ⅰ) If $ \kappa \leq 1 $, then
where $ \gamma $ is Euler's constant, i.e., $ \gamma=\sum_{n=1}^{\infty }\left [\frac{1}{n} -\ln\left (1+\frac{1}{n} \right)\right] $.
(ⅱ) If $ \kappa > 1 $, then
where $ \alpha _{0}\left (\kappa \right)=\frac{1}{x_{0}\left (\kappa \right) -1} $, and $ x_{0}\left (\kappa \right) $ is the unique null point of the function $ \varphi _{\kappa }\left (x \right): = \left (x-1 \right)\psi \left (x \right)-\ln\left (\kappa \Gamma \left (x \right) \right) $ on $ \left (1, \infty \right) $, where $ \psi(x) $ is the digamma function (see Definition 2.3 below).
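The case $ \kappa > 1 $ can be explored numerically. The sketch below (our own illustration) implements the digamma function via the standard recurrence and asymptotic expansion, locates $ x_{0}(\kappa) $ by bisection using the monotonicity of $ \varphi_{\kappa} $ established in the proof, and evaluates the resulting probability $ 1-e^{-(\kappa\Gamma(\frac{1}{\alpha_0}+1))^{\alpha_0}} $; the upper bracket $ 50 $ is our own choice.

```python
import math

def digamma(x):
    # psi(x) via the recurrence psi(x) = psi(x + 1) - 1/x,
    # followed by a standard asymptotic expansion for large arguments
    acc = 0.0
    while x < 10.0:
        acc -= 1.0 / x
        x += 1.0
    s = 1.0 / (x * x)
    return acc + math.log(x) - 0.5 / x - s * (1/12 - s * (1/120 - s / 252))

kappa = 2.0  # any fixed kappa > 1

def phi(x):
    # phi_kappa(x) = (x - 1)*psi(x) - ln(kappa*Gamma(x)), strictly increasing on (1, inf)
    return (x - 1) * digamma(x) - (math.log(kappa) + math.lgamma(x))

lo, hi = 1.0 + 1e-9, 50.0   # phi(1+) = -ln(kappa) < 0; upper bracket is our choice
for _ in range(200):
    mid = (lo + hi) / 2
    if phi(mid) < 0:
        lo = mid
    else:
        hi = mid
x_star = (lo + hi) / 2       # x_0(kappa)
alpha0 = 1 / (x_star - 1)    # alpha_0(kappa)

def h(x):
    # exponent ln(kappa*Gamma(x))/(x - 1), so the probability is 1 - exp(-exp(h(x)))
    return (math.log(kappa) + math.lgamma(x)) / (x - 1)

print(x_star, alpha0, 1 - math.exp(-math.exp(h(x_star))))
```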
Note that $ \left (\kappa \Gamma \left (\frac{1}{\alpha } +1\right)\right)^{\alpha }=e^{\alpha \ln\left (\kappa \Gamma \left (\frac{1}{\alpha }+1 \right)\right)}. $ Let $ x=\frac{1}{\alpha }+1 $, and define the function
Then
and in order to finish the proof of Proposition 2.1, it is enough to prove the following lemma.
Lemma 2.2 (ⅰ) If $ \kappa \leq 1, $ then
where $ \gamma $ is Euler's constant.
where $ x_{0}\left (\kappa \right) $ is the unique null point of the function $ \varphi _{\kappa }\left (x \right): = \left (x-1 \right)\psi \left (x \right)-\ln\left (\kappa \Gamma \left (x \right) \right) $ on $ \left (1, \infty \right), $ where $ \psi(x) $ is the digamma function.
Before giving the proof of Lemma 2.2, we need some preliminaries on the polygamma function.
Definition 2.3 ([11, 1.16]) Let $ m $ be a nonnegative integer. The $ m $-th order polygamma function $ \psi^{(m)} $ is defined by
When $ m=0 $, $ \psi(z): =\psi ^{(0)}(z)=\frac{\mathrm{d} }{\mathrm{d} z}\ln\Gamma(z)=\frac{{\Gamma}'(z)}{\Gamma(z)} $ is called the digamma function.
By [11, 1.7(3)] and [11, 1.9(10)], we know that
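For instance, one standard series representation is $ \psi(z)=-\gamma+\sum_{n=0}^{\infty }\left (\frac{1}{n+1}-\frac{1}{n+z}\right) $; a quick numerical check of it (a truncated sum, our own illustration) is:

```python
GAMMA = 0.5772156649015329  # Euler's constant

def psi_series(z, terms=2_000_000):
    # truncation of the series psi(z) = -gamma + sum_{n>=0} (1/(n+1) - 1/(n+z))
    return -GAMMA + sum(1.0 / (n + 1) - 1.0 / (n + z) for n in range(terms))

v1 = psi_series(1.0)  # psi(1) = -gamma
v2 = psi_series(2.0)  # psi(2) = 1 - gamma
print(v1, v2)
```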
Proof of Lemma 2.2 By (2.2) and Definition 2.3, we have
By (2.5), we get
It follows that the function $ \varphi _{\kappa } \left (x \right) $ is strictly increasing on the interval $ \left (1, \infty \right) $.
Thus, if $ \kappa \leq 1 $, we have $ \varphi _{\kappa }\left (x \right) > \varphi _{\kappa }\left (1 \right)=-\ln\kappa \geq 0, \quad \forall x > 1. $ Then, by (2.6) we get $ {h}'_{\kappa }\left (x \right) > 0, \quad \forall x > 1, $ which implies that the function $ h_{\kappa}\left (x \right) $ is strictly increasing on $ \left (1, +\infty \right). $ Hence the function $ h_{\kappa}\left (x \right) $ has no minimum value on $ \left (1, \infty \right) $ and
By L'Hospital's rule and (2.4), we have
Thus,
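The limit computation can be checked numerically for $ \kappa=1 $: $ \ln\Gamma(x)/(x-1)\to\psi(1)=-\gamma $ as $ x\to 1^{+} $, and, assuming as above that the probability equals $ 1-e^{-(\kappa\Gamma(\frac{1}{\alpha}+1))^{\alpha}} $, the corresponding infimum for $ \kappa=1 $ is $ 1-e^{-e^{-\gamma}} $. A sketch (with Euler's constant hard-coded, our own choice):

```python
import math

gamma = 0.5772156649015329  # Euler's constant

# h_1(x) = ln(Gamma(x))/(x - 1) approaches psi(1) = -gamma as x -> 1+
for eps in (1e-2, 1e-4, 1e-6):
    print(1 + eps, math.lgamma(1 + eps) / eps)

# correspondingly, 1 - exp(-Gamma(1/alpha + 1)**alpha) decreases, as alpha grows,
# toward the infimum 1 - exp(-exp(-gamma))
alpha = 1e6
val = 1 - math.exp(-math.gamma(1 / alpha + 1) ** alpha)
print(val, 1 - math.exp(-math.exp(-gamma)))
```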
If $ \kappa > 1 $, then $ \varphi _{\kappa }\left (1 \right)=-\ln\kappa < 0. $ By [11, 1.18(1)] (Stirling's formula) and [11, 1.18(7)], as $ z\rightarrow \infty $ we have
Since the function $ \varphi _{\kappa } \left (x \right) $ is continuous, by the zero point theorem there exists $ x_{0}\left (\kappa \right) \in (1, \infty) $, depending on $ \kappa $, such that $ \varphi _{\kappa } \left (x_{0}\left (\kappa \right) \right)=0. $ Moreover, combining this with the monotonicity of the function $ \varphi _{\kappa } \left (x \right) $ on the interval $ \left (1, \infty \right) $, we know that $ x_{0}\left (\kappa \right) $ is the unique null point of the function $ \varphi _{\kappa } \left (x \right) $ and
Then, by (2.6) we get
Thus, the function $ h_{\kappa }\left (x \right) $ is strictly decreasing on $ \left (1, x_{0}\left (\kappa \right) \right) $ and strictly increasing on $ \left (x_{0}\left (\kappa \right), +\infty \right) $, which implies that
Therefore,
The proof is complete.
Let $ X $ be a Pareto random variable with parameters $ a $ and $ \theta\, (a > 0, \theta > 0) $ and the density function
When $ \theta > 1 $, the expectation of $ X $ is $ EX=\frac{\theta a}{\theta -1} $. Then, for any given real number $ \kappa > 0 $, we have
which shows that $ P\left (X\leq \kappa EX \right) $ is independent of $ a $. Note that, in order for the above equality to make sense, if $ \kappa < 1 $, the parameter $ \theta $ should satisfy $ 1 < \theta\le \frac{1}{1-\kappa} $; and if $ \kappa\ge 1 $, it should satisfy $ \theta > 1 $.
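A quick Monte Carlo check (our own sketch) confirms that $ P(X\leq \kappa EX)=1-\left (\frac{\theta-1}{\kappa\theta}\right)^{\theta} $ does not depend on $ a $; it uses inverse-transform sampling $ X=aU^{-1/\theta} $ with $ U $ uniform on $ (0, 1) $, and the helper name and sample size are our own choices.

```python
import math
import random

def p_le_kappa_mean(a, theta, kappa, n=200_000, seed=1):
    # Monte Carlo estimate of P(X <= kappa*EX) for a Pareto variable with
    # density theta * a**theta / x**(theta + 1) on [a, inf), so EX = theta*a/(theta - 1);
    # inverse-transform sampling: X = a * U**(-1/theta), U uniform on (0, 1)
    rng = random.Random(seed)
    mean = theta * a / (theta - 1)
    hits = sum(a * (1 - rng.random()) ** (-1 / theta) <= kappa * mean
               for _ in range(n))
    return hits / n

theta, kappa = 3.0, 1.0
closed_form = 1 - ((theta - 1) / (kappa * theta)) ** theta
est1 = p_le_kappa_mean(1.0, theta, kappa)
est2 = p_le_kappa_mean(10.0, theta, kappa, seed=2)
print(closed_form, est1, est2)  # both estimates agree with the a-free closed form
```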
Proposition 3.1 (ⅰ) If $ \kappa < 1 $, then
(ⅱ) If $ \kappa=1 $, then $ \inf\limits_{\theta \in (1, \infty)}g_1(\theta)=\lim\limits_{\theta\to \infty}g_1(\theta)=1-e^{-1} $.
(ⅲ) If $ \kappa > 1 $, then $ \min\limits_{\theta \in \left (1, \infty \right) }g_{\kappa }\left (\theta \right)= g_{\kappa}\left (\theta_{0}\left (\kappa \right) \right), $ where $ \theta _{0}\left (\kappa \right)=\frac{1}{1-x_{0}\left (\kappa \right) } $, and $ x_{0}\left (\kappa \right) $ is the unique null point of the function $ \varphi _{\kappa }\left (x \right): = 1-\frac{1}{x}-\ln\frac{x}{\kappa } $ on the interval $ \left (0, 1 \right) $.
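Parts (ⅱ) and (ⅲ) can be verified numerically: $ g_{1}(\theta) $ approaches $ 1-e^{-1} $ for large $ \theta $, and for $ \kappa>1 $ the null point $ x_{0}(\kappa) $ of $ \varphi_{\kappa}(x)=1-\frac{1}{x}-\ln\frac{x}{\kappa} $ can be located by bisection (a sketch; the bracketing endpoints are our own choices):

```python
import math

def g(kappa, theta):
    # g_kappa(theta) = 1 - ((theta - 1)/(kappa*theta))**theta for theta > 1
    return 1 - ((theta - 1) / (kappa * theta)) ** theta

# (ii): for kappa = 1, g_1(theta) decreases to 1 - 1/e as theta grows
print(g(1.0, 1e6), 1 - math.exp(-1))

# (iii): for kappa > 1, bisect phi_kappa(x) = 1 - 1/x - ln(x/kappa) on (0, 1);
# it is strictly increasing there, negative near 0, and equal to ln(kappa) > 0 at x = 1
kappa = 2.0
lo, hi = 1e-12, 1.0
for _ in range(200):
    mid = (lo + hi) / 2
    if 1 - 1 / mid - math.log(mid / kappa) < 0:
        lo = mid
    else:
        hi = mid
x_star = (lo + hi) / 2      # x_0(kappa)
theta0 = 1 / (1 - x_star)   # theta_0(kappa)
print(x_star, theta0, g(kappa, theta0))  # minimizing point and minimal probability
```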
Note that $ \left (\frac{\theta -1}{\kappa \theta }\right)^{\theta }=e^{\theta \ln\frac{\theta -1}{\kappa \theta }} $. Let $ x=1-\frac{1}{\theta } $ and define the function
and in order to finish the proof of Proposition 3.1, it is enough to prove the following lemma.
Lemma 3.2
(ⅰ) If $ \kappa < 1 $, then $ \min\limits_{x\in (0, \kappa] }h_{\kappa}(x)=h_{\kappa}(\kappa)=0. $
(ⅱ) If $ \kappa=1 $, then $ \inf\limits_{x\in (0, 1) }h_1(x)=\lim\limits_{x\to 1^-}h_1(x)=1 $.
(ⅲ) If $ \kappa > 1 $, then $ \min\limits_{x\in (0, 1) }h_{\kappa}(x)=h_{\kappa}\left (x_{0}\left (\kappa \right) \right), $ where $ x_{0}\left (\kappa \right) $ is the unique null point of the function $ \varphi _{\kappa }\left (x \right): = 1-\frac{1}{x}-\ln\frac{x}{\kappa } $ on the interval $ \left (0, 1 \right) $.
Proof (ⅰ) If $ \kappa < 1 $, by (3.1) we have
By the definition of $ \varphi _{\kappa }\left (x \right) $, we get that
It follows that the function $ \varphi _{\kappa }\left (x \right) $ is strictly increasing on $ \left (0, \kappa \right] $ and thus
Then, by (3.3) and (3.4) we get $ {h}'_{\kappa }\left (x \right) < 0, \quad \forall 0 < x\leq \kappa, $ which implies that the function $ h_{\kappa}\left (x \right) $ is strictly decreasing on $ \left (0, \kappa \right] $. Thus $ \underset{x\in \left (0, \kappa \right]}{\min}h_{\kappa}\left (x \right)=h_{\kappa}\left (\kappa \right)=0. $
If $ \kappa \geq 1 $, by (3.1) and the definition of $ \varphi_{\kappa}(x) $ again, we also have
and
It follows that the function $ \varphi _{\kappa }\left (x \right) $ is strictly increasing on $ \left (0, 1 \right) $.
(ⅱ) If $ \kappa = 1 $, then
By (3.5), we get that $ {h}'_{\kappa }\left (x \right) < 0, \quad \forall 0 < x < 1, $ which implies that the function $ h_{\kappa}\left (x \right) $ is strictly decreasing on $ \left (0, 1 \right) $. Thus,
(ⅲ) If $ \kappa > 1 $, then $ \varphi _{\kappa }\left (1 \right)=\ln \kappa > 0 $. Moreover,
Since the function $ \varphi _{\kappa } \left (x \right) $ is continuous on $ (0, 1) $, by the zero point theorem there exists $ x_{0}\left (\kappa \right) \in (0, 1) $, depending on $ \kappa $, such that $ \varphi _{\kappa } \left (x_{0}\left (\kappa \right) \right)=0. $ By the monotonicity of the function $ \varphi _{\kappa } \left (x \right) $ on $ \left (0, 1 \right) $, we know that $ x_{0}\left (\kappa \right) $ is the unique null point of $ \varphi _{\kappa } \left (x \right) $ and
Then, by (3.5) we have
Therefore, the function $ h_{\kappa }\left (x \right) $ is strictly decreasing on $ \left (0, x_{0}\left (\kappa \right) \right) $ and is strictly increasing on $ \left (x_{0}\left (\kappa \right), 1 \right) $. Thus