Journal of Mathematics (PRC) 2024, Vol. 44, Issue (4): 309-316
THE INFIMUM VALUES OF THE PROBABILITY FUNCTIONS FOR SOME INFINITELY DIVISIBLE DISTRIBUTIONS MOTIVATED BY CHVÁTAL'S THEOREM
HU Ze-chun1, LU Peng1, ZHOU Qian-qian2, ZHOU Xing-wang1    
1. College of Mathematics, Sichuan University, Chengdu 610065, China;
2. School of Science, Nanjing University of Posts and Telecommunications, Nanjing 210023, China
Abstract: Motivated by Chvátal's theorem, in this paper we consider the infimum value of the probability $P(X\leq \kappa E[X])$, where $\kappa$ is a positive real number and $X$ is a random variable. By employing analytical methods, we obtain results for random variables whose distributions belong to a family of infinitely divisible distributions, including the inverse Gaussian, log-normal, Gumbel and logistic distributions.
Keywords: Chvátal's theorem; infinitely divisible distribution; inverse Gaussian distribution; log-normal distribution; Gumbel distribution; logistic distribution
1 Introduction

Let $ B(n, p) $ denote a binomial random variable with parameters $ n $ and $ p $. Janson [1] introduced the following conjecture suggested by Vašek Chvátal.

Conjecture 1  (Chvátal). For any fixed $ n\geq 2 $, as $ m $ ranges over $ \{0, \ldots, n\} $, the probability $ q_m:=P(B(n, m/n)\leq m) $ is the smallest when $ m $ is closest to $ \frac{2n}{3} $.

Chvátal's conjecture has applications in machine learning (see Doerr [2], Greenberg and Mohri [3] and the references therein). Janson [1] showed that Conjecture 1 holds for large $ n $. Barabesi et al. [4] and Sun [5] proved that Conjecture 1 is true for general $ n\geq 2 $. Hereafter, we refer to Conjecture 1 as Chvátal's theorem.
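
Although the proofs in [4, 5] are analytic, Chvátal's theorem is easy to probe numerically. The following sketch (our illustration, not part of the original works; it assumes SciPy is available) computes $ q_m $ for all $ m $ and reports the minimizer alongside the integer closest to $ \frac{2n}{3} $.

```python
# Numerical illustration of Chvátal's theorem (a sketch, assuming SciPy):
# for fixed n, find the m in {0, ..., n} minimizing q_m = P(B(n, m/n) <= m).
from scipy.stats import binom

def chvatal_minimizer(n):
    # binom.cdf(m, n, p) = P(B(n, p) <= m); it handles the edge cases p = 0, 1.
    q = [binom.cdf(m, n, m / n) for m in range(n + 1)]
    return min(range(n + 1), key=lambda m: q[m])

for n in [10, 30, 100]:
    # By Chvátal's theorem, the minimizer should be the integer closest to 2n/3.
    print(n, chvatal_minimizer(n), round(2 * n / 3))
```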

Motivated by Chvátal's theorem, Li et al. [6] considered the infimum value problem on the probability that a random variable is not more than its expectation, when its distribution is the Poisson, geometric or Pascal distribution; for the Pascal distribution, only partial results were obtained there. Sun et al. [7] investigated, among other things, the corresponding infimum value problem for the Gamma distribution. Li et al. [8] studied the infimum value problem for the Weibull and Pareto distributions. Guo et al. [9] considered the infimum value of the probability $ P(X\leq E[X]) $, where $ X $ is a negative binomial random variable, and gave an affirmative answer to the conjecture on the Pascal distribution posed in [6].

In this paper, we consider the infimum value of the probability $ P(X\leq \kappa E[X]) $, where $ \kappa $ is a positive real number and $ X $ is a random variable whose distribution belongs to a family of infinitely divisible distributions: the inverse Gaussian distribution is treated in Section 2, and the log-normal, Gumbel and logistic distributions in Section 3. Before presenting the main results, we give a remark.

Remark 1.1  Let $ X_{\alpha} $ be a random variable with finite expectation $ E[X_{\alpha}] $, where $ \alpha $ stands for the parameters of the distribution of $ X_{\alpha} $; it is a real number or a vector in $ \mathbf{R}^n $ for some integer $ n\geq 2 $. We have the following two motivations to study $ \inf_{\alpha}P(X_{\alpha}\leq \kappa E[X_{\alpha}]) $:

● From $ \inf_{\alpha}P(X_{\alpha}\leq \kappa E[X_{\alpha}]) $, we can get $ \sup_{\alpha}P(X_{\alpha}>\kappa E[X_{\alpha}]) $. Obviously, if we wish the probability $ P(X_{\alpha}>\kappa E[X_{\alpha}]) $ to be as large as possible, we should consider $ \sup_{\alpha}P(X_{\alpha}>\kappa E[X_{\alpha}]) $ or, equivalently, $ \inf_{\alpha}P(X_{\alpha}\leq \kappa E[X_{\alpha}]) $. Based on this observation, we hope that our work on this topic may find applications in machine learning, statistics, finance, economics, etc.

● Assume that $ X_{\alpha} $ is nonnegative and denote by $ \mu_{\alpha} $ the distribution of $ X_{\alpha} $. If $ \inf_{\alpha}P(X_{\alpha}\leq \kappa E[X_{\alpha}])=\beta>0 $, then for any $ \alpha $, we have

$ \begin{align*} \mu_{\alpha}\left([0, \kappa E[X_{\alpha}]]\right)\geq \beta, \end{align*} $

which tells us that the family of the distributions $ \{\mu_{\alpha}\} $ possesses a kind of measure concentration phenomenon.

2 Inverse Gaussian Distribution

Let $ X_{\mu, \lambda} $ be an inverse Gaussian random variable with parameters $ \mu $ and $ \lambda $ $ (\mu>0, \lambda>0). $ By [10, Chapter 27], we know that the density function of $ X_{\mu, \lambda} $ is given by

$ f_{\mu , \lambda}(x)=\sqrt{\frac {\lambda}{2\pi x^3}}\exp\left(-\frac{\lambda(x-\mu)^2}{2{\mu}^2x}\right), \ \ x>0, $

and $ E[X_{\mu, \lambda}]=\mu $. Then, for any given real number $ \kappa>0 $, by [10, Chapter 27], we know that

$ \begin{eqnarray*} P(X_{\mu, \lambda}\leq \kappa E[X_{\mu, \lambda}])&=& \int_0^{\kappa \mu} \sqrt{\frac {\lambda}{2\pi x^3}}\exp\left(-\frac{\lambda(x-\mu)^2}{2{\mu}^2x}\right) dx\\ &=&\Phi\left(\sqrt{\frac{\lambda}{\kappa \mu }}(\kappa-1)\right)+\exp\left(\frac{2\lambda}{\mu}\right)\Phi\left(-\sqrt{\frac{\lambda}{\kappa \mu}}(\kappa+1)\right). \end{eqnarray*} $

Hereafter, $ \Phi(\cdot) $ stands for the distribution function of the standard normal distribution.
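
As a sanity check, the closed-form expression above can be compared with direct numerical integration of the density; the following sketch (our illustration, assuming NumPy and SciPy are available) does this for one choice of $ \mu, \lambda, \kappa $.

```python
# Check that the closed form for P(X_{mu,lambda} <= kappa*mu) matches direct
# integration of the inverse Gaussian density (a sketch, assuming SciPy).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def ig_prob_closed_form(mu, lam, kappa):
    # Phi(sqrt(lam/(kappa*mu))*(kappa-1)) + exp(2*lam/mu)*Phi(-sqrt(lam/(kappa*mu))*(kappa+1))
    s = np.sqrt(lam / (kappa * mu))
    return norm.cdf(s * (kappa - 1)) + np.exp(2 * lam / mu) * norm.cdf(-s * (kappa + 1))

def ig_prob_numeric(mu, lam, kappa):
    # Integrate the density f_{mu,lambda} over (0, kappa*mu).
    f = lambda x: np.sqrt(lam / (2 * np.pi * x**3)) * np.exp(-lam * (x - mu)**2 / (2 * mu**2 * x))
    return quad(f, 0, kappa * mu)[0]

mu, lam, kappa = 2.0, 3.0, 1.5
print(ig_prob_closed_form(mu, lam, kappa))  # the two values should agree
print(ig_prob_numeric(mu, lam, kappa))
```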

Denote $ x:=\sqrt{\frac{\lambda}{\mu}} $. Then

$ P(X_{\mu, \lambda}\leq \kappa E[X_{\mu, \lambda}])=\Phi\left(\sqrt{\frac{1}{\kappa}}(\kappa-1)x\right)+e^{2x^2} \Phi\left(-\sqrt{\frac{1}{\kappa}}(\kappa+1)x\right). $

Define a function

$ \begin{gather} g_{\kappa}(x):=\Phi\left(\sqrt{\frac{1}{\kappa}}(\kappa-1)x\right)+e^{2x^2} \Phi\left(-\sqrt{\frac{1}{\kappa}}(\kappa+1)x\right), \ x>0. \end{gather} $ (1.1)

The main result of this section is

Theorem 2.1  (i) If $ \kappa \leq 1 $, then

$ \begin{eqnarray*} \inf\limits_{x \in \left ( 0, \infty \right ) }g_{\kappa }\left ( x \right )=\lim\limits_{x\to \infty}g_{\kappa}(x)= \left\{ \begin{array}{cl} 0, & \mbox{if}\ \kappa < 1, \\ \frac{1}{2}, & \mbox{if}\ \kappa = 1. \end{array} \right. \end{eqnarray*} $

(ii) If $ \kappa>1 $, then $ \min\limits_{x\in \left ( 0, \infty \right ) }g_{\kappa }\left ( x \right )=g_{\kappa }\left ( x_{0}(\kappa) \right ), $ where $ x_{0}(\kappa) $ is the unique zero point of the function

$ \begin{eqnarray} h_{\kappa}(x):=2\int^{\infty}_{\sqrt{\frac{1}{\kappa}}(\kappa+1)x} \exp\left(-\frac{1}{2}t^2\right)dt-\sqrt{\frac{1}{\kappa}}\frac{\exp(-\frac{{(\kappa+1)}^2} {2\kappa}x^2)}{x}, \ x\in (0, \infty). \end{eqnarray} $ (1.2)

Proof  By taking the derivative of the function $ g_{\kappa}(x) $ defined in (1.1) and in view of (1.2), we have

$ \begin{eqnarray} g_{\kappa }^{'}(x)&=&\sqrt{\frac{1}{\kappa}}(\kappa-1)\Phi'\left(\sqrt{\frac{1}{\kappa}}(\kappa-1)x\right) +4xe^{2x^2} \Phi\left(-\sqrt{\frac{1}{\kappa}}(\kappa+1)x\right)\\ &&-\sqrt{\frac{1}{\kappa}}(\kappa+1)e^{2x^2} \Phi'\left(-\sqrt{\frac{1}{\kappa}}(\kappa+1)x\right)\\ &=&\frac{4xe^{2x^2}}{\sqrt{2\pi}}\int_{\sqrt{\frac{1}{\kappa}}(\kappa+1)x}^{\infty} \exp\left(-\frac{1}{2}t^2\right)dt-2\sqrt{\frac{1}{2\pi\kappa}} \exp\left(-\frac{{(\kappa-1)}^2}{2\kappa}x^2\right)\\ &=&\frac{2xe^{2x^2}}{\sqrt{2\pi}}\left[ 2\int_{\sqrt{\frac{1}{\kappa}}(\kappa+1)x}^{\infty}\exp\left(-\frac{1}{2}t^2\right)dt- \sqrt{\frac{1}{\kappa}}\frac{\exp\left(-\frac{{(\kappa+1)}^2}{2\kappa}x^2\right)}{x}\right]\\ &=&\frac{2xe^{2x^2}}{\sqrt{2\pi}}h_{\kappa}(x). \end{eqnarray} $ (1.3)

By taking the derivative of the function $ h_{\kappa}(x), $ we get that

$ \begin{eqnarray} h_{\kappa }^{'}(x)&=&\sqrt{\frac{1}{\kappa}}\exp\left(-\frac{(\kappa+1)^2}{2\kappa}x^2\right) \left(\frac{1}{\kappa}-\kappa+\frac{1}{x^2}\right)\\ &:=&\sqrt{\frac{1}{\kappa}}\exp\left(-\frac{(\kappa+1)^2}{2\kappa}x^2\right) \varphi_{\kappa}(x), \end{eqnarray} $ (1.4)

where $ \varphi_{\kappa}(x)=\frac{1}{\kappa}-\kappa+\frac{1}{x^2}. $

(i) If $ \kappa \leq 1 $, then $ \varphi_{\kappa}(x)\geq \frac{1}{x^2}>0 $ for any $ x \in \left (0 , \infty \right) $. Thus, we have $ h_{\kappa }^{'}(x)>0, \ \ \forall x\in (0, \infty), $ which implies that the function $ h_{\kappa}(x) $ is strictly increasing on the interval $ (0, \infty) $. We have

$ \begin{eqnarray*} \lim\limits_{x \rightarrow \infty} h_{\kappa }\left ( x \right ) & =& \lim\limits_{x \rightarrow \infty}\left(2\int_{\sqrt{\frac{1}{\kappa}}(\kappa+1)x}^{\infty} \exp\left(-\frac{1}{2}t^2\right)dt- \sqrt{\frac{1}{\kappa}}\frac{\exp(-\frac{{(\kappa+1)}^2}{2\kappa}x^2)}{x}\right)=0. \end{eqnarray*} $

Thus for any $ x\in (0, \infty) $, $ h_{\kappa }(x)< 0 $. Then by (1.3), we get

$ g_{\kappa }^{'}\left ( x \right ) < 0, \ \forall x\in (0, \infty). $

Therefore, $ g_{\kappa }\left ( x \right ) $ is a strictly decreasing function on the interval $ \left (0, \infty \right) $ and thus

$ \begin{eqnarray*} &&\inf\limits_{x\in (0, \infty)}g_{\kappa}(x)=\lim\limits_{x\rightarrow \infty}g_{\kappa}(x)\\ &&=\lim\limits_{x\rightarrow \infty}\Phi\left(\sqrt{\frac{1}{\kappa}}(\kappa-1)x\right)+\lim\limits_{x\rightarrow \infty}e^{2x^2}\frac{1}{\sqrt{2\pi}}\int_{\sqrt{\frac{1}{\kappa}}(\kappa+1)x}^{\infty} \exp\left(-\frac{t^2}{2}\right)dt\\ &&=\lim\limits_{x\rightarrow \infty}\Phi\left(\sqrt{\frac{1}{\kappa}}(\kappa-1)x\right)+\lim\limits_{x\rightarrow \infty}\frac{1}{\sqrt{2\pi}}e^{2x^2}\int_{0}^{\infty} \exp\left(-\frac{{\left(u+\sqrt{\frac{1}{\kappa}}(\kappa+1)x\right)}^2}{2}\right)du\\ &&=\lim\limits_{x\rightarrow \infty}\Phi\left(\sqrt{\frac{1}{\kappa}}(\kappa-1)x\right)+\lim\limits_{x\rightarrow \infty}\frac{1}{\sqrt{2\pi}}\int_{0}^{\infty}\exp\left(-\frac{1}{2}\left(u^2 +\frac{2(\kappa+1)}{\sqrt {\kappa}}xu+\frac{({\kappa-1})^2}{\kappa}x^2\right)\right)du\\ &&=\lim\limits_{x\rightarrow \infty}\Phi\left(\sqrt{\frac{1}{\kappa}}(\kappa-1)x\right)\\ &&=\left \{ \begin{array}{ll} 0, & \mbox{if}\ \kappa < 1, \\ \frac {1}{2}, & \mbox{if}\ \kappa = 1. \end{array} \right. \end{eqnarray*} $

Here the second-to-last equality holds since, by the dominated convergence theorem, the remaining integral tends to $ 0 $ as $ x\rightarrow \infty $.

(ii) If $ \kappa > 1 $, then solving $ h^{'}_{\kappa}(x)=0 $, i.e., $ \varphi_{\kappa}(x)=\frac{1}{\kappa}-\kappa+\frac{1}{x^2}=0 $, gives $ x=\sqrt{\frac{\kappa}{\kappa^2-1}}. $ Then, by (1.4), we know that $ h^{'}_{\kappa}(x)>0 $ for any $ x\in (0, \sqrt{\frac{\kappa}{\kappa^2-1}}) $ and $ h^{'}_{\kappa}(x)<0 $ for any $ x\in (\sqrt{\frac{\kappa}{\kappa^2-1}}, \infty) $. It follows that the function $ h_{\kappa}(x) $ is strictly increasing on $ (0, \sqrt{\frac{\kappa}{\kappa^2-1}}) $ and strictly decreasing on $ (\sqrt{\frac{\kappa}{\kappa^2-1}}, \infty) $. Furthermore, by the fact that $ \lim_{x \rightarrow 0+}h_{\kappa}(x)=-\infty <0 $ and $ \lim_{x \rightarrow \infty}h_{\kappa}(x) = 0 $, we know that the continuous function $ h_{\kappa}(x) $ has a unique zero point $ x_{0}(\kappa)\in (0, \sqrt{\frac{\kappa}{\kappa^2-1}}) $. It follows that $ h_{\kappa}(x)<0 $ for any $ x\in (0, x_0(\kappa)) $ and $ h_{\kappa}(x)>0 $ for any $ x\in (x_0(\kappa), \infty) $.

Hence, by (1.3), we get that $ g^{'}_{\kappa}(x)<0 $ on $ (0, x_0(\kappa)) $ and $ g^{'}_{\kappa}(x)>0 $ on $ (x_0(\kappa), \infty) $. It follows that

$ \min\limits_{x\in (0, \infty)}g_{\kappa}(x)=g_{\kappa}(x_0(\kappa)). $

The proof is complete.
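
Theorem 2.1 (ii) characterizes the minimum only implicitly, through the zero $ x_{0}(\kappa) $ of $ h_{\kappa} $. Since $ h_{\kappa} $ changes sign exactly once on $ (0, \sqrt{\kappa/(\kappa^2-1)}) $, the zero is easy to locate by bracketing root-finding; the following sketch (our illustration, assuming SciPy is available) computes $ x_{0}(\kappa) $ and the corresponding minimum of $ g_{\kappa} $ for a few values of $ \kappa>1 $.

```python
# Locate x_0(kappa), the unique zero of h_kappa in (0, sqrt(kappa/(kappa^2-1))),
# and evaluate the minimum g_kappa(x_0(kappa)) (a sketch, assuming SciPy).
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def h(x, k):
    # h_kappa(x) with b = (kappa+1)/sqrt(kappa); note that
    # int_{b x}^infty e^{-t^2/2} dt = sqrt(2*pi) * Phi(-b*x).
    b = (k + 1) / np.sqrt(k)
    return 2 * np.sqrt(2 * np.pi) * norm.cdf(-b * x) - np.exp(-b**2 * x**2 / 2) / (np.sqrt(k) * x)

def g(x, k):
    a, b = (k - 1) / np.sqrt(k), (k + 1) / np.sqrt(k)
    return norm.cdf(a * x) + np.exp(2 * x**2) * norm.cdf(-b * x)

for k in [1.5, 2.0, 5.0]:
    x_star = np.sqrt(k / (k**2 - 1))  # h_kappa increases on (0, x*), decreases after
    x0 = brentq(lambda x: h(x, k), 1e-8, x_star)
    print(k, x0, g(x0, k))  # the minimum exceeds 1/2 (cf. Remark 2.2 below)
```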

Remark 2.2  (i) For any $ \kappa>1, \mu>0, \lambda>0 $, it is easy to see that

$ P(X_{\mu, \lambda}\le \kappa E[X_{\mu, \lambda}])>P(X_{\mu, \lambda}\le E[X_{\mu, \lambda}])>\frac{1}{2}, $

where the second inequality follows from the proof of Theorem 2.1. Then for any $ \kappa>1 $, we have

$ \begin{eqnarray*} \min\limits_{x\in \left ( 0, \infty \right ) }g_{\kappa }\left ( x \right )=g_{\kappa }\left ( x_{0}(\kappa) \right )>\frac{1}{2}. \end{eqnarray*} $

(ii) The above analysis shows that there is an interesting phase transition phenomenon in the infimum value problem for the inverse Gaussian distribution. The critical point is $ \kappa=1 $.

3 Log-normal, Gumbel, and Logistic Distributions
3.1 Log-normal Distribution

Let $ X_{\mu, \sigma} $ be a log-normal random variable with parameters $ \mu $ and $ \sigma $ $ (\mu \in \mathbb{R}, \sigma>0). $ By [10, Chapter 22], we know that the density function of $ X_{\mu, \sigma} $ is

$ f_{\mu , \sigma}(x)=\frac {1}{\sqrt {2\pi }\sigma x}\exp\left(-\frac {(\ln x-\mu)^2}{2{\sigma}^2}\right), \ \ x>0, $

and the expectation is $ E[X_{\mu, \sigma}]=\exp(\mu +\frac{{\sigma}^2}{2}) $. Then, for any given real number $ \kappa>0, $ by [10, (22.1.2)], we have

$ \begin{eqnarray*} P(X_{\mu, \sigma}\leq \kappa E[X_{\mu, \sigma}])=\Phi\left(\frac{\ln\left(\kappa e^{\mu+\frac{\sigma^2}{2}}\right)-\mu}{\sigma}\right) =\Phi\left(\frac{\ln\kappa}{\sigma}+\frac{\sigma}{2}\right), \end{eqnarray*} $

which shows that $ P(X_{\mu, \sigma}\leq \kappa E[X_{\mu, \sigma}]) $ is independent of $ \mu. $ Define a function

$ g_\kappa(\sigma):=\Phi\left(\frac{\ln\kappa}{\sigma}+\frac{\sigma}{2}\right), \ \sigma>0. $

The main result of this subsection is

Proposition 3.1  (i) If $ \kappa \leq 1 $, then

$ \begin{eqnarray} \inf\limits_{\sigma \in (0, \infty)}g_{\kappa }( \sigma)=\lim\limits_{\sigma\to 0+}g_{\kappa}(\sigma)= \left\{ \begin{array}{cl} 0, & \mbox{if}\ \kappa < 1, \\ \frac{1}{2}, & \mbox{if}\ \kappa = 1; \end{array} \right. \end{eqnarray} $ (1.5)

(ii) If $ \kappa>1 $, then

$ \begin{eqnarray} \min\limits_{\sigma \in \left ( 0, \infty \right ) }g_{\kappa }\left ( \sigma \right )=g_{\kappa }\left ( \sqrt{2 \ln \kappa} \right )=\Phi(\sqrt{2\ln \kappa })>\frac{1}{2}. \end{eqnarray} $ (1.6)

Proof  Define a function

$ \begin{gather} h_{\kappa}(\sigma):=\frac{\ln \kappa}{\sigma}+\frac{\sigma}{2}, \ \sigma>0. \end{gather} $ (1.7)

Then $ g_{\kappa}(\sigma)=\Phi(h_{\kappa}(\sigma)). $ Therefore, in order to prove this proposition, it is enough to investigate the behavior of $ h_{\kappa}(\sigma) $.

If $ \kappa\le 1, $ then $ \ln \kappa\le 0. $ Thus, by (1.7), we have $ h^{'}_{\kappa}(\sigma)=-\frac{\ln \kappa}{\sigma^2}+\frac{1}{2}>0 $, which implies that $ h_{\kappa}(\sigma) $ is a strictly increasing function of $ \sigma $. Thus,

$ \begin{eqnarray*} \inf\limits_{\sigma \in \left ( 0, \infty \right ) }h_{\kappa }\left ( \sigma \right )=\lim\limits_{\sigma\to 0+}h_{\kappa}(\sigma)= \left\{ \begin{array}{cl} -\infty, & \mbox{if}\ \kappa < 1, \\ 0, & \mbox{if} \ \kappa = 1. \end{array} \right. \end{eqnarray*} $

It follows that (1.5) holds.

If $ \kappa>1, $ then $ \ln \kappa>0. $ If $ h^{'}_{\kappa}(\sigma)=-\frac{\ln \kappa}{\sigma^2}+\frac{1}{2}=0 $, then $ \sigma=\sqrt{ 2\ln \kappa}. $ It is easy to check that the function $ h_{\kappa}(\sigma) $ is strictly decreasing on $ (0, \sqrt{ 2\ln \kappa}) $ and strictly increasing on $ (\sqrt{ 2\ln \kappa}, \infty) $. Thus,

$ \min\limits_{\sigma\in (0, \infty)}h_{\kappa}(\sigma)=h_{\kappa}(\sqrt{ 2\ln \kappa})=\sqrt{ 2\ln \kappa}. $

It follows that (1.6) holds. The proof is complete.
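
The closed-form minimizer in Proposition 3.1 (ii) is easy to confirm numerically; the following sketch (our illustration, assuming SciPy is available) minimizes $ g_{\kappa}(\sigma) $ over $ \sigma $ and compares the result with $ \sqrt{2\ln \kappa} $ and $ \Phi(\sqrt{2\ln \kappa}) $.

```python
# Minimize g_kappa(sigma) = Phi(ln(kappa)/sigma + sigma/2) numerically for some
# kappa > 1 and compare with the closed form (a sketch, assuming SciPy).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

kappa = 2.0
g = lambda s: norm.cdf(np.log(kappa) / s + s / 2)
res = minimize_scalar(g, bounds=(1e-6, 10), method="bounded")
print(res.x, np.sqrt(2 * np.log(kappa)))              # numerical vs. closed-form minimizer
print(res.fun, norm.cdf(np.sqrt(2 * np.log(kappa))))  # minimum value, > 1/2
```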

3.2 Gumbel Distribution

Let $ X_{\mu, \beta} $ be a Gumbel random variable with parameters $ \mu $ and $ \beta $ $ (\mu\in \mathbb{R}, \beta>0) $. By [11], the density function of $ X_{\mu, \beta} $ is given by

$ f_{\mu, \beta}(x)=\frac{1}{\beta}e^{-z-e^{-z}}, \ z=\frac{x-\mu}{\beta}, \ x\in \mathbb{R}, $

and its expectation is $ E[X_{\mu, \beta}]=\mu +\beta \gamma, $ where $ \gamma $ is Euler's constant. Then, for any given real number $ \kappa>0, $ we have

$ P(X_{\mu, \beta}\leq \kappa E[X_{\mu, \beta}])=e^{-e^{-\frac{\kappa{(\mu+\beta \gamma)}-\mu}{\beta}}}=e^{-e^{-\frac{(\kappa-1)\mu}{\beta}-\kappa \gamma}}. $

Let $ x:=\frac{\mu}{\beta} $. Then

$ P(X_{\mu, \beta}\leq \kappa E[X_{\mu, \beta}])=e^{-e^{-[(\kappa-1)x+\kappa \gamma]}}. $

Define a function

$ \begin{eqnarray} g_{\kappa}(x):=e^{-e^{-[(\kappa-1)x+\kappa \gamma]}}, \ \ x\in \mathbb{R}. \end{eqnarray} $ (1.8)

The main result of this subsection is

Proposition 3.2  (i) If $ \kappa <1 $, then $ \inf\limits_{x \in \mathbb{R} }g_{\kappa }\left (x \right )=\lim\limits_{x\to \infty}g_{\kappa}(x)=0; $

(ii) If $ \kappa> 1 $, then $ \inf\limits_{x \in \mathbb{R} }g_{\kappa }\left (x \right )=\lim\limits_{x\to -\infty}g_{\kappa}(x)=0; $

(iii) If $ \kappa=1, $ then $ g_{\kappa}(x)\equiv e^{-e^{- \gamma}}>\frac{1}{2}. $

Proof  Define a function

$ h_{\kappa}(x):=(\kappa-1)x+\kappa \gamma, \ x\in \mathbb{R}. $

Then

$ g_{\kappa}(x)=e^{-e^{-h_{\kappa}(x)}}, \ x\in \mathbb{R}. $

By taking the derivative of $ g_{\kappa}(x) $, we have

$ \begin{gather*} \label{e1} g^{'}_{\kappa}(x)=(\kappa-1)e^{-h_{\kappa}(x)}e^{-e^{-h_{\kappa}(x)}}. \end{gather*} $

If $ \kappa <1, $ then $ g^{'}_{\kappa}(x)< 0, $ which implies that $ g_{\kappa}(x) $ is a strictly decreasing function of $ x. $ Then

$ \inf\limits_{x \in \mathbb{R} }g_{\kappa }\left (x \right )=\lim\limits_{x\to \infty}g_{\kappa}(x)=0. $

If $ \kappa> 1, $ then $ g^{'}_{\kappa}(x)> 0. $ Thus, $ g_{\kappa}(x) $ is a strictly increasing function of $ x $ and

$ \inf\limits_{x\in \mathbb{R}}g_{\kappa}(x)=\lim\limits_{x \rightarrow -\infty}g_{\kappa}(x)=0. $

If $ \kappa=1, $ then by (1.8), we get that $ g_{\kappa}(x)\equiv e^{-e^{- \gamma}}>\frac{1}{2} $. The proof is complete.
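
Proposition 3.2 can be checked directly from (1.8); the following sketch (our illustration, in pure Python) evaluates $ g_{\kappa} $ in the three regimes of $ \kappa $.

```python
# Evaluate g_kappa(x) = exp(-exp(-[(kappa-1)x + kappa*gamma])) in the three
# regimes of Proposition 3.2 (a sketch in pure Python).
import math

GAMMA = 0.5772156649015329  # Euler's constant

def g(x, kappa):
    return math.exp(-math.exp(-((kappa - 1) * x + kappa * GAMMA)))

print(g(0.0, 1.0), math.exp(-math.exp(-GAMMA)))   # kappa = 1: constant ~ 0.5704 > 1/2
print([round(g(x, 0.5), 4) for x in (-5, 0, 5)])  # kappa < 1: decreasing, -> 0 as x -> +inf
print([round(g(x, 2.0), 4) for x in (-5, 0, 5)])  # kappa > 1: increasing, -> 0 as x -> -inf
```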

3.3 Logistic Distribution

Let $ X_{\mu, \beta} $ be a logistic random variable with parameters $ \mu $ and $ \beta $ $ (\mu \in \mathbb{R}, \beta>0) $. By [10, Chapter 21], we know that the distribution function of $ X_{\mu, \beta} $ is given by

$ P(X_{\mu, \beta}\leq x)=\frac{1}{1+e^{-\frac{x-\mu}{\beta}}}, \ x \in \mathbb{R}, $

and the expectation of $ X_{\mu, \beta} $ is $ E[X_{\mu, \beta}]=\mu $. Then, for any given real number $ \kappa>0, $ we have

$ P(X_{\mu, \beta}\leq \kappa E[X_{\mu, \beta}])= \frac{1}{1+e^{-\frac{\kappa \mu-\mu}{\beta}}}. $

Let $ y:=\frac{\mu}{\beta} $. Then

$ P(X_{\mu, \beta}\leq \kappa E[X_{\mu, \beta}])= \frac{1}{1+e^{-(\kappa-1)y}}. $

Define a function

$ \begin{gather} g_{\kappa}(y):=\frac{1}{1+e^{-(\kappa-1)y}}, \ \ y\in \mathbb{R}. \end{gather} $ (1.9)

The main result of this subsection is

Proposition 3.3  (i) If $ \kappa < 1 $, then $ \inf\limits_{y \in \mathbb{R} }g_{\kappa }\left (y \right )=\lim\limits_{y\to \infty}g_{\kappa}(y)=0; $

(ii) If $ \kappa> 1 $, then $ \inf\limits_{y \in \mathbb{R} }g_{\kappa }\left ( y \right )=\lim\limits_{y\to -\infty}g_{\kappa}(y)=0; $

(iii) If $ \kappa=1, $ then $ g_{\kappa}(y)\equiv \frac{1}{2}. $

Proof  By the definition of $ g_{\kappa}(y) $ in (1.9), we have

$ g^{'}_{\kappa}(y)=\frac{(\kappa-1)e^{-(\kappa-1)y}}{(1+e^{-(\kappa-1)y})^2}, \ \ y\in \mathbb{R}. $

If $ \kappa<1, $ then $ g^{'}_{\kappa}(y)<0 $, which implies that the function $ g_{\kappa}(y) $ is strictly decreasing. Then

$ \inf\limits_{y\in \mathbb{R}} g_{\kappa}(y)=\lim\limits_{y\rightarrow \infty}g_{\kappa}(y)=0. $

If $ \kappa> 1, $ then $ g^{'}_{\kappa}(y)> 0. $ Thus, $ g_{\kappa}(y) $ is a strictly increasing function of $ y $ and

$ \inf\limits_{y\in \mathbb{R}} g_{\kappa}(y)=\lim\limits_{y\rightarrow -\infty}g_{\kappa}(y)=0. $

If $ \kappa=1, $ then $ g_{\kappa}(y)\equiv\frac{1}{2} $ by (1.9). The proof is complete.
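
As with the Gumbel case, Proposition 3.3 can be verified directly from (1.9); the following sketch (our illustration, in pure Python) evaluates $ g_{\kappa}(y) $ in the three regimes of $ \kappa $.

```python
# Evaluate g_kappa(y) = 1/(1 + exp(-(kappa-1)y)) in the three regimes of
# Proposition 3.3 (a sketch in pure Python).
import math

g = lambda y, kappa: 1.0 / (1.0 + math.exp(-(kappa - 1) * y))

print([round(g(y, 0.5), 4) for y in (-10, 0, 10)])  # kappa < 1: decreasing, -> 0 as y -> +inf
print([round(g(y, 2.0), 4) for y in (-10, 0, 10)])  # kappa > 1: increasing, -> 0 as y -> -inf
print(g(3.7, 1.0))                                  # kappa = 1: identically 1/2
```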

References
[1] Janson S. On the probability that a binomial variable is at most its expectation[J]. Statist. Probab. Lett., 2021, 171: 109020. DOI:10.1016/j.spl.2020.109020
[2] Doerr B. An elementary analysis of the probability that a binomial random variable exceeds its expectation[J]. Statist. Probab. Lett., 2018, 139: 67-74. DOI:10.1016/j.spl.2018.03.016
[3] Greenberg S, Mohri M. Tight lower bound on the probability of a binomial exceeding its expectation[J]. Statist. Probab. Lett., 2014, 86: 91-98. DOI:10.1016/j.spl.2013.12.009
[4] Barabesi L, Pratelli L, Rigo P. On the Chvátal-Janson conjecture[J]. Statist. Probab. Lett., 2023, 194: 109744. DOI:10.1016/j.spl.2022.109744
[5] Sun Ping. Strictly unimodality of the probability that the binomial distribution is more than its expectation[J]. Discrete Appl. Math., 2021, 301: 1-5. DOI:10.1016/j.dam.2021.05.013
[6] Li Fubo, Xu Kun, Hu Zechun. A study on the Poisson, geometric and Pascal distributions motivated by Chvátal's conjecture[J]. Statist. Probab. Lett., 2023, 200: 109871. DOI:10.1016/j.spl.2023.109871
[7] Sun Ping, Hu Zechun, Sun Wei. The infimum values of two probability functions for the Gamma distribution[J]. J. Inequal. Appl., 2024, 2024(1): 5. DOI:10.1186/s13660-024-03081-w
[8] Li Cheng, Hu Zechun, Zhou Qianqian. A study on the Weibull and Pareto distributions motivated by Chvátal's theorem[J]. J. of Math. (PRC), 2024, 44(3): 195-202.
[9] Guo Zhengyan, Tao Zeyu, Hu Zechun. A study on the negative binomial distribution motivated by Chvátal's theorem[J]. Statist. Probab. Lett., 2024, 207: 110037. DOI:10.1016/j.spl.2024.110037
[10] Krishnamoorthy K. Handbook of statistical distributions with applications[M]. New York: CRC Press, 2020.
[11] Steutel F W. Some recent results in infinite divisibility[J]. Stoch. Proc. Appl., 1973, 1(2): 125-143.