数学杂志 (J. Math.) 2019, Vol. 39, Issue 5: 705-712
PROBABILITY INEQUALITIES AND ROSENTHAL INEQUALITIES FOR THE SEQUENCE OF MARTINGALE DIFFERENCES
XU Ming-zhou, CHENG Kun, DING Yun-zheng, ZHOU Yong-zheng    
School of Information Engineering, Jingdezhen Ceramic Institute, Jingdezhen 333403, China
Abstract: In this paper, we discuss some inequalities for the sequence of martingale differences. By using properties of conditional expectation and elementary inequalities, we obtain the basic inequalities of Bernstein, Kolmogorov and Hoeffding for the sequence of martingale differences, which extend the corresponding results for bounded random vectors. Moreover, we obtain the classical Kolmogorov and Rosenthal inequalities for maximum partial sums of martingale differences, which complement the corresponding results for independent and negatively dependent random variables under sub-linear expectations.
Keywords: martingale differences     Bernstein inequality     Kolmogorov inequality     Hoeffding inequality     Rosenthal inequality    
1 Introduction and Main Results

To prove limit theorems in probability theory, such as laws of large numbers and the central limit theorem, one needs suitable probability inequalities, which have attracted the attention of many authors. Ahmad and Amezziane [1] proved extensions of the basic inequalities of Bernstein, Kolmogorov and Hoeffding to sums of bounded random vectors. Li [2] established a Bernstein inequality for the sequence of martingale differences. Bercu and Touati [3] proved several exponential inequalities for self-normalized martingales by introducing a new notion of random variables heavy on the left or on the right. Fan et al. [4] obtained exponential inequalities of Bennett, Freedman, de la Peña, Pinelis and van de Geer type. Fan et al. [5] proved martingale inequalities of Dzhaparidze and van Zanten type. Zhang [13] obtained Rosenthal inequalities for independent and negatively dependent random variables under sub-linear expectations. Xu and Miao [6] proved almost sure convergence of weighted sums for martingale differences. Yu [7] obtained complete convergence of weighted sums for martingale differences. Wang and Hu [8] established complete convergence and complete moment convergence for martingale difference sequences. It is natural to ask whether the basic inequalities of Bernstein, Kolmogorov and Hoeffding, as well as Rosenthal inequalities, hold for the sequence of martingale differences. Here we give an affirmative answer to this question. We state the main results in this section and present the proofs in Section 2.

Let $ \{\xi_i,{\mathcal F}_i, i\ge 1\} $ be a sequence of martingale differences on the probability space $ (\Omega, {\mathcal F}, P) $ such that $ |\xi_i|\le B<\infty $, $ i = 1,2,\cdots $, where $ B $ is nonrandom. Set $ \|\xi\| = \inf\{c>0: P(|\xi|\le c) = 1\} $, the essential supremum of the random variable $ \xi $. Denote $ \bar{\sigma}_i^2 = \|E(\xi_i^2|{\mathcal F}_{i-1})\| $, $ i\ge 1 $, where $ {\mathcal F}_0 = \{\emptyset, \Omega\} $. Let $ a_1, \cdots, a_n $ be positive real numbers and set $ m = \max\limits_{1\le i\le n}a_i $ and $ A_n^2 = \sum\limits_{i = 1}^{n}a_i^2\bar{\sigma}_i^2 $. Write $ S_0 = 0 $, $ S_n = \sum\limits_{i = 1}^{n}a_i\xi_i =: \sum\limits_{i = 1}^{n}X_i $. The following are the main results.

Theorem 1.1 (i) For any $ x>0 $, $ P(S_n\ge x)\le \exp(-\frac{x^2}{2(A_n^2+xBm/3)}). $

(ii) For any $ x>0 $, $ P(S_n\ge x)\le \exp(-\frac{x^2}{2A_n^2}(1-xmB^3(\sum\limits_{i = 1}^{n}a_i^2)/A_n^4)). $

(iii) For any $ x>0 $, $ P(S_n\ge x)\le \exp(\frac{x}{Bm})(1+\frac{xBm}{A_n^2})^{-(\frac{x}{Bm}+(\frac{A_n}{Bm})^2)}. $

Remark 1.1 Inequalities (i), (ii) and (iii) in Theorem 1.1 are called the Bernstein, Kolmogorov and Hoeffding inequalities, respectively. As pointed out in Fan et al. [5], the Bernstein inequality (i) of Theorem 1.1 is implied by the Hoeffding inequality (iii).
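The following Python sketch is purely illustrative and not part of the results: it evaluates the three bounds of Theorem 1.1 for one simulated example and compares them with an empirical tail probability. The choice of i.i.d. Uniform$ (-B,B) $ innovations (for which $ \bar{\sigma}_i^2 = B^2/3 $) and the weights $ a_i\equiv 1 $ are assumptions made only for this demonstration.

```python
import numpy as np

# Illustrative comparison of the bounds (i)-(iii) in Theorem 1.1 with an
# empirical tail probability.  Uniform(-B, B) innovations and a_i = 1 are
# assumptions made only for this demonstration.
rng = np.random.default_rng(1)
n, B, reps, x = 50, 1.0, 20000, 5.0
a = np.ones(n)
m = a.max()                                   # m = max_i a_i
A2 = np.sum(a**2) * (B**2 / 3.0)              # A_n^2 with bar{sigma}_i^2 = B^2/3

bernstein = np.exp(-x**2 / (2.0 * (A2 + x * B * m / 3.0)))
kolmogorov = np.exp(-x**2 / (2.0 * A2) * (1.0 - x * m * B**3 * np.sum(a**2) / A2**2))
hoeffding = np.exp(x / (B * m)) * (1.0 + x * B * m / A2) ** (-(x / (B * m) + A2 / (B * m) ** 2))

S = (a * rng.uniform(-B, B, size=(reps, n))).sum(axis=1)   # samples of S_n
print(f"empirical P(S_n >= x)  = {np.mean(S >= x):.4f}")
print(f"(i)   Bernstein bound  = {bernstein:.4f}")
print(f"(ii)  Kolmogorov bound = {kolmogorov:.4f}")
print(f"(iii) Hoeffding bound  = {hoeffding:.4f}")
```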

Theorem 1.2 (Kolmogorov inequality) $ E[|\max\limits_{0\le k\le n}(S_n-S_k)|^2]\le \sum\limits_{k = 1}^{n}E[X_k^2]. $ In particular, $ E[(S_n^{+})^2]\le \sum\limits_{k = 1}^{n}E[X_k^2]. $

Theorem 1.3 (Rosenthal inequality) (a)

$ \begin{equation} E[|\max\limits_{0\le k\le n}(S_n-S_k)|^p]\le 2^{2-p}\sum\limits_{k = 1}^{n}E[|X_k|^p] \mbox{ for $1\le p\le 2$} \end{equation} $ (1.1)

and

$ \begin{equation} E[|\max\limits_{0\le k\le n}(S_n-S_k)|^p]\le C_p n^{p/2-1}\sum\limits_{k = 1}^{n}E[|X_k|^p] \mbox{ for $ p\ge 2$}. \end{equation} $ (1.2)

In particular,

$ \begin{eqnarray} E[(S_n^{+})^p]\le\begin{cases} 2^{2-p}\sum\limits_{k = 1}^{n}E[|X_k|^p]& \text{ for $1\le p\le 2$,}\\ C_p n^{p/2-1}\sum\limits_{k = 1}^{n}E[|X_k|^p] & \text{ for $ p\ge 2$.} \end{cases} \end{eqnarray} $ (1.3)

(b)

$ \begin{equation} E[|\max\limits_{0\le k\le n}(S_n-S_k)|^p]\le C_p\{\sum\limits_{k = 1}^{n}E[|X_k|^p]+A_n^p\} \mbox{ for $ p\ge 2$}. \end{equation} $ (1.4)

In particular,

$ E[(S_n^{+})^p]\le C_p\{\sum\limits_{k = 1}^{n}E[|X_k|^p]+A_n^p\} \mbox{ for $ p\ge 2$}, $

where $ C_p $ is a positive constant depending only on $ p $.
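Since the constant $ C_p $ is not explicit, the bounds in Theorem 1.3 cannot be checked numerically in absolute terms. The following Python sketch is purely illustrative: under the assumed example of $ a_i\equiv 1 $ and i.i.d. Uniform$ (-1,1) $ innovations, it tracks the ratio of a Monte Carlo estimate of $ E[(S_n^{+})^p] $ to $ n^{p/2-1}\sum_{k=1}^{n}E[|X_k|^p] $ for $ p = 4 $, which, by (1.3), should remain bounded as $ n $ grows.

```python
import numpy as np

# Illustrative Monte Carlo check of the scaling in (1.3) for p = 4: the ratio
# E[(S_n^+)^p] / (n^{p/2-1} sum_k E|X_k|^p) should stay bounded in n.
# Assumptions (examples only): a_i = 1 and i.i.d. Uniform(-1, 1) innovations.
rng = np.random.default_rng(2)
p, reps = 4, 10000
for n in (10, 50, 200, 800):
    X = rng.uniform(-1.0, 1.0, size=(reps, n))             # X_k = a_k * xi_k with a_k = 1
    S_n = X.sum(axis=1)
    lhs = np.mean(np.maximum(S_n, 0.0) ** p)                # estimate of E[(S_n^+)^p]
    rhs = n ** (p / 2 - 1) * n * np.mean(np.abs(X) ** p)    # n^{p/2-1} * sum_k E|X_k|^p
    print(f"n = {n:4d}: ratio = {lhs / rhs:.4f}")
```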

2 The Proof of Main Results

Proof of Theorem 1.1 For any $ \alpha>0 $, by Chebyshev's inequality, we obtain

$ \begin{equation} P\left(S_n\ge x\right)\le \frac{E\left(\exp(\alpha S_n)\right)}{\exp(\alpha x)}. \end{equation} $ (2.1)

As in the proofs of Ahmad and Amezziane [1] and Li [2], to prove the theorem we first obtain an upper bound for $ E(\exp(\alpha S_n)) $ for all $ \alpha>0 $ and then choose $ \alpha $ so that this upper bound is minimized. The ideas originally come from Bernstein, Kolmogorov, Hoeffding [9] and Bennett [10]; here we also draw on Bentkus [11], Ahmad and Amezziane [1], Li [2], and Gao and Wu [12].
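The scheme just described can also be mimicked numerically. The sketch below is purely illustrative: it estimates $ E(\exp(\alpha S_n)) $ by Monte Carlo for a toy example and minimizes $ e^{-\alpha x}E(\exp(\alpha S_n)) $ over a grid of $ \alpha $; the Uniform$ (-B,B) $ innovations and $ a_i\equiv 1 $ are assumptions made only for this demonstration.

```python
import numpy as np

# Numerical illustration of the scheme behind (2.1): estimate E[exp(alpha*S_n)]
# by Monte Carlo and minimize exp(-alpha*x) * E[exp(alpha*S_n)] over a grid of
# alpha.  Uniform(-B, B) innovations and a_i = 1 are illustrative assumptions.
rng = np.random.default_rng(3)
n, B, reps, x = 50, 1.0, 50000, 5.0
S = rng.uniform(-B, B, size=(reps, n)).sum(axis=1)          # samples of S_n
alphas = np.linspace(0.01, 1.0, 200)
bounds = [np.exp(-alpha * x) * np.mean(np.exp(alpha * S)) for alpha in alphas]
best = int(np.argmin(bounds))
print(f"empirical P(S_n >= x)    = {np.mean(S >= x):.4f}")
print(f"minimized Chernoff bound = {bounds[best]:.4f} at alpha = {alphas[best]:.3f}")
```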

(i) For any real $ z $, the power expansion of $ \exp(z) $ is

$ \begin{equation} \exp(z) = 1+z+\sum\limits_{j = 2}^{\infty}\frac{z^j}{j!}. \end{equation} $ (2.2)

Note that $ E(\alpha a_i\xi_i|{\mathcal F}_{i-1}) = 0 $ and, since $ |\xi_i|\le B $ and $ \frac{k!}{2}\left(\frac13\right)^{k-2}\ge 1 $ for $ k\ge 2 $, $ E(|\alpha a_i\xi_i|^k|{\mathcal F}_{i-1})\le \frac{k!}{2!}E(\alpha^2a_i^2\xi_i^2|{\mathcal F}_{i-1})\left(\frac{\alpha a_i B}{3}\right)^{k-2} $, $ k\ge 2 $. For $ \frac{\alpha a_i B}{3}<1 $, we have

$ \begin{equation} E(\exp(\alpha a_i\xi_i)|{\mathcal F}_{i-1})\le 1+\frac{\alpha^2a_i^2\bar{\sigma}_i^2}{2}\sum\limits_{j = 0}^{\infty}\left(\frac{\alpha a_i B}{3}\right)^j\le \exp\left(\frac{\alpha^2a_i^2\bar{\sigma}_i^2}{2}\left(1-\frac{\alpha a_i B}{3}\right)^{-1}\right). \end{equation} $ (2.3)

By the properties of conditional expectations, for $ \frac{\alpha m B}{3}<1 $, we get

$ \begin{eqnarray} E\left(\exp\left(\alpha\sum\limits_{i = 1}^{n}a_i\xi_i\right)\right) = E\left(E\left(\exp\left(\alpha\sum\limits_{i = 1}^{n}a_i\xi_i\right)|{\mathcal F}_{n-1}\right)\right)\\ \le E\left(\exp\left(\alpha\sum\limits_{i = 1}^{n-1}a_i\xi_i\right)\exp\left(\frac{\alpha^2a_n^2\bar{\sigma}_n^2}{2}\left(1-\frac{\alpha a_n B}{3}\right)^{-1}\right)\right)\\ \le \cdots\le \exp\left(\sum\limits_{i = 1}^{n}\frac{\alpha^2a_i^2\bar{\sigma}_i^2}{2}\left(1-\frac{\alpha a_i B}{3}\right)^{-1}\right). \end{eqnarray} $ (2.4)

Substituting (2.4) to (2.1), we obtain

$ \begin{equation} P(S_n\ge x)\le \exp\left(-\alpha x+\frac{\alpha^2}{2}\sum\limits_{i = 1}^{n}a_i^2\bar{\sigma}_i^2\left(1-\frac{\alpha a_i B}{3}\right)^{-1}\right). \end{equation} $ (2.5)

We choose

$ \alpha = \frac{x}{\frac{mBx}{3}+\sum\limits_{i = 1}^{n}a_i^2\bar{\sigma}_i^2} = \frac{x}{\frac{mBx}{3}+A_n^2}, $

the above inequality becomes (the third inequality below uses $ a_i\le m $, so that $ \frac{mBx}{3}+A_n^2-\frac{a_iBx}{3}\ge A_n^2 $)

$ \begin{aligned} P(S_n\ge x)&\le \exp\left(-\frac{x^2}{\frac{mBx}{3}+A_n^2}+\frac{x^2}{2\left(\frac{mBx}{3}+A_n^2\right)^2}\sum\limits_{i = 1}^{n}a_i^2\bar{\sigma}_i^2\left(1-\frac{ xa_i B/3}{\frac{mBx}{3}+A_n^2}\right)^{-1}\right)\\ &\le\exp\left(-\frac{x^2}{\frac{mBx}{3}+A_n^2}+\frac{x^2}{2\left(\frac{mBx}{3}+A_n^2\right)}\sum\limits_{i = 1}^{n}\frac{a_i^2\bar{\sigma}_i^2}{\frac{mBx}{3}+A_n^2-a_iBx/3}\right)\\ &\le \exp\left(-\frac{x^2}{\frac{mBx}{3}+A_n^2}+\frac{x^2}{2\left(\frac{mBx}{3}+A_n^2\right)}\right)\\ & = \exp\left(-\frac{x^2}{2\left(\frac{mBx}{3}+A_n^2\right)}\right). \end{aligned} $

(ii) For any real $ t $, $ |\log(1+t+\frac{t^2}{2})-t|\le \frac{|t|^3}{2} $; thus, with probability one,

$ \begin{aligned} \exp(\alpha a_i \xi_i) = (1+\alpha a_i \xi_i+\frac12\alpha^2 a_i^2 \xi_i^2 )\exp(R_i), \end{aligned} $

where

$ \begin{aligned} |R_i|\le \frac12 |\alpha a_i \xi_i|^3\le \frac12 |\alpha a_i B|^3. \end{aligned} $

Therefore

$ \begin{aligned} E(\exp(\alpha a_i \xi_i)|{\mathcal F}_{i-1})\le [1+\frac12 \alpha^2a_i^2\bar{\sigma}_i^2]\exp(\frac12 |\alpha a_i B|^3). \end{aligned} $

Hence, by the properties of conditional expectations and the bound $ \sum\limits_{i = 1}^{n}a_i^3\le m\sum\limits_{i = 1}^{n}a_i^2 $, we have

$ \begin{aligned} E(\exp(\alpha S_n))& = E\left(E\left(\exp\left(\alpha\sum\limits_{i = 1}^{n}a_i\xi_i\right)|{\mathcal F}_{n-1}\right)\right)\\ &\le E\left(\exp\left(\alpha\sum\limits_{i = 1}^{n-1}a_i\xi_i\right)\right)[1+\frac12 \alpha^2a_n^2\bar{\sigma}_n^2]\exp(\frac12 |\alpha a_n B|^3)\\ &\le\cdots \le \exp\left(\frac12 |\alpha B|^3 m \sum\limits_{i = 1}^{n}a_i^2\right)\prod\limits_{i = 1}^{n}[1+\frac12 \alpha^2a_i^2\bar{\sigma}_i^2]. \end{aligned} $

Because $ 1+t\le \exp(t) $ for $ t\ge 0 $, the right-hand side above is bounded by

$ \begin{eqnarray} \exp\left(\sum\limits_{i = 1}^{n}\frac12 \alpha^2a_i^2\bar{\sigma}_i^2\right)\exp\left(\frac12 mB^3\alpha^3\sum\limits_{i = 1}^{n}a_i^2\right). \end{eqnarray} $ (2.6)

Substituting (2.6) into (2.1), we obtain

$ \begin{aligned} P(S_n\ge x)\le \exp\left(\frac12\alpha^2A_n^2+\frac12 mB^3\alpha^3\sum\limits_{i = 1}^{n}a_i^2-\alpha x\right). \end{aligned} $

We choose $ \alpha = \frac{x}{A_n^2} $. Thus the last inequality becomes

$ \begin{aligned} P(S_n\ge x)\le \exp\left(-\frac{x^2}{2A_n^2}+\frac{x^3mB^3\sum\limits_{i = 1}^{n}a_i^2}{2A_n^6}\right) = \exp\left(-\frac{x^2}{2A_n^2}\left(1-\frac{xmB^3\sum\limits_{i = 1}^{n}a_i^2}{A_n^4}\right)\right). \end{aligned} $

(iii) Note that, for $ z\neq 0 $, $ \exp(z)\le 1+z+cz^2 $ if and only if $ c\ge (\exp(z)-1-z)/z^2 $; hence we obtain

$ \begin{aligned} E(\exp(\alpha a_i\xi_i)|{\mathcal F}_{i-1})\le E(1+\alpha a_i\xi_i+c\alpha^2 a_i^2\xi_i^2|{\mathcal F}_{i-1}), \end{aligned} $

where $ c $ is a bound on

$ \frac{\exp(\alpha a_i\xi_i)-\alpha a_i\xi_i-1}{(\alpha a_i\xi_i)^2}. $

Because $ (\exp(z)-1-z)/z^2 $ is increasing in $ z $ and $ \alpha a_i\xi_i\le \alpha mB $, we set

$ c = \frac{\exp(\alpha mB)-\alpha mB-1}{(\alpha mB)^2}. $
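The monotonicity of $ (\exp(z)-1-z)/z^2 $ invoked here is standard; for completeness we record one short justification: since $ \exp(z) = 1+z+\int_0^z(z-u)\exp(u)\,du $, the substitution $ u = zt $ gives

$ \frac{\exp(z)-1-z}{z^2} = \int_0^1(1-t)\exp(zt)\,dt, \qquad z\neq 0, $

and the integrand $ (1-t)\exp(zt) $ is nondecreasing in $ z $ for every $ t\in[0,1] $, so the left-hand side is nondecreasing in $ z $.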

With this choice of $ c $ and by the properties of conditional expectations, the inequality for $ E(\exp(\alpha a_i\xi_i)|{\mathcal F}_{i-1}) $ above becomes

$ \begin{aligned} E(\exp(\alpha a_i\xi_i)|{\mathcal F}_{i-1})\le 1+c\alpha^2 a_i^2\bar{\sigma}_i^2\le \exp(c\alpha^2 a_i^2\bar{\sigma}_i^2). \end{aligned} $

Hence

$ \begin{eqnarray} E\left(\exp\left(\alpha\sum\limits_{i = 1}^{n}a_i\xi_i\right)\right) = E\left(E\left(\exp\left(\alpha\sum\limits_{i = 1}^{n}a_i\xi_i\right)|{\mathcal F}_{n-1}\right)\right)\\ \le E\left(\exp\left(\alpha\sum\limits_{i = 1}^{n-1}a_i\xi_i\right)\right)\exp(c\alpha^2 a_n^2\bar{\sigma}_n^2)\\ \le \cdots \le \exp\left(c\alpha^2\sum\limits_{i = 1}^{n} a_i^2\bar{\sigma}_i^2\right) = \exp(c\alpha^2A_n^2). \end{eqnarray} $ (2.7)

Substituting (2.7) into (2.1), we obtain

$ \begin{eqnarray} P(S_n\ge x)\le \exp(c\alpha^2A_n^2-\alpha x). \end{eqnarray} $ (2.8)

By choosing $ \alpha = \frac{1}{mB}\log\left(1+\frac{mBx}{A_n^2}\right) $, (2.8) becomes

$ \begin{aligned} P(S_n\ge x)&\le \exp\left(\left(\frac{mBx}{A_n^2}-\log\left(1+\frac{mBx}{A_n^2}\right)\right)\frac{A_n^2}{(mB)^2}-\frac{x}{mB}\log\left(1+\frac{mBx}{A_n^2}\right)\right)\\ & = \exp\left(\frac{x}{Bm}\right)\left(1+\frac{xBm}{A_n^2}\right)^{-\left(\frac{x}{Bm}+\left(\frac{A_n}{Bm}\right)^2\right)}. \end{aligned} $

The ideas of the proofs of Theorems 1.2 and 1.3 come from those of Zhang [13].

Proof of Theorem 1.2 Set $ T_k: = \max\{X_{n+1-k},X_{n+1-k}+X_{n-k},\cdots,X_{n+1-k}+\cdots+X_{1}\} $ for $ 1\le k\le n $ and $ T_{n+1}: = 0 $. Then $ T_k = X_{n+1-k}+T_{k+1}^{+} $ and $ T_k^2 = X_{n+1-k}^2+2X_{n+1-k}T_{k+1}^{+}+(T_{k+1}^{+})^2 $. It follows that

$ E[T_k^2]\le E[X_{n+1-k}^2]+2E[E[X_{n+1-k}T_{k+1}^{+}|{\mathcal F}_{n-k}]]+E[(T_{k+1}^{+})^2]. $

Note that $ T_{k+1}^{+} $ is $ {\mathcal F}_{n-k} $-measurable, so $ E[X_{n+1-k}T_{k+1}^{+}|{\mathcal F}_{n-k}] = E[X_{n+1-k}|{\mathcal F}_{n-k}]T_{k+1}^{+} = 0 $. We see that

$ E[T_k^2]\le E[X_{n+1-k}^2]+E[(T_{k+1}^{+})^2]\le E[X_{n+1-k}^2]+E[T_{k+1}^2] . $

Thus $ E[T_1^2]\le \sum\limits_{k = 1}^{n}E[X_{n+1-k}^2] = \sum\limits_{k = 1}^{n}E[X_k^2] $. Since $ \max\limits_{0\le k\le n}(S_n-S_k) = T_1^{+} $ and $ (T_1^{+})^2\le T_1^2 $, the proof is completed.
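As a purely illustrative aside, the backward maxima $ T_k $ are easy to simulate; the Python sketch below (with the assumed example $ a_i\equiv 1 $ and i.i.d. Uniform$ (-1,1) $ innovations) computes $ T_1 = \max_{0\le k\le n-1}(S_n-S_k) $ directly and compares a Monte Carlo estimate of $ E[T_1^2] $ with $ \sum_{k=1}^{n}E[X_k^2] $.

```python
import numpy as np

# Monte Carlo illustration of Theorem 1.2: T_1 = max_{0 <= k <= n-1}(S_n - S_k)
# and E[T_1^2] <= sum_k E[X_k^2].  a_i = 1 and Uniform(-1, 1) innovations are
# assumptions made only for this demonstration.
rng = np.random.default_rng(4)
n, reps = 100, 20000
X = rng.uniform(-1.0, 1.0, size=(reps, n))              # X_k = a_k * xi_k with a_k = 1
S = np.cumsum(X, axis=1)                                 # S_1, ..., S_n
S_n = S[:, -1:]
diffs = np.concatenate([S_n, S_n - S[:, :-1]], axis=1)   # S_n - S_k for k = 0, ..., n-1
T1 = diffs.max(axis=1)
lhs = np.mean(T1 ** 2)                                   # estimate of E[T_1^2]
rhs = n * np.mean(X ** 2)                                # sum_k E[X_k^2] = n * E[X_1^2]
print(f"E[T_1^2] approx {lhs:.3f} <= sum_k E[X_k^2] approx {rhs:.3f}")
```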

Proof of Theorem 1.3 (a) Let $ T_k $ be defined as in the proof of Theorem 1.2. We first prove (1.1). Substituting $ x = X_{n+1-k} $ and $ y = T_{k+1}^{+} $ into the following elementary inequality

$ \begin{aligned} |x+y|^p\le 2^{2-p}|x|^p+|y|^p+px|y|^{p-1}{\rm sgn}(y), \mbox{ $1\le p\le 2$} \end{aligned} $

yields

$ \begin{aligned} E[|T_k|^p] &\le 2^{2-p}E[|X_{n+1-k}|^p]+E[|T_{k+1}^{+}|^p]+pE[E[X_{n+1-k}|{\mathcal F}_{n-k}](T_{k+1}^{+})^{p-1}]\\ &\le 2^{2-p}E[|X_{n+1-k}|^p]+E[|T_{k+1}^{+}|^p] \end{aligned} $

by the property of martingale differences. Hence

$ E[|T_1|^p]\le 2^{2-p}\sum\limits_{k = 1}^{n-1}E[|X_{n+1-k}|^p]+E[|X_1|^p] . $

Since $ 2^{2-p}\ge 1 $ for $ 1\le p\le 2 $ and $ |\max\limits_{0\le k\le n}(S_n-S_k)|^p = (T_1^{+})^p\le |T_1|^p $, (1.1) is proved.

For (1.2), by the following elementary inequality

$ \begin{aligned} |x+y|^p\le 2^pp^2|x|^p+|y|^p+px|y|^{p-1}{\rm sgn}(y)+2^pp^2x^2|y|^{p-2}, \mbox{ $ p\ge 2$}, \end{aligned} $

we have

$ \begin{aligned} |T_k|^p\le 2^pp^2|X_{n+1-k}|^p+|T_{k+1}|^p+pX_{n+1-k}(T_{k+1}^{+})^{p-1}+2^pp^2X_{n+1-k}^2(T_{k+1}^{+})^{p-2}. \end{aligned} $

It follows that

$ \begin{equation} |T_i|^p\le 2^pp^2\sum\limits_{k = i}^{n}|X_{n+1-k}|^p+p\sum\limits_{k = i}^{n-1}X_{n+1-k}(T_{k+1}^{+})^{p-1}+2^pp^2\sum\limits_{k = i}^{n-1}X_{n+1-k}^2(T_{k+1}^{+})^{p-2}. \end{equation} $ (2.9)

Therefore, by the properties of conditional expectations and the Hölder inequality,

$ \begin{aligned} E[|T_i|^p]\le & 2^pp^2\sum\limits_{k = i}^{n}E\left[|X_{n+1-k}|^p\right]+p\sum\limits_{k = i}^{n-1}E[E[X_{n+1-k}|{\mathcal F}_{n-k}](T_{k+1}^{+})^{p-1}]\\ &+2^pp^2\sum\limits_{k = i}^{n-1}E[X_{n+1-k}^2(T_{k+1}^{+})^{p-2}]\\ \le & 2^pp^2\sum\limits_{k = 1}^{n}E[|X_{n+1-k}|^p]+2^pp^2\sum\limits_{k = 1}^{n-1}(E[|X_{n+1-k}|^p])^{2/p}(E[|T_{k+1}|^p])^{1-2/p}. \end{aligned} $

Let $ D_n = \max\limits_{k\le n}E[|T_k|^p] $. Then $ D_n\le 2^pp^2\sum\limits_{k = 1}^{n}E[|X_{k}|^p]+2^pp^2\sum\limits_{k = 1}^{n-1}(E[|X_{n+1-k}|^p])^{2/p}D_n^{1-2/p} $. From the above inequalities, we obtain

$ \begin{aligned} D_n\le C_p[\sum\limits_{k = 1}^{n}E[|X_{k}|^p]+(\sum\limits_{k = 1}^{n-1}(E[|X_{n+1-k}|^p])^{2/p})^{p/2}]\le C_p n^{p/2-1}\sum\limits_{k = 1}^{n}E[|X_{k}|^p]. \end{aligned} $

(1.2) is proved.
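For completeness, we record short verifications of the two elementary facts used in the last step; this justification is standard and is included here only for the reader's convenience. First, if $ x,a,b\ge 0 $, $ p>2 $ and $ x\le a+bx^{1-2/p} $, then Young's inequality with the conjugate exponents $ p/2 $ and $ p/(p-2) $ gives

$ bx^{1-2/p}\le \frac{2}{p}b^{p/2}+\frac{p-2}{p}x, \quad\mbox{hence}\quad x\le \frac{p}{2}a+b^{p/2}\le \frac{p}{2}(a+b^{p/2}). $

Second, by the Hölder inequality, $ \sum\limits_{k = 1}^{n-1}(E[|X_{n+1-k}|^p])^{2/p}\le (n-1)^{1-2/p}\left(\sum\limits_{k = 1}^{n-1}E[|X_{n+1-k}|^p]\right)^{2/p} $, so $ \left(\sum\limits_{k = 1}^{n-1}(E[|X_{n+1-k}|^p])^{2/p}\right)^{p/2}\le n^{p/2-1}\sum\limits_{k = 1}^{n}E[|X_k|^p] $. The first fact is used again in part (b) below.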

(b) From (2.9), the properties of conditional expectations and Jensen's inequality, it follows that

$ \begin{aligned} E[|T_i|^p]\le& 2^pp^2\sum\limits_{k = i}^{n}E[|X_{n+1-k}|^p]+p\sum\limits_{k = i}^{n-1}E[X_{n+1-k}(T_{k+1}^{+})^{p-1}]\\ &+2^pp^2\sum\limits_{k = i}^{n-1}E[E[X_{n+1-k}^2|{\mathcal F}_{n-k}](T_{k+1}^{+})^{p-2}]\\ \le& 2^pp^2\sum\limits_{k = i}^{n}E[|X_{n+1-k}|^p]+p\sum\limits_{k = i}^{n-1}E[E[X_{n+1-k}|{\mathcal F}_{n-k}](T_{k+1}^{+})^{p-1}]\\ &+2^pp^2\sum\limits_{k = i}^{n-1}a_{n+1-k}^2\bar{\sigma}_{n+1-k}^2E[(T_{k+1}^{+})^{p-2}]\\ \le& 2^pp^2\sum\limits_{k = 1}^{n}E[|X_{n+1-k}|^p]+2^pp^2\sum\limits_{k = 1}^{n-1}a_{n+1-k}^2\bar{\sigma}_{n+1-k}^2(E[(T_{k+1}^{+})^{p}])^{1-2/p}. \end{aligned} $

Let $ D_n = \max\limits_{k\le n}E[|T_k|^p] $. Then

$ D_n\le 2^pp^2\sum\limits_{k = 1}^{n}E[|X_{k}|^p]+2^pp^2A_n^2D_n^{1-2/p}. $

By the above inequality and the first elementary fact recorded after the proof of (1.2), we see that

$ D_n\le C_p \{ \sum\limits_{k = 1}^{n}E[|X_{k}|^p]+A_n^p\}. $

(1.4) is proved.

References
[1] Ahmad I A, Amezziane M. Probability inequalities for bounded random vectors[J]. Stat. Prob. Lett., 2013, 83(4): 1136–1142. DOI:10.1016/j.spl.2012.11.023
[2] Li G L. Bernstein inequality of the sequence of martingale differences and its applications (in Chinese)[J]. J. Math., 2006, 26(1): 103–108.
[3] Bercu B A, Touati A. Exponential inequalities for self-normalized martingales with applications[J]. Ann. Appl. Prob., 2008, 18(5): 1848–1869. DOI:10.1214/07-AAP506
[4] Fan X Q, Grama I, Liu Q S. Exponential inequalities for martingales with applications[J]. Electron. J. Prob., 2015, 20(1): 1–22.
[5] Fan X Q, Grama I, Liu Q S. Martingale inequalities of type Dzhaparidze and van Zanten[J]. Statistics, 2017, 51(6): 1200–1213. DOI:10.1080/02331888.2017.1318138
[6] Xu S F, Miao Y. Almost sure convergence of weighted sums for martingale differences[J]. J. Math., 2014, 34(4): 627–632.
[7] Yu K F. Complete convergence of weighted sums of martingale differences[J]. J. Theor. Prob., 1990, 3(2): 339–347. DOI:10.1007/BF01045165
[8] Wang X J, Hu S H. Complete convergence and complete moment convergence for martingale difference sequence[J]. Acta Math. Sin., 2014, 30(1): 119–132. DOI:10.1007/s10114-013-2243-8
[9] Hoeffding W. Probability inequalities for sums of bounded random variables[J]. J. Am. Stat. Assoc., 1963, 58(301): 13–30. DOI:10.1080/01621459.1963.10500830
[10] Bennett G. Probability inequalities for the sum of independent random variables[J]. J. Am. Stat. Assoc., 1962, 57(297): 33–45. DOI:10.1080/01621459.1962.10482149
[11] Bentkus V. On Hoeffding's inequalities[J]. Ann. Prob., 2004, 32(2): 1650–1673. DOI:10.1214/009117904000000360
[12] Gao F Q, Wu L M. Large deviations theory and methods (in Chinese)[M]. Wuhan: Wuhan University Press, 2007.
[13] Zhang L X. Rosenthal's inequalities for independent and negatively dependent random variables under sub-linear expectations with applications[J]. Sci. China Math., 2016, 59(4): 751–768. DOI:10.1007/s11425-015-5105-2