To prove limit theorems in probability theory, such as laws of large numbers and the central limit theorem, one needs suitable probability inequalities, which have attracted the attention of many authors. Ahmad and Amezziane [1] proved extensions of the basic inequalities of Bernstein, Kolmogorov, and Hoeffding for sums of bounded random vectors. Li [2] established a Bernstein inequality for sequences of martingale differences. Bercu and Touati [3] proved several exponential inequalities for self-normalized martingales by introducing the new notion of a random variable that is heavy on the left or on the right. Fan et al. [4] obtained exponential inequalities of Bennett, Freedman, de la Peña, Pinelis, and van de Geer type. Fan et al. [5] proved martingale inequalities of Dzhaparidze and van Zanten type. Zhang [13] obtained Rosenthal inequalities for independent and negatively dependent random variables under sub-linear expectations. Xu and Miao [6] proved almost sure convergence of weighted sums of martingale differences. Yu [7] obtained complete convergence of weighted sums of martingale differences. Wang and Hu [8] established complete convergence and complete moment convergence for martingale difference sequences. It is natural to ask whether the basic inequalities of Bernstein, Kolmogorov, and Hoeffding, as well as Rosenthal inequalities, hold for sequences of martingale differences. Here we give an affirmative answer to this question. We state the main results in this section and present the proofs in Section 2.
Let $ \{\xi_i,{\mathcal F}_i, i\ge 1\} $ be a sequence of martingale differences on a probability space $ (\Omega, {\mathcal F}, P) $ such that $ |\xi_i|\le B<\infty $, $ i = 1,2,\cdots $, where $ B $ is nonrandom. Set $ \|\xi\| = \inf\{c>0: P(|\xi|\le c) = 1\} $, the essential supremum of the random variable $ \xi $. Denote $ \bar{\sigma}_i^2 = \|E(\xi_i^2|{\mathcal F}_{i-1})\| $, $ i\ge 1 $, where $ {\mathcal F}_0 = \{\emptyset, \Omega\} $. Let $ a_1, \cdots, a_n $ be positive real numbers, and set $ m = \max\limits_{1\le i\le n}a_i $ and $ A_n^2 = \sum\limits_{i = 1}^{n}a_i^2\bar{\sigma}_i^2 $. Write $ S_0 = 0 $ and $ S_n = \sum\limits_{i = 1}^{n}a_i\xi_i =: \sum\limits_{i = 1}^{n}X_i $. The following are the main results.
Theorem 1.1 (i) For any $ x>0 $, $ P(S_n\ge x)\le \exp(-\frac{x^2}{2(A_n^2+xBm/3)}). $
(ii) For any $ x>0 $, $ P(S_n\ge x)\le \exp(-\frac{x^2}{2A_n^2}(1-xmB^3(\sum\limits_{i = 1}^{n}a_i^2)/A_n^4)). $
(iii) For any $ x>0 $, $ P(S_n\ge x)\le \exp(\frac{x}{Bm})(1+\frac{xBm}{A_n^2})^{-(\frac{x}{Bm}+(\frac{A_n}{Bm})^2)}. $
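The three bounds are straightforward to compare numerically. The following is a minimal Monte Carlo sketch (not from the paper) under illustrative assumptions: i.i.d. Rademacher $ \xi_i $, so that $ B = 1 $ and $ \bar{\sigma}_i^2 = 1 $, and $ a_i = 1 $, so that $ m = 1 $ and $ A_n^2 = n $.

```python
# Monte Carlo check of the tail bounds in Theorem 1.1 (illustrative
# assumptions: i.i.d. Rademacher xi_i and a_i = 1, so B = m = 1, A_n^2 = n).
import numpy as np

rng = np.random.default_rng(0)
n, trials = 100, 100_000
B, m = 1.0, 1.0
A2 = float(n)                       # A_n^2 = sum_i a_i^2 sigma_bar_i^2

xi = 2 * rng.integers(0, 2, size=(trials, n), dtype=np.int8) - 1
S_n = xi.sum(axis=1, dtype=np.int64)

for x in (15.0, 25.0, 35.0):
    emp = (S_n >= x).mean()         # empirical tail probability
    bern = np.exp(-x**2 / (2.0 * (A2 + x * B * m / 3.0)))          # (i)
    kolm = np.exp(-x**2 / (2.0 * A2)
                  * (1.0 - x * m * B**3 * n / A2**2))              # (ii)
    hoef = (np.exp(x / (B * m))
            * (1.0 + x * B * m / A2) ** (-(x / (B * m)
                                           + A2 / (B * m) ** 2)))  # (iii)
    print(f"x={x:4.0f}  empirical={emp:.2e}  "
          f"(i)={bern:.2e}  (ii)={kolm:.2e}  (iii)={hoef:.2e}")
```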
Remark 1.1 Parts (i), (ii), and (iii) of Theorem 1.1 are called the Bernstein inequality, the Kolmogorov inequality, and the Hoeffding inequality, respectively. As pointed out in Fan et al. [5], the Bernstein inequality (i) in Theorem 1.1 is implied by the Hoeffding inequality (iii).
Theorem 1.2 (Kolmogorov inequality) $ E[(\max\limits_{k\le n}(S_n-S_k))^2]\le \sum\limits_{k = 1}^{n}E[X_k^2]. $ In particular, $ E[((S_n)^{+})^2]\le \sum\limits_{k = 1}^{n}E[X_k^2]. $
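Theorem 1.2 can likewise be checked by simulation; the sketch below (again under the illustrative Rademacher assumptions with $ a_i = 1 $, so $ E[X_k^2] = 1 $) estimates both expectations.

```python
# Monte Carlo check of Theorem 1.2 (illustrative assumptions:
# i.i.d. Rademacher xi_i and a_i = 1, so sum_k E[X_k^2] = n).
import numpy as np

rng = np.random.default_rng(1)
n, trials = 50, 100_000

xi = 2 * rng.integers(0, 2, size=(trials, n), dtype=np.int8) - 1
S = np.cumsum(xi, axis=1, dtype=np.int64)   # S_1, ..., S_n for each trial
S_n = S[:, -1:]                             # keep 2-D for broadcasting
M = (S_n - S).max(axis=1)                   # max_{1<=k<=n} (S_n - S_k) >= 0
rhs = float(n)                              # sum_k E[X_k^2]

lhs_max = np.mean(M.astype(float) ** 2)
lhs_pos = np.mean(np.maximum(S_n[:, 0], 0).astype(float) ** 2)
print(f"E[(max_k(S_n - S_k))^2] ~= {lhs_max:.2f} <= {rhs}")
print(f"E[((S_n)^+)^2]          ~= {lhs_pos:.2f} <= {rhs}")
```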
Theorem 1.3 (Rosenthal inequality) (a)
and
In particular,
(b)
where $ C_p $ is a positive constant depending only on $ p $.
Proof of Theorem 1.1 For any $ \alpha>0 $, by the Chebyshev inequality we obtain

$$ P(S_n\ge x) = P(\exp(\alpha S_n)\ge \exp(\alpha x))\le \exp(-\alpha x)E[\exp(\alpha S_n)]. \qquad (2.1) $$
As in the proofs of Ahmad and Amezziane [1] and Li [2], to prove the theorem we first obtain an upper bound for $ E[\exp(\alpha S_n)] $ for all $ \alpha>0 $ and then choose $ \alpha $ so that the upper bound is minimized. The ideas originally come from Bernstein, Kolmogorov, Hoeffding [9], and Bennett [10]. Here the ideas also come from Bentkus [11], Ahmad and Amezziane [1], Li [2], and Gao and Wu [12].
(i) For any real $ z $, the power series expansion of $ \exp(z) $ is

$$ \exp(z) = 1+z+\sum_{k = 2}^{\infty}\frac{z^k}{k!}. $$
Note that $ E(\alpha a_i\xi_i|{\mathcal F}_{i-1}) = 0 $ and $ E(|\alpha a_i\xi_i|^k|{\mathcal F}_{i-1})\le \frac{k!}{2!}E(\alpha^2a_i^2\xi_i^2|{\mathcal F}_{i-1})\left(\frac{\alpha a_i B}{3}\right)^{k-2} $, $ k\ge 2 $. For $ \frac{\alpha mB}{3}<1 $, we have

$$ E(\exp(\alpha a_i\xi_i)|{\mathcal F}_{i-1})\le 1+\frac{\alpha^2a_i^2\bar{\sigma}_i^2}{2}\sum_{k = 2}^{\infty}\left(\frac{\alpha a_iB}{3}\right)^{k-2} = 1+\frac{\alpha^2a_i^2\bar{\sigma}_i^2/2}{1-\alpha a_iB/3}\le \exp\left(\frac{\alpha^2a_i^2\bar{\sigma}_i^2/2}{1-\alpha mB/3}\right). $$
By the properties of conditional expectations, for $ \frac{\alpha mB}{3}<1 $ we get

$$ E[\exp(\alpha S_n)]\le \exp\left(\frac{\alpha^2A_n^2/2}{1-\alpha mB/3}\right). \qquad (2.4) $$
Substituting (2.4) into (2.1), we obtain

$$ P(S_n\ge x)\le \exp\left(-\alpha x+\frac{\alpha^2A_n^2/2}{1-\alpha mB/3}\right). $$
We choose

$$ \alpha = \frac{x}{A_n^2+xmB/3}, $$

which satisfies $ \frac{\alpha mB}{3}<1 $,
and the above inequality becomes

$$ P(S_n\ge x)\le \exp\left(-\frac{x^2}{2(A_n^2+xBm/3)}\right), $$

which is (i).
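As a check on this choice of $ \alpha $, note that

$$ 1-\frac{\alpha mB}{3} = \frac{A_n^2}{A_n^2+xmB/3},\qquad \frac{\alpha^2A_n^2/2}{1-\alpha mB/3} = \frac{x^2/2}{A_n^2+xmB/3},\qquad \alpha x = \frac{x^2}{A_n^2+xmB/3}, $$

so the exponent $ -\alpha x+\frac{\alpha^2A_n^2/2}{1-\alpha mB/3} $ equals $ -\frac{x^2}{2(A_n^2+xmB/3)} $.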
(ii) For any real $ t $ we have $ |\log(1+t+\frac{t^2}{2})-t|\le \frac{|t|^3}{2} $; thus we obtain, with probability one,
where
Therefore
Hence by the properties of conditional expectations, we have
Because $ 1+t\le \exp(t) $ for all real $ t $, the right-hand side above is bounded by
Substituting (2.6) into (2.1), we obtain
We choose $ \alpha = \frac{x}{A_n^2} $. Thus the last inequality becomes

$$ P(S_n\ge x)\le \exp\left(-\frac{x^2}{2A_n^2}\left(1-\frac{xmB^3\sum_{i = 1}^{n}a_i^2}{A_n^4}\right)\right), $$

which is (ii).
(iii) Note that $ \exp(t)\le 1+t+ct^2 $ if and only if $ c\ge (\exp(t)-1-t)/t^2 $. Applying this with $ t = \alpha a_i\xi_i $, we obtain
where $ c $ is a bound on $ (\exp(\alpha a_i\xi_i)-1-\alpha a_i\xi_i)/(\alpha a_i\xi_i)^2 $.
Because $ (\exp(t)-1-t)/t^2 $ is increasing in $ t $ and $ \alpha a_i\xi_i\le \alpha mB $, we set

$$ c = \frac{\exp(\alpha mB)-1-\alpha mB}{(\alpha mB)^2}. $$
By the properties of conditional expectations, the above inequality becomes

$$ E(\exp(\alpha a_i\xi_i)|{\mathcal F}_{i-1})\le 1+c\alpha^2a_i^2\bar{\sigma}_i^2\le \exp(c\alpha^2a_i^2\bar{\sigma}_i^2). $$
Hence

$$ E[\exp(\alpha S_n)]\le \exp(c\alpha^2A_n^2). \qquad (2.7) $$
Substituting (2.7) into (2.1), we obtain

$$ P(S_n\ge x)\le \exp(-\alpha x+c\alpha^2A_n^2). \qquad (2.8) $$
By choosing $ \alpha = \frac{1}{mB}\log\left(1+\frac{mBx}{A_n^2}\right) $, (2.8) becomes

$$ P(S_n\ge x)\le \exp\left(\frac{x}{Bm}\right)\left(1+\frac{xBm}{A_n^2}\right)^{-\left(\frac{x}{Bm}+\left(\frac{A_n}{Bm}\right)^2\right)}, $$

which is (iii).
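To verify this last step, write $ u = \alpha mB = \log\left(1+\frac{mBx}{A_n^2}\right) $ and $ v = \frac{mBx}{A_n^2} $, so that $ \exp(u) = 1+v $. Then

$$ c\alpha^2A_n^2 = \frac{\exp(u)-1-u}{u^2}\cdot\frac{u^2A_n^2}{m^2B^2} = \frac{A_n^2}{m^2B^2}(v-u),\qquad \alpha x = \frac{A_n^2}{m^2B^2}uv, $$

hence

$$ -\alpha x+c\alpha^2A_n^2 = \frac{A_n^2}{m^2B^2}\left(v-(1+v)u\right) = \frac{x}{Bm}-\left(\frac{x}{Bm}+\left(\frac{A_n}{Bm}\right)^2\right)\log\left(1+\frac{xBm}{A_n^2}\right), $$

which is the logarithm of the bound in (iii).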
The ideas of the proofs of Theorems 1.2 and 1.3 come from Zhang [13].
Proof of Theorem 1.2 Set $ T_k: = \max\{X_{n+1-k},X_{n+1-k}+X_{n-k},\cdots,X_{n+1-k}+\cdots+X_{1}\} $ for $ 1\le k\le n $, with the convention $ T_{n+1}^{+} = 0 $. Then $ T_k = X_{n+1-k}+T_{k+1}^{+} $ and $ T_k^2 = X_{n+1-k}^2+2X_{n+1-k}T_{k+1}^{+}+(T_{k+1}^{+})^2 $. It follows that

$$ E[T_k^2] = E[X_{n+1-k}^2]+2E[X_{n+1-k}T_{k+1}^{+}]+E[(T_{k+1}^{+})^2]. $$
Note that $ T_{k+1}^{+} $ is $ {\mathcal F}_{n-k} $-measurable, so $ E[X_{n+1-k}T_{k+1}^{+}|{\mathcal F}_{n-k}] = E[X_{n+1-k}|{\mathcal F}_{n-k}]T_{k+1}^{+} = 0 $. We see that

$$ E[T_k^2] = E[X_{n+1-k}^2]+E[(T_{k+1}^{+})^2]\le E[X_{n+1-k}^2]+E[T_{k+1}^2]. $$
Iterating over $ k = 1,\cdots,n $ yields $ E[T_1^2]\le \sum\limits_{k = 1}^{n}E[X_{n+1-k}^2] = \sum\limits_{k = 1}^{n}E[X_{k}^2] $. Since $ 0\le \max\limits_{k\le n}(S_n-S_k)\le T_1^{+} $, $ S_n^{+}\le T_1^{+} $, and $ (T_1^{+})^2\le T_1^2 $, the conclusions follow. The proof is completed.
Proof of Theorem 1.3 (a) Let $ T_k $ be defined as in the proof of Theorem 1.2. We first prove (1.1). Substituting $ x = X_{n+1-k} $ and $ y = T_{k+1}^{+} $ into the following elementary inequality
yields
by the property of martingale differences. Hence
So (1.1) is proved.
For (1.2), by the following elementary inequality
we have
It follows that
Therefore, by the Hölder inequality,
Let $ D_n = \max\limits_{k\le n}E[|T_k|^p] $. Then $ D_n\le 2^pp^2\sum\limits_{k = 1}^{n}E[|X_{k}|^p]+2^pp^2\sum\limits_{k = 1}^{n-1}(E[|X_{n+1-k}|^p])^{2/p}D_n^{1-2/p} $. From the above inequalities, we obtain
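A standard way to close a recursion of this form, assuming $ p>2 $ (note that $ D_n<\infty $ because the $ X_k $ are bounded), is the following. Write $ a = 2^pp^2\sum_{k = 1}^{n}E[|X_k|^p] $ and $ b = 2^pp^2\sum_{k = 1}^{n-1}(E[|X_{n+1-k}|^p])^{2/p} $. By the Young inequality with exponents $ p/2 $ and $ p/(p-2) $, for any $ \lambda>0 $,

$$ bD_n^{1-2/p} = (\lambda^{-1}b)(\lambda D_n^{1-2/p})\le \frac{2}{p}(\lambda^{-1}b)^{p/2}+\frac{p-2}{p}\lambda^{p/(p-2)}D_n. $$

Choosing $ \lambda $ so that $ \frac{p-2}{p}\lambda^{p/(p-2)} = \frac{1}{2} $ and rearranging gives $ D_n\le 2a+C_pb^{p/2} $, that is, a Rosenthal-type bound in terms of $ \sum_{k = 1}^{n}E[|X_k|^p] $ and $ \left(\sum_{k = 1}^{n}(E[|X_k|^p])^{2/p}\right)^{p/2} $.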
(1.2) is proved.
(b) From (2.9) it follows that
Let $ D_n = \max\limits_{k\le n}E[|T_k|^p] $. Then
By the above inequality, we see that
Thus (1.4) is proved. The proof is completed.