J. of Math. (PRC) (数学杂志), 2014, Vol. 34, Issue 4: 627-632
ALMOST SURE CONVERGENCE OF WEIGHTED SUMS FOR MARTINGALE DIFFERENCES
XU Shou-fang1, MIAO Yu2    
1. Dept. of Math. and Infor. Sci., Xinxiang University, Xinxiang 453003, China;
2. College of Math. and Infor. Sci., Henan Normal University, Xinxiang 453007, China
Abstract: In this paper, we discuss a class of weighted sums of a martingale difference sequence, based on some elementary inequalities and a truncation technique. The almost sure convergence of these sums is obtained, which extends a known result for independent identically distributed random variables.
Key words: almost sure convergence     weighted sums     martingale difference    
1 Introduction and Main Results

Let $\{X_n, n\ge 1\}$ be a sequence of random variables defined on a probability space $(\Omega, \mathscr F, \mathbb P)$ and let $\{a_{ni}, 1 \le i \le n, n\ge 1 \}$ be a triangular array of constants. Weighted sums of the form $\sum_{i=1}^n a_{ni}X_i$ arise in a variety of applications, such as least squares estimation, nonparametric regression function estimate and jackknife estimate. Because of the wide applications of weighted sums in statistics, many researchers have paid much attention to its properties.

When $\{X_n, n\ge 1\}$ is a sequence of independent identically distributed (i.i.d.) random variables, the almost sure convergence of the weighted sums $\sum_{i=1}^n a_{ni}X_i$ has been studied by Choi and Sung [1], Chow [2], Chow and Lai [3], Li et al. [7], Stout [8], Sung [9, 10], Teicher [11], Thrum [12], among others.

This paper focuses on the almost sure convergence of such weighted sums in the case where $\{X_n, n\ge 1\}$ is a martingale difference sequence. To state the results, suppose that $(\Omega, \mathscr F, \mathbb P)$ is a probability space and let $\{\mathscr F_n, n\ge 0\}$ be a family of $\sigma$-algebras such that $\mathscr F_0 \subseteq \mathscr F_1 \subseteq \cdots \subseteq \mathscr F.$

Let $\{X_n, n\ge 1\}$ be a martingale difference sequence with respect to the filtration $\{\mathscr F_n, n\ge0\}$ with $X_0=0$, i.e.,

i) $\mathbb E |X_n| < \infty$ for all $n \ge 1$;

ii) $X_n$ is $\mathscr F_n$-measurable;

iii) $\mathbb E (X_n|\mathscr F_{n-1})=0$ a.s. for every $n \ge 1$.

Throughout this paper, $C$ denotes a positive constant, which may take different values at different appearances. $1_{A}$ denotes the indicator function of the event $A$, and $\log x$ denotes $\ln \max \{x, e\}$, where $\ln$ is the natural logarithm.

Now we state our main result as follows.

Theorem 1.1  Let $\{X_n, n\ge 1\}$ be a stationary martingale difference sequence with respect to the filtration $\{\mathscr F_n, n\ge0\}$ with $X_0=0$. In addition, assume that

$ \begin{eqnarray} \label{aa1} \mathbb E (X_n^2| \mathscr F_{n-1}) \le 1 \ \ \text{a.s.}. \end{eqnarray} $ (1.1)

Let $\{a_{ni}, 1\le i \le n, n\ge 1\}$ be a triangular array of constants satisfying

$ \begin{eqnarray} \label{aa2} \max\limits_{1\le i \le n}|a_{ni}|\le \frac{1}{\sqrt{2n \log n}}. \end{eqnarray} $ (1.2)

Then we have

$ \begin{eqnarray} \label{aa3} \limsup\limits_{n\to \infty}\sum\limits_{i=1}^n a_{ni}X_i \le 1\ \ \text{a.s.}. \end{eqnarray} $ (1.3)
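As an illustration outside the paper, Theorem 1.1 can be spot-checked by simulation. The sketch below (Python; assuming i.i.d. Rademacher signs, which form a martingale difference sequence with $\mathbb E(X_n^2|\mathscr F_{n-1})=1$, and the extremal weights $a_{ni}=1/\sqrt{2n\log n}$) tracks the weighted sums along $n$; consistently with (1.3), they stay well below $1$ in typical runs.

```python
import math
import random

random.seed(0)  # reproducible illustration

def log_(x):
    # the paper's convention: log x = ln max(x, e)
    return max(math.log(x), 1.0)

# i.i.d. Rademacher signs: a martingale difference sequence with
# E(X_n^2 | F_{n-1}) = 1, so condition (1.1) holds with equality.
N = 20000
xs = [random.choice((-1.0, 1.0)) for _ in range(N)]

prefix = [0.0]
for x in xs:
    prefix.append(prefix[-1] + x)

# extremal weights a_ni = 1/sqrt(2 n log n), constant in i, so
# sum_i a_ni X_i = (prefix sum at n) / sqrt(2 n log n)
sums = [prefix[n] / math.sqrt(2.0 * n * log_(n)) for n in range(100, N + 1, 100)]
print(max(sums))  # Theorem 1.1: limsup <= 1 a.s.
```

In fact, for these constant-in-$i$ weights the law of the iterated logarithm forces the sums to $0$; the interest of (1.3) is that the bound holds for every array satisfying (1.2).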

When condition (1.2) is replaced by the stronger condition (1.4), Theorem 1.1 yields the following sharper consequence.

Corollary 1.1  Let $\{X_n, n\ge 1\}$ be a stationary martingale difference sequence with respect to the filtration $\{\mathscr F_n, n\ge0\}$ with $X_0=0$. In addition, assume that there exists a constant $C>0$ such that $ \mathbb E (X_n^2| \mathscr F_{n-1}) < C \ \ \text{a.s.}. $ If $\{a_{ni}, 1\le i \le n, n\ge 1\}$ is a triangular array of constants satisfying

$ \begin{eqnarray}\label{aa4} \max\limits_{1\le i \le n}|a_{ni}|= o\left(\frac{1}{\sqrt{n \log n}}\right), \end{eqnarray} $ (1.4)

then we have $ \sum\limits_{i=1}^n a_{ni}X_i \to 0\ \ \text{a.s.}. $
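Again as a numerical sketch outside the paper, Corollary 1.1 can be illustrated with a hypothetical weight choice whose maximum is $o(1/\sqrt{n\log n})$, e.g. $a_{ni}=1/\sqrt{n(\log n)^3}$:

```python
import math
import random

random.seed(1)

def log_(x):
    # log x = ln max(x, e), as in the paper
    return max(math.log(x), 1.0)

N = 50000
partial = 0.0
sums = []
for n in range(1, N + 1):
    partial += random.choice((-1.0, 1.0))  # bounded martingale differences
    # a_ni = 1/sqrt(n (log n)^3) = o(1/sqrt(n log n)), constant in i
    sums.append(partial / math.sqrt(n * log_(n) ** 3))

print(abs(sums[-1]))  # Corollary 1.1: the weighted sums tend to 0 a.s.
```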

2 The Proof of Main Results

In order to prove our main result, we need the following lemma.

Lemma 2.1  If $\mathbb E X^2 < \infty$, then for any $\epsilon >0, $ $\sum\limits_{n=1}^{\infty}\frac{1}{\sqrt{n \log n}} \mathbb E|X|1_{\{|X|> \epsilon \sqrt{\frac{n}{\log n}}\}} <\infty.$

Proof  Noting that $\left\{\frac{n}{\log n}\right\}$ is an increasing sequence, we have

$ \begin{eqnarray} &&\sum\limits_{n=1}^{\infty}\frac{1}{\sqrt{n\log n}}\mathbb{E} |X| 1_{\{|X| > \epsilon \sqrt{\frac{n}{\log n}}\}} = \sum\limits_{n=1}^{\infty}\frac{1}{\sqrt{n\log n}}\sum\limits_{i=n}^{\infty} \mathbb{E} |X| 1_{\{\epsilon \sqrt{\frac{i}{\log i}} <|X| \le \epsilon \sqrt{\frac{i+1}{\log (i+1)}}\}}\nonumber\\ &=& \sum\limits_{i=1}^{\infty} \mathbb{E} |X| 1_{\{\epsilon \sqrt{\frac{i}{\log i}} <|X| \le \epsilon \sqrt{\frac{i+1}{\log (i+1)}}\}}\sum\limits_{n=1}^{i} \frac{1}{\sqrt{n\log n}} \le C \sum\limits_{i=1}^{\infty} \mathbb{E} |X| 1_{\{\epsilon \sqrt{\frac{i}{\log i}} <|X| \le \epsilon \sqrt{\frac{i+1}{\log (i+1)}}\}}\sqrt{\frac{i}{\log i}}\nonumber\\ &\le& C \sum\limits_{i=1}^{\infty} \mathbb{P} \left(\epsilon \sqrt{\frac{i}{\log i}} <|X| \le \epsilon \sqrt{\frac{i+1}{ \log (i+1)}}\right) \frac{i}{\log i} \le C \mathbb{E} X^2 < \infty, \nonumber \end{eqnarray} $

where the first inequality follows from the fact that

$ \sum\limits_{n=1}^{i}\frac{1}{\sqrt{n\log n}} \le C \int_{1}^{i} \frac{1}{\sqrt{x\log x}}\, dx \le C \sqrt{\frac{i}{\log i}}. $
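This integral comparison can be sanity-checked numerically. The sketch below (an illustration, not part of the proof) shows that the ratio of the partial sums to $\sqrt{i/\log i}$ stays bounded, exhibiting an admissible constant $C$:

```python
import math

def log_(x):
    # ln max(x, e)
    return max(math.log(x), 1.0)

# Check that sum_{n=1}^i 1/sqrt(n log n) stays within a constant
# multiple of sqrt(i / log i), as used in the proof of Lemma 2.1.
s = 0.0
ratios = []
for i in range(1, 200001):
    s += 1.0 / math.sqrt(i * log_(i))
    ratios.append(s / math.sqrt(i / log_(i)))

print(max(ratios))  # bounded over the whole range
```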

Proof of Theorem 1.1  By Lemma 2.1 there exists a sequence of positive real numbers $\{\epsilon_n\}$ satisfying $\epsilon_n \downarrow 0$ such that

$ \begin{eqnarray}\label{a1} \sum\limits_{n=1}^{\infty}\frac{1}{\sqrt{n \log n}} \mathbb E|X_n|1_{\{|X_n|> \epsilon_n \sqrt{\frac{n}{\log n}}\}} <\infty. \end{eqnarray} $ (2.1)

For $1\le i \le n$, define $ Y_i=X_i1_{\{|X_i|\le \epsilon_i \sqrt{\frac{i}{\log i}}\}}, \ \ \ Z_i=X_i1_{\{|X_i|> \epsilon_i \sqrt{\frac{i}{\log i}}\}}, $ and

$ X_i^{'}=Y_i- \mathbb E (Y_i| \mathscr F_{i-1}), \ \ X_i^{''}=Z_i- \mathbb E (Z_i| \mathscr F_{i-1}), $

then $X_i=X_i^{'}+X_i^{''}$ and $\{X_i^{'}, i\ge 1\}$, $\{X_i^{''}, i\ge 1\}$ are two martingale difference sequences with respect to the filtration $\{\mathscr F_i, i\ge0\}$.

For any $1 \le i \le n$, let us define $ X_{ni}= a_{ni} b_n X_i^{'}, $ where $b_n=\sqrt{2n \log n}. $ By (1.1) and (1.2), it is easy to check

$ \begin{align} \mathbb E (X_{ni}^2|\mathscr F_{i-1})\le& \mathbb E ({X_i^{'}}^2|\mathscr F_{i-1}) = \mathbb E (Y_i^2| \mathscr F_{i-1})-(\mathbb E (Y_i| \mathscr F_{i-1}))^2 \\ \le&\mathbb E (Y_i^2| \mathscr F_{i-1}) \le \mathbb E (X_i^2| \mathscr F_{i-1}) \le 1 \ \ \text{a.s.} \end{align} $ (2.2)

and

$\begin{eqnarray*} |X_{ni}|\le 2 \max\limits_{1\le i \le n}\epsilon_i \sqrt{\frac{i}{\log i}}=o\left(\sqrt{\frac{n}{\log n}}\right).\end{eqnarray*} $

So there exists a positive decreasing sequence $k_n\to 0$ such that

$\begin{eqnarray*} |X_{ni}|\le 2 \max\limits_{1\le i \le n}\epsilon_i \sqrt{\frac{i}{\log i}}= k_n \sqrt{\frac{n}{\log n}}.\end{eqnarray*} $

For any $\lambda >0, $ from the following elementary inequalities

$ e^x \le 1+x+\frac{x^2}{2}e^{|x|}, \ \ \forall x\in\mathbb{R}\ \ \text{and}\ \ e^x \ge 1+x, \ \ \forall \ x>0, $

and the definition of martingale difference, we have

$ \begin{align} &\ \mathbb{E} \left(\exp \left\{ \frac{\lambda X_{ni}}{b_n} \right\} |\mathscr F_{i-1}\right)\\ \le &\ \mathbb{E} \left(1+\frac{\lambda X_{ni}}{b_n}+\frac{\lambda^2X_{ni}^2}{2b_n^2}\exp\left\{ \frac{\lambda |X_{ni}|}{b_n}\right\}|\mathscr F_{i-1}\right)\\ =&\ 1+ \frac{\lambda^2}{2b_n^2} \mathbb{E} \left( X_{ni}^2 \exp\left\{ \frac{\lambda |X_{ni}|}{b_n}\right\}|\mathscr F_{i-1} \right)\\ \le &\ 1+ \frac{\lambda^2}{2b_n^2}\exp\left\{\frac{\lambda k_n}{b_n} \sqrt{\frac{n}{\log n}}\right\}\\ \le &\ \exp\left\{\frac{\lambda^2}{2b_n^2}\exp\left\{\frac{\lambda k_n}{b_n} \sqrt{\frac{n}{\log n}}\right\}\right\}. \end{align} $
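The two elementary inequalities invoked above can be spot-checked on a grid; the first follows from Taylor's theorem with remainder $\frac{x^2}{2}e^{\xi}$, $|\xi|\le |x|$. A quick numerical illustration (not part of the proof):

```python
import math

# Spot-check the two elementary inequalities used in the proof:
#   e^x <= 1 + x + (x^2/2) e^|x|   for all real x,
#   e^x >= 1 + x                   for x > 0.
for k in range(-400, 401):
    x = k / 100.0
    assert math.exp(x) <= 1.0 + x + 0.5 * x * x * math.exp(abs(x)) + 1e-12
    if x > 0:
        assert math.exp(x) >= 1.0 + x

print("both inequalities hold on the grid")
```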

By iterating this procedure and using the properties of conditional expectation, we obtain

$ \begin{eqnarray} \mathbb{E} \exp \left\{\frac{\lambda}{b_n} \sum\limits_{i=1}^n X_{ni}\right\} &=&\mathbb{E} \left\{\mathbb E \left(\exp \left\{\frac{\lambda}{b_n} \sum\limits_{i=1}^n X_{ni}\right\}|\mathscr F_{n-1}\right)\right\}\nonumber\\ &=&\mathbb{E} \left\{\exp \left\{\frac{\lambda}{b_n} \sum\limits_{i=1}^{n-1} X_{ni}\right\}\mathbb E \left(\exp \left\{\lambda \frac{X_{nn}}{b_n}\right\}|\mathscr F_{n-1}\right)\right\}\nonumber\\ &\le& \exp\left\{n \frac{\lambda^2}{2b_n^2}\exp\left\{\frac{\lambda k_n}{b_n} \sqrt{\frac{n}{\log n}}\right\}\right\}\nonumber. \end{eqnarray} $

Hence, by Markov's inequality, it follows that

$ \begin{align} &\ \mathbb{P} \left( \frac{1}{b_n}\sum\limits_{i=1}^n X_{ni}>1+r \right)\\ \le&\ \inf\limits_{\lambda>0}e^{-\lambda(1+r)}\mathbb{E} \exp \left\{\frac{\lambda}{b_n} \sum\limits_{i=1}^n X_{ni}\right\} \\ \le&\ \inf\limits_{\lambda>0} \exp\left\{-\lambda(1+r)+n \frac{\lambda^2}{2b_n^2}\exp\left\{\frac{\lambda k_n}{b_n} \sqrt{\frac{n}{\log n}}\right\}\right\}\\ =&\ n^{ -(1+r)^2(2-\exp\{\sqrt 2(1+r)k_n\})}, \end{align} $

where we choose $\lambda =2(1+r)\log n$. Since $k_n \to 0$, we have

$ (1+r)^2(2-\exp\{\sqrt 2(1+r)k_n\}) > 1 $

for all sufficiently large $n$. Hence

$\begin{eqnarray*} \sum\limits_{n=1}^{\infty} \mathbb P \left(\sum\limits_{i=1}^n a_{ni}X_i^{'} > 1+r \right)=\sum\limits_{n=1}^{\infty} \mathbb{P} \left(\frac{1}{b_n}\sum\limits_{i=1}^n X_{ni} > 1+r \right) < \infty.\end{eqnarray*} $

Therefore by the Borel-Cantelli lemma, we have

$\begin{eqnarray*} \limsup\limits_{n\to \infty}\sum\limits_{i=1}^n a_{ni}X_i^{'} \le 1\ \ \text{a.s.}.\end{eqnarray*} $
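The computation of the infimum above can be verified numerically: with $b_n=\sqrt{2n\log n}$ and $\lambda=2(1+r)\log n$, the exponent $-\lambda(1+r)+\frac{n\lambda^2}{2b_n^2}\exp\{\frac{\lambda k_n}{b_n}\sqrt{n/\log n}\}$ reduces exactly to $-(1+r)^2(2-\exp\{\sqrt2(1+r)k_n\})\log n$. A quick check over illustrative values of $n$, $r$, $k_n$ (chosen only for the illustration):

```python
import math

# Verify the exponent algebra behind the bound n^{-(1+r)^2 (2 - exp(sqrt(2)(1+r)k_n))}:
# with b_n = sqrt(2 n log n) and lambda = 2 (1+r) log n,
#   -lambda (1+r) + n lambda^2/(2 b_n^2) * exp(lambda k_n sqrt(n/log n)/b_n)
# equals -(1+r)^2 (2 - exp(sqrt(2) (1+r) k_n)) log n.
for n in (10, 1000, 10 ** 6):
    for r in (0.1, 1.0):
        for k in (0.01, 0.2):
            logn = math.log(n)
            b = math.sqrt(2.0 * n * logn)
            lam = 2.0 * (1.0 + r) * logn
            lhs = -lam * (1.0 + r) + n * lam ** 2 / (2.0 * b ** 2) \
                * math.exp(lam * k * math.sqrt(n / logn) / b)
            rhs = -(1.0 + r) ** 2 * (2.0 - math.exp(math.sqrt(2.0) * (1.0 + r) * k)) * logn
            assert abs(lhs - rhs) < 1e-9 * max(1.0, abs(rhs))

print("exponent identity verified")
```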

To finish the proof, it is enough to show that

$ \begin{eqnarray} \label{a2} \sum\limits_{i=1}^n a_{ni}X_i^{''} \to 0\ \ \text{a.s.}. \end{eqnarray} $ (2.3)

From Markov's inequality, for any $r >0$, we can obtain

$ \begin{eqnarray} &&\sum\limits_{k=1}^{\infty} \mathbb P \left( \frac{1}{\sqrt{2^k \log 2^k}} \sum\limits_{i=1}^{2^k}( |Z_i|+ \mathbb E (|Z_i| | \mathscr F_{i-1})) > r \right)\nonumber\\ &\le&\frac{2}{r} \sum\limits_{k=1}^{\infty} \frac{1}{\sqrt{2^k \log 2^k}} \sum\limits_{i=1}^{2^k} \mathbb E |Z_i| \nonumber\\ &=& \frac{2}{r} \sum\limits_{i=1}^{\infty} \mathbb E |Z_i| \sum\limits_{\{k:2^k \ge i\}} \frac{1}{\sqrt{2^k \log 2^k}}.\nonumber \end{eqnarray} $

Note that

$\begin{eqnarray*} \sum\limits_{\{k:2^k \ge i\}} \frac{1}{\sqrt{2^k \log 2^k}} \le \frac{1}{\sqrt{\log i}}\sum\limits_{\{k:2^k \ge i\}} \frac{1}{\sqrt{2^k}} \le \frac{\sqrt{2}}{\sqrt{2}-1}\frac{1}{\sqrt{i \log i}}.\end{eqnarray*} $

This implies

$\begin{eqnarray*} \sum\limits_{k=1}^{\infty} \mathbb P \left( \frac{1}{\sqrt{2^k \log 2^k}} \sum\limits_{i=1}^{2^k}( |Z_i|+ \mathbb E (|Z_i| | \mathscr F_{i-1})) > r \right) \le C \sum\limits_{i=1}^{\infty} \frac{\mathbb E |Z_i|}{\sqrt{i \log i}} < \infty, \end{eqnarray*} $

where the last inequality follows from (2.1). Thus, by the Borel-Cantelli lemma, we have

$ \begin{eqnarray}\label{a3} \frac{1}{\sqrt{2^k \log 2^k}} \sum\limits_{i=1}^{2^k}( |Z_i|+ \mathbb E (|Z_i| | \mathscr F_{i-1})) \to 0 \ \ \text{a.s.}. \end{eqnarray} $ (2.4)
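The dyadic tail estimate used in this step can also be spot-checked numerically (an illustration only, with the paper's convention $\log x = \ln\max\{x,e\}$):

```python
import math

def log_(x):
    # ln max(x, e)
    return max(math.log(x), 1.0)

# Check the dyadic tail bound:
#   sum_{k : 2^k >= i} 1/sqrt(2^k log 2^k) <= (sqrt(2)/(sqrt(2)-1)) / sqrt(i log i).
C = math.sqrt(2.0) / (math.sqrt(2.0) - 1.0)
for i in range(1, 5001):
    # truncate the tail at k = 199; the remainder is negligible and
    # truncation only decreases the left-hand side
    tail = sum(1.0 / math.sqrt(2.0 ** k * log_(2.0 ** k))
               for k in range(1, 200) if 2 ** k >= i)
    assert tail <= C / math.sqrt(i * log_(i)) + 1e-12

print("dyadic tail bound verified")
```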

From condition (1.2), we get the following inequality

$ \begin{eqnarray*}\label{a4} &&\max\limits_{2^k \le n < 2^{k+1}} \left|\sum\limits_{i=1}^{n} a_{ni}X_i^{''}\right|\nonumber\\ &\le&\frac{1}{\sqrt{2^{k+1} \log 2^k}} \sum\limits_{i=1}^{2^{k+1}-1}( |Z_i|+ \mathbb E (|Z_i| | \mathscr F_{i-1})) \nonumber\\ &\le&\frac{C}{\sqrt{2^{k+1} \log 2^{k+1}}} \sum\limits_{i=1}^{2^{k+1}}( |Z_i|+ \mathbb E (|Z_i| | \mathscr F_{i-1})) \ \ \text{a.s.} \end{eqnarray*} $ (2.5)

From (2.4) and (2.5), we get

$\begin{eqnarray*} \max\limits_{2^k \le n < 2^{k+1}} \left|\sum\limits_{i=1}^{n} a_{ni}X_i^{''}\right| \to 0 \ \ \text{a.s.}, \end{eqnarray*} $

which implies (2.3). So the proof is completed.

Proof of Corollary 1.1  From the condition (1.4), there exists a sequence of positive real numbers $\{t_n\}$ with $t_n \downarrow 0$ such that

$\begin{eqnarray*} \max\limits_{1 \le i \le n} | a_{ni} | \le \frac{t_n}{\sqrt{2n \log n}}.\end{eqnarray*} $

By Theorem 1.1, we get

$\begin{eqnarray*} \limsup\limits_{n\to \infty}\frac{1}{t_n}\sum\limits_{i=1}^{n} a_{ni}X_i \le 1\ \ \ \text{a.s.}, \end{eqnarray*} $

which implies

$\begin{eqnarray*} \limsup\limits_{n\to \infty}\sum\limits_{i=1}^n a_{ni}X_i \le 0\ \ \ \text{a.s.}.\end{eqnarray*} $

Replacing $X_i$ by $- X_i$, we can get

$\begin{eqnarray*} \liminf\limits_{n\to \infty}\sum\limits_{i=1}^n a_{ni}X_i \ge 0\ \ \ \text{a.s.}.\end{eqnarray*} $

Thus the desired result can be obtained.

References
[1] Choi B D, Sung S H. Almost sure convergence theorems of weighted sums of random variables[J]. Stoch. Anal. Appl., 1987, 5(4): 365–377. DOI:10.1080/07362998708809124
[2] Chow Y S. Some convergence theorems for independent random variables[J]. Ann. Math. Statist., 1966, 37(6): 1482–1493. DOI:10.1214/aoms/1177699140
[3] Chow Y S, Lai T L. Limiting behavior of weighted sums of independent random variables[J]. Ann. Probab., 1973, 1(5): 810–824. DOI:10.1214/aop/1176996847
[4] Deng Aijiao, Liu Jingjun. Almost sure convergence and complete convergence for the weighted sums of martingale differences[J]. Wuhan Univ. J. Nat. Sci., 1999, 4(2): 278–284.
[5] Joag-Dev K, Proschan F. Negative association of random variables, with applications[J]. Ann. Statist., 1983, 11(1): 286–295. DOI:10.1214/aos/1176346079
[6] Li Guoliang. Bernstein inequality of the sequence of martingale differences and its applications[J]. J. of Math. (PRC)., 2006, 26(1): 103–108.
[7] Li Deli, Bhaskara Rao M, Jiang Tiefeng, Wang Xiangchen. Complete convergence and almost sure convergence of weighted sums of random variables[J]. J. Theor. Probab., 1995, 8(1): 49–76. DOI:10.1007/BF02213454
[8] Stout W F. Some results on the complete and almost sure convergence of linear combinations of independent random variables and martingale differences[J]. Ann. Math. Statist., 1968, 39(5): 1549–1562. DOI:10.1214/aoms/1177698136
[9] Sung S H. Almost sure convergence for weighted sums of i.i.d. random variables (II)[J]. Bull. Korean Math. Soc., 1996, 33(3): 419–425.
[10] Sung S H. Almost sure convergence for weighted sums of i.i.d. random variables[J]. J. Korean Math. Soc., 1997, 34(1): 57–67.
[11] Teicher H. Almost certain convergence in double arrays[J]. Probab. Theory Rel., 1985, 69(3): 331–345.
[12] Thrum R. A remark on almost sure convergence of weighted sums[J]. Probab. Theory Rel., 1987, 75(3): 425–430. DOI:10.1007/BF00318709
[13] Tien N D, Hung N V. On the convergence of weighted sums of martingale differences[J]. Lect. Notes. Math., 1989, 1391: 293–307. DOI:10.1007/BFb0083374
[14] Yin Xiaohong, Miao Yu, Yang Qinglong. The estimate of regression function in nonparametric regression model based on exponential martingale difference[J]. J. of Math. (PRC)., 2007, 27(3): 279–284.