Let $\{X_n, n\ge 1\}$ be a sequence of random variables defined on a probability space $(\Omega, \mathscr F, \mathbb P)$ and let $\{a_{ni}, 1 \le i \le n, n\ge 1 \}$ be a triangular array of constants. Weighted sums of the form $\sum_{i=1}^n a_{ni}X_i$ arise in a variety of applications, such as least squares estimation, nonparametric regression function estimation and jackknife estimation. Because of the wide applicability of weighted sums in statistics, their properties have received considerable attention.
When $\{X_n, n\ge 1\}$ is a sequence of independent identically distributed (i.i.d.) random variables, the almost sure convergence of weighted sums $\sum_{i=1}^n a_{ni}X_i$ has been studied by Choi and Sung [1], Chow [2], Chow and Lai [3], Li et al. [7], Stout [8], Sung [9, 10], Teicher [11] and Thrum [12], among others.
This paper focuses on the almost sure convergence of such weighted sums in the case where $\{X_n, n\ge 1\}$ is a martingale difference sequence. To describe the results of this paper, suppose that $(\Omega, \mathscr F, \mathbb P)$ is a probability space and let $\{\mathscr F_n, n\ge 0\}$ be a family of $\sigma$-algebras such that $\mathscr F_0 \subseteq \mathscr F_1 \subseteq \cdots \subseteq \mathscr F.$
Let $\{X_n, n\ge 1\}$ be a martingale difference sequence with respect to the filtration $\{\mathscr F_n, n\ge0\}$ with $X_0=0$, i.e.,
(i) $\mathbb E |X_n| < \infty$ for all $n \ge 1$,
(ii) $X_n$ is $\mathscr F_n$-measurable,
(iii) $\mathbb E (X_n|\mathscr F_{n-1})=0$ a.s. for every $n \ge 1$.
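As a simple illustration of (i)-(iii): if $\{\xi_n, n\ge 1\}$ are independent random variables with $\mathbb E|\xi_n|<\infty$ and $\mathbb E\,\xi_n=0$, and if $\mathscr F_0=\{\emptyset,\Omega\}$ and $\mathscr F_n=\sigma(\xi_1,\dots,\xi_n)$ for $n\ge 1$, then $\mathbb E(\xi_n|\mathscr F_{n-1})=\mathbb E\,\xi_n=0$ a.s. by independence, so $X_n=\xi_n$ defines a martingale difference sequence with respect to $\{\mathscr F_n, n\ge 0\}$; in particular, every integrable centered i.i.d. sequence fits this setting. (The symbols $\xi_n$ are used only in this example.)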
Throughout this paper, $C$ denotes a positive constant whose value may change from one appearance to the next. $1_{A}$ denotes the indicator function of the event $A$, and $\log x$ denotes $\ln \max \{x, e\}$, where $\ln$ is the natural logarithm.
Now we state our main result as follows.
Theorem 1.1 Let $\{X_n, n\ge 1\}$ be a stationary martingale difference sequence with respect to the filtration $\{\mathscr F_n, n\ge0\}$ with $X_0=0$. In addition, assume that
Let $\{a_{ni}, 1\le i \le n, n\ge 1\}$ be a triangular array of constants satisfying
Then we have
When condition (1.2) is replaced by the stronger condition (1.4), we obtain the following sharper consequence of Theorem 1.1.
Corollary 1.1 Let $\{X_n, n\ge 1\}$ be a stationary martingale difference sequence with respect to the filtration $\{\mathscr F_n, n\ge0\}$ with $X_0=0$. In addition, assume that there exists a constant $C>0$ such that $ \mathbb E (X_n^2| \mathscr F_{n-1}) < C \ \ \text{a.s.} $ If $\{a_{ni}, 1\le i \le n, n\ge 1\}$ is a triangular array of constants satisfying
then we have $ \sum\limits_{i=1}^n a_{ni}X_i \to 0\ \ \text{a.s.} $
In order to prove our main result, we need the following lemma.
Lemma 2.1 If $\mathbb E X^2 < \infty$, then for any $\epsilon >0, $ $\sum\limits_{n=1}^{\infty}\frac{1}{\sqrt{n \log n}} \mathbb E|X|1_{\{|X|> \epsilon \sqrt{\frac{n}{\log n}}\}} <\infty.$
Proof Noting that $\left\{\frac{n}{\log n}\right\}$ is an increasing sequence, we have
where the first inequality follows from the following fact:
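To indicate one way this computation can be carried out (a sketch only; the sets $A_m$ are introduced solely for this purpose), write $A_m=\bigl\{\epsilon \sqrt{m/\log m}<|X|\le \epsilon\sqrt{(m+1)/\log (m+1)}\bigr\}$ for $m\ge 1$. Since $\mathbb E X^2<\infty$, we have $|X|<\infty$ a.s., so interchanging the order of summation yields
$$\sum_{n=1}^{\infty}\frac{1}{\sqrt{n \log n}}\,\mathbb E|X|1_{\{|X|> \epsilon \sqrt{n/\log n}\}}
=\sum_{m=1}^{\infty}\Bigl(\sum_{n=1}^{m}\frac{1}{\sqrt{n \log n}}\Bigr)\mathbb E|X|1_{A_m}
\le C\sum_{m=1}^{\infty}\sqrt{\frac{m}{\log m}}\,\mathbb E|X|1_{A_m}
\le \frac{C}{\epsilon}\sum_{m=1}^{\infty}\mathbb E X^2 1_{A_m}
\le \frac{C}{\epsilon}\,\mathbb E X^2<\infty,$$
where the first inequality uses the elementary estimate $\sum\limits_{n=1}^{m}\frac{1}{\sqrt{n\log n}}\le C\sqrt{\frac{m}{\log m}}$ (by comparison with $\int_1^m\frac{dx}{\sqrt{x\log x}}$), and the second inequality uses $|X|>\epsilon\sqrt{m/\log m}$ on $A_m$.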
Proof of Theorem 1.1 By Lemma 2.1 there exists a sequence of positive real numbers $\{\epsilon_n\}$ satisfying $\epsilon_n \downarrow 0$ such that
For $1\le i \le n$, define $ Y_i=X_i1_{\{|X_i|\le \epsilon_i \sqrt{\frac{i}{\log i}}\}}, \ \ \ Z_i=X_i1_{\{|X_i|> \epsilon_i \sqrt{\frac{i}{\log i}}\}}, $ and
then $X_i=X_i^{'}+X_i^{''}$ and $\{X_i^{'}, i\ge 1\}$, $\{X_i^{''}, i\ge 1\}$ are two martingale difference sequences with respect to the filtration $\{\mathscr F_i, i\ge0\}$.
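One standard centering that produces such a decomposition (given here only as a sketch of the construction) is
$$X_i^{'}=Y_i-\mathbb E(Y_i|\mathscr F_{i-1}),\qquad X_i^{''}=Z_i-\mathbb E(Z_i|\mathscr F_{i-1}),\qquad i\ge 1.$$
Indeed, $X_i^{'}+X_i^{''}=Y_i+Z_i-\mathbb E(Y_i+Z_i|\mathscr F_{i-1})=X_i-\mathbb E(X_i|\mathscr F_{i-1})=X_i$ a.s., both $X_i^{'}$ and $X_i^{''}$ are integrable and $\mathscr F_i$-measurable, and $\mathbb E(X_i^{'}|\mathscr F_{i-1})=\mathbb E(X_i^{''}|\mathscr F_{i-1})=0$ a.s.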
For any $1 \le i \le n$, let us define $ X_{ni}= a_{ni} b_n X_i^{'}, $ where $b_n=\sqrt{2n \log n}. $ By (1.1) and (1.2), it is easy to check
and
So there exists a positive decreasing sequence $k_n\to 0$ such that
For any $\lambda >0, $ from the following elementary inequalities
and the martingale difference property, we have
By iterating the same procedure and using the properties of conditional expectation, we obtain
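For orientation, a typical chain of this type can be sketched as follows, under assumptions made only for illustration, namely that $\lambda|X_{ni}|\le 1$ and that $\sum_{i=1}^{n}\mathbb E(X_{ni}^2|\mathscr F_{i-1})\le k_n$ a.s. Using $e^{x}\le 1+x+x^{2}$ for $x\le 1$, $1+x\le e^{x}$ and $\mathbb E(X_{ni}|\mathscr F_{i-1})=0$ a.s., one gets
$$\mathbb E\bigl(e^{\lambda X_{ni}}\big|\mathscr F_{i-1}\bigr)\le 1+\lambda\,\mathbb E(X_{ni}|\mathscr F_{i-1})+\lambda^{2}\,\mathbb E(X_{ni}^{2}|\mathscr F_{i-1})\le \exp\bigl(\lambda^{2}\,\mathbb E(X_{ni}^{2}|\mathscr F_{i-1})\bigr)\ \ \text{a.s.},$$
and conditioning successively on $\mathscr F_{n-1},\dots,\mathscr F_{0}$ then gives
$$\mathbb E\exp\Bigl(\lambda\sum_{i=1}^{n}X_{ni}\Bigr)\le \exp\bigl(\lambda^{2}k_{n}\bigr).$$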
Hence, by Markov's inequality, it follows that
where we choose $\lambda =2(1+r)\log n$. Since $k_n \to 0$, we have
for all sufficiently large $n$. Then we have
Therefore by the Borel-Cantelli lemma, we have
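(Here the Borel-Cantelli lemma is used in its standard form: if $\{T_n\}$ is any sequence of random variables with $\sum_{n=1}^{\infty}\mathbb P(|T_n|>\epsilon)<\infty$ for every $\epsilon>0$, then $T_n\to 0$ a.s.; the natural candidate in this step is $T_n=\sum_{i=1}^{n}X_{ni}$, with the exponential bound applied to both $\{X_{ni}\}$ and $\{-X_{ni}\}$ to control $|T_n|$.)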
To finish the proof, it is enough to show that
By Markov's inequality, for any $r >0$, we obtain
Note that
This implies
where the last inequality follows from (2.1). Thus, by the Borel-Cantelli lemma, we have
By condition (1.2), we get the following inequality
From the inequalities (2.4) and (2.5), we get
which implies (2.3). This completes the proof.
Proof of Corollary 1.1 By condition (1.4), there exists a sequence of real numbers $\{t_n\}$ such that $ t_n \downarrow 0$ and
By Theorem 1.1, we get
which implies
Replacing $X_i$ by $-X_i$, we get
Thus the desired result follows.
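The replacement step is justified because $\{-X_n, n\ge 1\}$ is again a stationary martingale difference sequence with respect to $\{\mathscr F_n, n\ge 0\}$ with $-X_0=0$, and it satisfies the same conditional variance bound, since $\mathbb E((-X_n)^2|\mathscr F_{n-1})=\mathbb E(X_n^2|\mathscr F_{n-1})< C$ a.s.; hence the argument above applies verbatim to $\{-X_n\}$ with the same weights $\{a_{ni}\}$.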