Journal of Mathematics (数学杂志), 2014, Vol. 34, Issue 5: 810-819
COMPLETE MOMENT CONVERGENCE OF WEIGHTED SUMS FOR SEQUENCES OF φ-MIXING RANDOM VARIABLES
GUO Ming-le, WU Sheng-ping, XU Chun-yu    
School of Mathematics and Computer Science, Anhui Normal University, Wuhu 241003, China
Abstract: The complete moment convergence of weighted sums for φ-mixing sequences is investigated. By using moment inequality and truncation method, the sufficient conditions for complete moment convergence of weighted sums for φ-mixing sequences are obtained, which generalize the corresponding results of Ahmed et al.(2002) and Chen and Wang (2010).
Key words: φ-mixing, weighted sums, complete moment convergence, complete convergence
1 Introduction

Let $\{X_n, \, \, n\geq 1\}$ be a sequence of random variables defined on probability space $(\Omega, \mathscr{F}, P)$. Write $\mathscr{F}_{j}^{k}=\sigma\{X_i; j\leq i\leq k\}$, $1\leq j\leq k\leq \infty, $

$ \varphi(m)=\sup\limits_{k\geq 1}\{|P(B|A)-P(B)|;\, \, \, A\in \mathscr{F}_{1}^{k}, \, \, \, B\in \mathscr{F}_{k+m}^{\infty}, \, \, \, P(A)>0\}, \, \, m\geq 1. $

We call $\{X_n, \, \, n\geq 1\}$ a $\varphi$-mixing sequence if $\displaystyle\lim_{m\rightarrow\infty}\varphi(m)=0.$
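To make the definition concrete, here is a sketch (an illustration, not taken from the paper) for a stationary two-state Markov chain with a hypothetical transition matrix. The quantity $d(m)=\max_i\sum_j|P^m(i,j)-\pi_j|$ dominates the conditional-minus-unconditional distances appearing in the definition of $\varphi(m)$, and it decays geometrically, so such a chain is $\varphi$-mixing.

```python
import numpy as np

# Illustration (not from the paper): for a stationary two-state Markov
# chain, phi(m) is controlled by how fast the m-step transition
# probabilities approach the stationary distribution pi. We track the
# proxy d(m) = max_i sum_j |P^m(i,j) - pi_j|, which decays geometrically.

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])  # hypothetical transition matrix

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

def mixing_proxy(m: int) -> float:
    """max_i sum_j |P^m(i,j) - pi_j|, an upper proxy for phi(m)."""
    Pm = np.linalg.matrix_power(P, m)
    return float(np.max(np.abs(Pm - pi).sum(axis=1)))

ds = [mixing_proxy(m) for m in (1, 5, 10, 20)]
print(ds)  # decays geometrically (second eigenvalue of P is 0.7)
```

Here $\varphi(m)\rightarrow 0$ at the rate of the second eigenvalue of the transition matrix.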

It is obvious that $\varphi(m)=0$ for every $m\geq 1$ when the sequence is independent, so independent sequences are a special case of $\varphi$-mixing sequences. $\varphi$-mixing covers a wide class of dependent sequences and has valuable applications. Many authors have studied convergence properties of $\varphi$-mixing random variables; we refer the reader to Shao [1] for a moment inequality, Wang et al. [2] for the strong law of large numbers and growth rates, and Kim and Ko [3], Chen and Wang [4], Guo and He [5] for complete moment convergence.

A sequence of random variables $\{X_n, n\geq 1\}$ is said to converge completely to a constant $a$ if for any $\varepsilon>0$, $\sum\limits_{n=1}^{\infty}P(|X_n-a|>\varepsilon) < \infty.$ This notion was first introduced by Hsu and Robbins [6].
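As a standard illustration (not part of the paper), take $X_1, X_2, \cdots$ i.i.d. with values in $[0, 1]$ and mean $\mu$. Hoeffding's inequality shows that the sequence of sample means converges completely to $\mu$:

```latex
P\Bigl(\Bigl|\frac{1}{n}\sum_{i=1}^{n}X_i-\mu\Bigr|>\varepsilon\Bigr)
  \le 2e^{-2n\varepsilon^{2}},
\qquad
\sum_{n=1}^{\infty}2e^{-2n\varepsilon^{2}}
  =\frac{2e^{-2\varepsilon^{2}}}{1-e^{-2\varepsilon^{2}}}<\infty,
```

since the geometric series on the right is finite for every $\varepsilon>0$, in accordance with the theorem of Hsu and Robbins.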

In view of the Borel-Cantelli lemma, complete convergence implies that $X_n\rightarrow a$ almost surely. Therefore, complete convergence is a very important tool in establishing the almost sure convergence of sums of random variables as well as of weighted sums. The converse theorem was proved by Erdős [7]. This result has been generalized and extended in several directions; see Baum and Katz [8], Gut [9], Taylor et al. [10] and Cai and Xu [11]. In particular, Ahmed et al. [12] obtained the following result in a Banach space setting.

Theorem A  Let $\{X_{ni}; i\geq 1, n\geq 1 \}$ be an array of rowwise independent random elements in a separable real Banach space $(B, \parallel\cdot\parallel)$, and let $P(\parallel X_{ni}\parallel>x)\leq CP(| X|>x)$ for some random variable $X$, some constant $C$, and all $n, i$ and $x>0$. Suppose that $\{a_{ni}, i\geq 1, n\geq 1 \}$ is an array of constants such that $\displaystyle \sup_{i\geq 1}|a_{ni}|=O(n^{-r})$ for some $r>0$ and $ \displaystyle \sum_{i=1}^\infty |a_{ni}|=O(n^{\alpha})$ for some $\alpha\in [0, r)$. Let $\beta$ be such that $\alpha+\beta\neq-1$ and fix $\delta>0$ such that $1+\alpha /r < \delta\leq 2$. Denote $s=\max(1+(\alpha+\beta+1)/r, \, \, \delta).$ If $ E|X|^s < \infty\, \, \, \, {\rm and} \, \, \, \, S_n=\sum\limits_{i=1}^\infty a_{ni}X_{ni}\rightarrow 0\, \, {\rm in } \, \, {\rm probability}, $ then $\displaystyle \sum_{n=1}^\infty n^\beta P(\parallel S_n\parallel>\epsilon) < \infty$ for all $\epsilon >0$.

The concept of complete moment convergence was first introduced by Chow [13]. Wang and Su [14] extended and generalized Chow's result to a Rademacher type $p$ $(1 < p < 2)$ Banach space. Recently, Chen and Wang [4] obtained the following complete $q$th moment convergence result for $\varphi$-mixing sequences.

Theorem B  Let $\{X, X_n, n\geq 1\}$ be a sequence of identically distributed $\varphi$-mixing random variables and denote $S_n=\sum\limits_{i=1}^nX_i, n\geq 1$. Suppose that $r>1, 0 < p < 2, q>0$. Then the following statements are equivalent:

$ \begin{eqnarray}&& \left\{\begin{aligned}&E|X|^{q}<\infty, &\mbox{if}\, \, q>rp, \\ &E|X|^{rp}\log(1+|X|)<\infty, &\mbox{if}\, \, q=rp, \\&E|X|^{rp}<\infty, &\mbox{if}\, \, 0<q<rp, \end{aligned} \right. \end{eqnarray} $ (1.1)
$ \begin{eqnarray} \sum\limits_{n=1}^\infty n^{r-2-q/p}E(\max\limits_{1\leq k\leq n}|S_k-kb|-\epsilon n^{1/p} )_+^q<\infty, \, \, \, \, \, \, \, \, \, \forall \epsilon>0, \end{eqnarray} $ (1.2)

where and in the following $x_+=x$ if $x \geq 0$, $x_+=0$ if $x < 0$, and $x^q_+$ means $(x_+)^q$; $b=EX$ if $rp\geq 1$ and $b = 0$ if $0 < rp < 1$.

The main purpose of this paper is to revisit the above results for weighted sums of $\varphi$-mixing sequences: the result of Ahmed et al. [12] is extended to the $\varphi$-mixing case, and the result of Chen and Wang [4] is extended to weighted sums.

For the proofs of the main results, we restate a few definitions and lemmas for easy reference. Throughout this paper, the symbol $C$ denotes a positive constant which is not necessarily the same at each appearance, $I(A)$ denotes the indicator function of $A$, $[x]$ denotes the largest integer not exceeding $x$, and for a finite set $B$, the symbol $\sharp B$ denotes the number of elements of $B$. We write $a_n\ll b_n$ if there exists a constant $C>0$ such that $a_n\leq C b_n$ for sufficiently large $n$.

Definition 1.1  A real-valued function $l(x)$, positive and measurable on $[A, \infty)$ for some $A>0$, is said to be slowly varying if $\displaystyle \lim_{x\rightarrow\infty} \dfrac{l(x\lambda)}{l(x)}=1$ for each $\lambda>0$.
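For instance, $l(x)=\log x$ is slowly varying. A quick numerical check (an illustration, not part of the paper) evaluates the ratio $l(\lambda x)/l(x)$ for $\lambda=5$ along $x=10^k$ and watches it approach $1$:

```python
import math

# Numerical illustration (not from the paper): l(x) = log x is slowly
# varying, i.e. l(lambda * x) / l(x) -> 1 as x -> infinity for each
# fixed lambda > 0.
def ratio(l, x, lam):
    """Return l(lam * x) / l(x)."""
    return l(lam * x) / l(x)

# Ratios for lambda = 5 along x = 10^k; they decrease toward 1.
vals = [ratio(math.log, 10.0 ** k, 5.0) for k in (2, 4, 8, 16)]
print(vals)
```

By contrast, a power $l(x)=x^{\delta}$ with $\delta\neq 0$ gives the constant ratio $\lambda^{\delta}\neq 1$, so it is not slowly varying.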

By the properties of slowly varying functions and Fubini's theorem, we can easily prove the following lemma; we omit the details of the proof.

Lemma 1.1  Let $X$ be a random variable and $l(x)>0$ be a slowly varying function. Then

(ⅰ) $ \displaystyle \sum_{n=1}^\infty n^{-1} E|X|^\alpha I(|X|>n^\gamma)\leq CE|X|^{\alpha}\log(1+|X|) $ for any $ \alpha\geq 0, \gamma>0$;

(ⅱ) $ \displaystyle \sum_{n=1}^\infty n^\beta l(n) E|X|^\alpha I(|X|>n^\gamma)\leq CE|X|^{\alpha+(\beta+1)/\gamma}l(|X|^{1/\gamma}) $ for any $\beta>-1, \, \alpha\geq 0, \gamma>0$;

(ⅲ) $\displaystyle \sum_{n=1}^\infty n^\beta l(n) E|X|^\alpha I(|X|\leq n^\gamma)\leq CE|X|^{\alpha+(\beta+1)/\gamma}l(|X|^{1/\gamma})$ for any $\beta < -1, \, \alpha\geq 0, \gamma>0.$
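For instance, part (ⅰ) can be obtained by interchanging the sum and the expectation (Fubini's theorem): since $|X|>n^{\gamma}$ is equivalent to $n<|X|^{1/\gamma}$, and $\sum_{1\leq n<y}n^{-1}\leq C\log(1+y)$,

```latex
\sum_{n=1}^{\infty} n^{-1}\,E|X|^{\alpha}I(|X|>n^{\gamma})
 = E\Bigl[|X|^{\alpha}\sum_{1\le n<|X|^{1/\gamma}} n^{-1}\Bigr]
 \le C\,E\bigl[|X|^{\alpha}\log\bigl(1+|X|^{1/\gamma}\bigr)\bigr]
 \le C\,E|X|^{\alpha}\log(1+|X|),
```

where the last step uses $\log(1+|X|^{1/\gamma})\leq C_{\gamma}\log(1+|X|)$ for fixed $\gamma>0$; parts (ⅱ) and (ⅲ) follow in the same way from the corresponding partial-sum estimates for $n^{\beta}l(n)$ with slowly varying $l$.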

The following two lemmas will play an important role in the proofs of our main results. They are due to Shao [1].

Lemma 1.2  Let $\{X_i, i\geq 1\}$ be a $\varphi$-mixing sequence with mean zero and $E|X_i|^2 < \infty$ for all $ i\geq 1$. Then for all $n\geq 1$ and $k\geq 0$, we have

$ \begin{equation} E(\sum\limits_{i=k+1}^{k+n}X_i)^2\leq 8000n\exp\{6\sum\limits_{i=1}^{[\log n]}\varphi^{ 1/2}(2^i)\}\max\limits_{k<i\leq k+n}EX_i^2.\nonumber \end{equation} $

Lemma 1.3  Let $\{X_i, i\geq 1\}$ be a $\varphi$-mixing sequence. Suppose that there exists an array $\{C_{nk}, \, \, k\geq 0, n\geq 1\}$ of positive numbers such that $E(\sum_{i=k+1}^{k+m}X_i)^2\leq C_{nk}$ for any $k\geq 0, n\geq 1, m\leq n$. Then for any $q\geq 2$, there exists $C=C(q, \varphi(\cdot))$ such that

$ \begin{equation} E\max\limits_{1\leq j\leq n}|\sum\limits_{i=k+1}^{k+j}X_i|^q\leq C[C_{nk}^{q/2}+E(\max\limits_{k<i\leq k+n}|X_i|^q)].\nonumber \end{equation} $

Lemma 1.4  (see [15]) Let $\{X_i, i\geq 1\}$ be a $\varphi$-mixing sequence with $EX_i=0, \, \, EX_i^2 < \infty$ and $ \displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Then $ E|\sum\limits_{i=1}^nX_i|^2\leq C\sum\limits_{i=1}^nEX_i^2.$
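As a quick sanity check (an illustration, not from the paper): in the independent case every $\varphi(m)=0$, the summability condition holds trivially, and the inequality of Lemma 1.4 holds with $C=1$ and equality, since the variance of a sum of independent mean-zero variables is the sum of the variances. A short Monte Carlo sketch:

```python
import numpy as np

# Sanity check (illustration, not from the paper): for independent
# mean-zero variables -- the phi(m) = 0 case -- Lemma 1.4 holds with
# C = 1 and equality: E|sum_i X_i|^2 = sum_i E X_i^2.
rng = np.random.default_rng(0)
n, reps = 5, 200_000
X = rng.uniform(-1.0, 1.0, size=(reps, n))  # i.i.d. mean-zero rows

lhs = np.mean(X.sum(axis=1) ** 2)           # Monte Carlo E|S_n|^2
rhs = np.mean(X ** 2, axis=0).sum()         # Monte Carlo sum of E X_i^2
print(lhs, rhs)                             # both close to n/3
```

For genuinely dependent $\varphi$-mixing sequences the cross terms no longer vanish, and the constant $C$ absorbs their contribution.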

By Lemma 1.3 and Lemma 1.4, we deduce the following lemma.

Lemma 1.5  Under the conditions of Lemma 1.4, for any $q\geq 2$ there exists $C=C(q, \varphi(\cdot))$ such that $\displaystyle E\sup_{1\leq j\leq n}|\sum_{i=1}^{j}X_i|^q\leq C\{(\sum_{i=1}^n EX_i^2)^{q/2}+\sum_{i=1}^n E|X_i|^q\}.$

By monotone convergence theorem and Lemma 1.5, we can obtain the following lemma.

Lemma 1.6  Under the conditions of Lemma 1.4, for any $q\geq 2$ there exists $C=C(q, \varphi(\cdot))$ such that $\displaystyle E\sup_{j\geq 1}|\sum_{i=1}^{j}X_i|^q\leq C\{(\sum_{i=1}^\infty EX_i^2)^{q/2}+\sum_{i=1}^\infty E|X_i|^q\}.$

2 Main Results and Proofs

Theorem 2.1  Let $\{X, X_{n}, n\geq 1 \}$ be a sequence of identically distributed $\varphi$-mixing random variables with $EX=0$ and $\displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Let $\{a_{ni}, 1\leq i\leq n, n\geq 1 \}$ be an array of constants such that

$ \begin{eqnarray}\displaystyle \max\limits_{1\leq i\leq n}|a_{ni}|\ll n^{-r} \, \, \, \, {\rm for} \, \, \, {\rm some} \, \, \, r>0\end{eqnarray} $ (2.1)

and

$ \begin{eqnarray} \displaystyle \sum\limits_{i=1}^n |a_{ni}|\ll n^{\alpha} \, \, \, \, {\rm for} \, \, \, {\rm some} \, \, \, \alpha\in [0, r).\end{eqnarray} $ (2.2)

Let $\beta>-1$ and $s=1+\displaystyle {(\alpha+\beta+1)}/r$. If

$ \begin{eqnarray}\left\{\begin{aligned}&E|X|^{q}<\infty, &\mbox{if}\, \, q>s, \\ &E|X|^{s}\log(1+|X|)<\infty, &\mbox{if}\, \, q=s, \\ &E|X|^{s}<\infty, &\mbox{if}\, \, 1\leq q<s, \end{aligned} \right.\end{eqnarray} $ (2.3)

then

$ \begin{equation} \displaystyle\sum\limits_{n=1}^{\infty}n^\beta E(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^{k}a_{ni}X_{i}|-\epsilon)^q_+<\infty\, \, \, {\rm for}\, \, \, {\rm all}\, \, \, \epsilon >0.\end{equation} $ (2.4)

Proof  Without loss of generality, from (2.1) and (2.2), we can assume

$ \begin{eqnarray}\displaystyle \max\limits_{1\leq i\leq n}|a_{ni}|\leq n^{-r}, \, \, \, \displaystyle \sum\limits_{i=1}^n |a_{ni}|\leq n^{\alpha}.\end{eqnarray} $ (2.5)

It is obvious that

$ \begin{eqnarray} &&\sum\limits_{n=1}^{\infty}n^\beta E(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^{k}a_{ni}X_{i}|-\epsilon)^q_+ =\sum\limits_{n=1}^{\infty}n^\beta\int_0^\infty P(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^{k}a_{ni}X_{i}|>\epsilon+x^{1/q})\mathrm{d}x\nonumber\\ &\leq& \sum\limits_{n=1}^{\infty}n^\beta P(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^{k}a_{ni}X_{i}|>\epsilon)+\sum\limits_{n=1}^{\infty}n^\beta\int_1^\infty P(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^{k}a_{ni}X_{i}|>x^{1/q})\mathrm{d}x =:I_1+I_2.\nonumber \end{eqnarray} $

Thus, it suffices to show that $I_1 < \infty$ and $I_2 < \infty$. We prove only $I_2 < \infty$; the proof of $I_1 < \infty$ is analogous. Set, for all $n\geq 1$ and $1\leq i\leq n$, $ X_{ni}=a_{ni}X_iI(|a_{ni}X_i|\leq x^{1/q}).$ Note that

$ \begin{eqnarray} I_2&\leq &\sum\limits_{n=1}^{\infty}n^\beta\int_1^\infty \sum\limits_{i=1}^{n}P(|a_{ni}X_i|> x^{1/q})\mathrm{d}x+ \sum\limits_{n=1}^{\infty}n^\beta\int_1^\infty P(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^{k}X_{ni}|>x^{1/q})\mathrm{d}x\nonumber\\ &=:& I_3+I_4.\nonumber\end{eqnarray} $

For any $q\geq 1, $ by (2.5), we have

$ \begin{eqnarray} \sum\limits_{i=1}^{n}|a_{ni}|^q=\sum\limits_{i=1}^{n}|a_{ni}||a_{ni}|^{q-1}\leq n^{-r(q-1)}\sum\limits_{i=1}^{n}|a_{ni}|\leq n^{\alpha-r(q-1)}. \end{eqnarray} $ (2.6)

For $I_3, $ noting that $\displaystyle \int_1^\infty P\left(\left|a_{ni}X_i\right|> x^{1/q}\right)\mathrm{d}x\leq E|a_{ni}X_i|^qI(|a_{ni}X_i|>1), $ by Lemma 1.1, (2.3) and (2.6), we have

$ \begin{eqnarray}I_3&\leq &\sum\limits_{n=1}^{\infty}n^\beta\sum\limits_{i=1}^{n}E|a_{ni}X_i|^qI(|a_{ni}X_i|>1) \leq \sum\limits_{n=1}^{\infty}n^\beta\sum\limits_{i=1}^{n}|a_{ni}|^qE|X|^qI(|X|>n^r)\nonumber\\ &\leq&\sum\limits_{n=1}^{\infty}n^{\beta+ \alpha-r(q-1)}E|X|^qI(|X|>n^r)\nonumber \\ &\ll&\left\{\begin{aligned}& \sum\limits_{n=1}^{\infty}n^{\beta+ \alpha-r(q-1)}E|X|^q, &\mbox{if}\, \, q>s, \\ &\sum\limits_{n=1}^{\infty}n^{-1}E|X|^sI(|X|>n^r), &\mbox{if}\, \, q=s, \\ &\sum\limits_{n=1}^{\infty}n^{\beta+ \alpha-r(q-1)}E|X|^qI(|X|>n^r), &\mbox{if}\, \, 1\leq q<s, \end{aligned} \right.\nonumber\\ &\ll&\left\{\begin{aligned}& \sum\limits_{n=1}^{\infty}n^{\beta+ \alpha-r(q-1)}, &\mbox{if}\, \, q>s, \\ &E|X|^s\log(1+|X|), &\mbox{if}\, \, q=s, \\ &E|X|^s, &\mbox{if}\, \, 1\leq q<s, \end{aligned} \right.\nonumber\\ &<&\infty. \end{eqnarray} $ (2.7)

Next we deal with $I_4$. We first verify that

$ \begin{eqnarray} \sup\limits_{x\geq 1}x^{-1/q}\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^{k} EX_{ni}|\rightarrow 0\, \, \mbox{as}\, \, n\rightarrow \infty. \end{eqnarray} $ (2.8)

Since (2.3) implies $E|X|^{1+\alpha/r} < \infty, $ we have by $EX=0$ that

$ \begin{eqnarray} &&\sup\limits_{x\geq 1}x^{-1/q}\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^{k} EX_{ni}|=\sup\limits_{x\geq 1}x^{-1/q}\max\limits_{1\leq k\leq n}| \sum\limits_{i=1}^{k} Ea_{ni}X_iI(|a_{ni}X_i|> x^{1/q})|\nonumber\\ &\leq&\sum\limits_{i=1}^{n}E|a_{ni}X_i|I(|a_{ni}X_i|> 1)\leq n^{\alpha} E|X|I(|X|>n^{r})\nonumber\\ &\leq&E|X|^{1+\alpha/r}I(|X|>n^{r})\rightarrow 0\, \, \mbox{as}\, \, n\rightarrow \infty.\nonumber \end{eqnarray} $

Thus, to prove $I_4 < \infty, $ we need only to show that

$ \begin{eqnarray} I_5:=\sum\limits_{n=1}^{\infty}n^\beta\int_1^\infty P(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^{k}(X_{ni}-EX_{ni})|>x^{1/q})\mathrm{d}x<\infty.\nonumber \end{eqnarray} $

By Lemma 1.5, Markov's inequality and $C_r$ inequality, for any $t\geq 2$, we have

$ \begin{eqnarray*} I_5&\leq&\sum\limits_{n=1}^{\infty}n^\beta\int_1^\infty x^{-t/q}E\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^{k}(X_{ni}-EX_{ni})|^t\mathrm{d}x\\ &\ll& \sum\limits_{n=1}^{\infty}n^\beta \int_1^\infty x^{-t/q} (\sum\limits_{i=1}^{n}Ea_{ni}^2X_i^2I(|a_{ni}X_i|\leq x^{1/q}) )^{t/2}\mathrm{d}x\\ &&+\sum\limits_{n=1}^{\infty}n^\beta \int_1^\infty x^{-t/q} \sum\limits_{i=1}^{n}E|a_{ni}X_i|^tI(|a_{ni}X_i|\leq x^{1/q}) \mathrm{d}x\nonumber\\ &=:&I_6+I_7. \nonumber\end{eqnarray*} $

For $I_6$, since $\beta>-1, $ we can choose $s' < 2$ such that $1+\alpha/r < s' < s.$ Taking sufficiently large $t $ such that $-s't/(2q) < -1$ and $\beta+(\alpha-r(s'-1))t/2 < -1$, by (2.3), (2.5) and (2.6) we obtain

$ \begin{eqnarray} I_6&=&\sum\limits_{n=1}^{\infty}n^\beta \int_1^\infty x^{-t/q} (\sum\limits_{i=1}^{n}E|a_{ni}X_i|^{s'}|a_{ni}X_i|^{2-s'}I(|a_{ni}X_i|\leq x^{1/q}) )^{t/2}\mathrm{d}x\nonumber\\ &\leq&\sum\limits_{n=1}^{\infty}n^\beta \int_1^\infty x^{-s't/(2q)} \left(\sum\limits_{i=1}^{n}|a_{ni}|^{s'}E|X|^{s'} \right)^{t/2}\mathrm{d}x \ll \sum\limits_{n=1}^{\infty}n^{\beta+(\alpha-r(s'-1))t/2 }<\infty. \nonumber \end{eqnarray} $

Finally, we deal with $I_7$. Set

$ I_{nj}=\{i\geq 1: (n(j+1))^{-r}<|a_{ni}|\leq (nj)^{-r}\}, \, \, \, j=1, 2, \cdots. $

Then $\bigcup_{j\geq 1}I_{nj}=\mathbb{N}$, where $\mathbb{N}$ denotes the set of positive integers. Note also that for all $k\geq 1, \, \, n\geq 1, \, \, t\geq 1, $

$ \begin{eqnarray*} n^{\alpha}&\geq& \sum\limits_{i=1}^\infty |a_{ni}|=\sum\limits_{j=1}^\infty\sum\limits_{i\in I_{nj}}|a_{ni}| \geq \sum\limits_{j=1}^\infty(\sharp{I_{nj}})(n(j+1))^{-r}\\ &\geq& n^{-r}\sum\limits_{j=k}^\infty(\sharp{I_{nj}})(j+1)^{-rt}(k+1)^{rt-r}.\end{eqnarray*} $

Hence, we have

$ \begin{eqnarray}\sum\limits_{j=k}^\infty(\sharp I_{nj})j^{-rt}\leq C n^{\alpha+r}k^{r-rt}.\end{eqnarray} $ (2.9)

Note that

$ \begin{eqnarray} I_7&=&\sum\limits_{n=1}^{\infty}n^\beta \int_1^\infty x^{-t/q} \sum\limits_{j=1}^\infty\sum\limits_{i\in I_{nj}}E|a_{ni}X|^tI(|a_{ni}X|\leq x^{1/q})\mathrm{d}x\nonumber\\ &\leq&\sum\limits_{n=1}^{\infty}n^{\beta}\sum\limits_{j=1}^\infty (\sharp I_{nj})(nj)^{-rt}\int_1^\infty x^{-t/q}E|X|^tI(|X|\leq x^{1/q}n^r(j+1)^r)\mathrm{d}x\nonumber\\& =&q\sum\limits_{n=1}^{\infty}n^{\beta-rq}\sum\limits_{j=1}^\infty (\sharp I_{nj})j^{-rt} \sum\limits_{k=n}^{\infty}\int_{k^r}^{(k+1)^r} y^{-t+q-1}E|X|^tI(|X|\leq y(j+1)^r)\mathrm{d}y \nonumber\\& \ll&\sum\limits_{n=1}^{\infty}n^{\beta-rq}\sum\limits_{j=1}^\infty (\sharp I_{nj})j^{-rt} \sum\limits_{k=n}^{\infty} k^{-rt+rq-1} E|X|^tI(|X|\leq (k+1)^r(j+1)^r) \nonumber\\& =&\sum\limits_{n=1}^{\infty}n^{\beta-rq}\sum\limits_{j=1}^\infty (\sharp I_{nj})j^{-rt} \sum\limits_{k=n}^{\infty} k^{-rt+rq-1} E|X|^tI(|X|\leq 2^r(k+1)^r) \nonumber\\ &&+\sum\limits_{n=1}^{\infty}n^{\beta-rq}\sum\limits_{j=1}^\infty (\sharp I_{nj})j^{-rt} \sum\limits_{k=n}^{\infty} k^{-rt+rq-1} E|X|^tI(2^r(k+1)^r<|X|\leq (k+1)^r(j+1)^r)\nonumber\\ &=:&I_8+I_9. \end{eqnarray} $ (2.10)

For $I_8$, we choose sufficiently large $t$ such that $-rt+rq-1 < -1, \, \, \alpha+\beta-rt+r < -1, $ by Lemma 1.1 and (2.9) we have

$ \begin{eqnarray} I_8&=& \sum\limits_{n=1}^{\infty}n^{\beta-rq}\sum\limits_{k=n}^{\infty} k^{-rt+rq-1} E|X|^tI(|X|\leq 2^r(k+1)^r) \sum\limits_{j=1}^\infty (\sharp I_{nj})j^{-rt}\nonumber\\ &\ll& \sum\limits_{n=1}^{\infty}n^{\alpha+\beta-rq+r}\sum\limits_{k=n}^{\infty} k^{-rt+rq-1} E|X|^tI(|X|\leq 2^r(k+1)^r)\nonumber\\ &=& \sum\limits_{k=1}^{\infty} k^{-rt+rq-1} E|X|^tI(|X|\leq 2^r(k+1)^r) \sum\limits_{n=1}^{k}n^{\alpha+\beta-rq+r}\nonumber \\ &\ll& \left\{\begin{aligned}& \sum\limits_{k=1}^{\infty} k^{-rt+rq-1} E|X|^tI(|X|\leq 2^r(k+1)^r), &\mbox{if}\, \, q>s, \\ &\sum\limits_{k=1}^{\infty} k^{-rt+rq-1}(\log k) E|X|^tI(|X|\leq 2^r(k+1)^r), &\mbox{if}\, \, q=s, \\ &\sum\limits_{k=1}^{\infty} k^{\alpha+\beta-rt+r} E|X|^tI(|X|\leq 2^r(k+1)^r), &\mbox{if}\, \, 1\leq q<s, \end{aligned} \right.\nonumber\\ &\ll&\left\{\begin{aligned}& E|X|^q, &\mbox{if}\, \, q>s, \\ &E|X|^s\log (1+|X|), &\mbox{if}\, \, q=s, \\ &E|X|^s, &\mbox{if}\, \, 1\leq q<s, \end{aligned} \right.\nonumber \\ &<&\infty. \end{eqnarray} $ (2.11)

For $I_9$, noting that $rq-r-1>-1$ for any $q\geq s>1$ and $\alpha+\beta>-1$, by Lemma 1.1 and (2.9) we have

$ \begin{eqnarray} I_9&\leq & \sum\limits_{n=1}^{\infty}n^{\beta-rq} \sum\limits_{k=n}^\infty k^{-rt+rq-1}\sum\limits_{j=1}^\infty (\sharp I_{nj})j^{-rt} \sum\limits_{i=2(k+1)}^{(j+1)(k+1)}E|X|^tI(i^r<|X|\leq (i+1)^{r})\nonumber\\&\leq& \sum\limits_{n=1}^{\infty}n^{\beta-rq} \sum\limits_{k=n}^\infty k^{-rt+rq-1}\sum\limits_{i=2(k+1)}^{\infty}E|X|^tI(i^r<|X|\leq (i+1)^{r}) \sum\limits_{j=[i(k+1)^{-1}]-1 }^\infty (\sharp I_{nj})j^{-rt}\nonumber \\&\ll& \sum\limits_{n=1}^{\infty}n^{\beta-rq} \sum\limits_{k=n}^\infty k^{-rt+rq-1}\sum\limits_{i=2(k+1)}^{\infty}n^{r+\alpha}i^{r(1-t)}k^{-r(1-t)}E|X|^tI(i^r<|X| \leq (i+1)^{r})\nonumber\\&=& \sum\limits_{k=1}^\infty k^{rq-r-1} \sum\limits_{i=2(k+1)}^{\infty}i^{r(1-t)} E|X|^tI(i^r<|X|\leq (i+1)^{r}) \sum\limits_{n=1}^{k}n^{\alpha+\beta-rq+r}\nonumber\\ &\ll& \left\{\begin{aligned}& \sum\limits_{k=1}^\infty k^{rq-r-1} \sum\limits_{i=2(k+1)}^{\infty}i^{r(1-t)} E|X|^tI(i^r<|X|\leq (i+1)^{r}), &\mbox{if}\, \, q>s, \\ &\sum\limits_{k=1}^\infty k^{rq-r-1}(\log k) \sum\limits_{i=2(k+1)}^{\infty}i^{r(1-t)} E|X|^tI(i^r<|X|\leq (i+1)^{r}), &\mbox{if}\, \, q=s, \\ &\sum\limits_{k=1}^{\infty} k^{\alpha+\beta} \sum\limits_{i=2(k+1)}^{\infty}i^{r(1-t)} E|X|^tI(i^r<|X|\leq (i+1)^{r}), &\mbox{if}\, \, 1\leq q<s, \end{aligned} \right.\nonumber \\ &\ll&\left\{\begin{aligned}& E|X|^q, &\mbox{if}\, \, q>s, \\ &E|X|^s\log (1+|X|), &\mbox{if}\, \, q=s, \\ &E|X|^s, &\mbox{if}\, \, 1\leq q<s, \end{aligned} \right.\nonumber \\ &<&\infty. \end{eqnarray} $ (2.12)

This completes the proof of (2.4).

Remark 2.1  As in Remark 2.1 of Guo and Zhu [16], (2.4) implies

$ \begin{equation} \displaystyle\sum\limits_{n=1}^{\infty}n^\beta P(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^{k}a_{ni}X_{i}|>\epsilon)<\infty\, \, \, {\rm for}\, \, \, {\rm all}\, \, \, \epsilon >0.\nonumber\end{equation} $

Without imposing any extra conditions, we not only extend and improve the result of Ahmed et al. [12] from the i.i.d. setting to the $\varphi$-mixing setting, but also obtain the complete moment convergence of maximal weighted sums for $\varphi$-mixing sequences.

Theorem 2.2  Let $\{X, X_{n}, n\geq 1 \}$ be a sequence of identically distributed $\varphi$-mixing random variables with $EX=0$ and $\displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Let $\{a_{ni}, 1\leq i\leq n, n\geq 1 \}$ be an array of constants satisfying (2.1) and (2.2). If $\alpha>0$ and

$ \begin{eqnarray}\left\{\begin{aligned}&E|X|^{q}<\infty, &\mbox{if}\, \, 1+\alpha/r<q<2, \\ &E|X|^{1+\alpha/r}\log(1+|X|)<\infty, &\mbox{if}\, \, q=1+\alpha/r, \\ &E|X|^{1+\alpha/r}<\infty, &\mbox{if}\, \, 1\leq q<1+\alpha/r, \end{aligned} \right.\end{eqnarray} $ (2.13)

then

$ \begin{equation} \displaystyle\sum\limits_{n=1}^{\infty}n^{-1} E(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^{k}a_{ni}X_{i}|-\epsilon)^q_+<\infty\, \, \, {\rm for}\, \, \, {\rm all}\, \, \, \epsilon >0.\end{equation} $ (2.14)

Proof  We apply the same notation and method as in Theorem 2.1 with $\beta=-1$, and give only the parts that differ. Note that (2.13) implies that $I_3 < \infty$; it is also obvious that (2.13) implies $E|X|^{1+\alpha/r} < \infty$, so (2.8) holds. Thus, to complete the proof of (2.14), it suffices to show that

$ \begin{eqnarray} I_5:=\sum\limits_{n=1}^{\infty}n^{-1}\int_1^\infty P(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^{k}(X_{ni}-EX_{ni})|>x^{1/q})\mathrm{d}x<\infty.\nonumber \end{eqnarray} $

In fact, noting that $\alpha+\beta+1=\alpha>0$, $-2r+rq-1 < -1$, $\alpha-1-r < -1$ and (2.13), by taking $t=2$ in the proofs of (2.10), (2.11) and (2.12), we deduce that

$ \begin{eqnarray} &&\sum\limits_{n=1}^{\infty}n^{-1} \int_1^\infty x^{-2/q} \sum\limits_{i=1}^{n}E|a_{ni}X_i|^2I(|a_{ni}X_i|\leq x^{1/q}) \mathrm{d}x\nonumber\\ &\ll& \left\{\begin{aligned}&E|X|^{q}<\infty, &\mbox{if}\, \, 1+\alpha/r<q<2, \\ &E|X|^{1+\alpha/r}\log(1+|X|)<\infty, &\mbox{if}\, \, q=1+\alpha/r, \\ &E|X|^{1+\alpha/r}<\infty, &\mbox{if}\, \, 1\leq q<1+\alpha/r, \end{aligned} \right. \end{eqnarray} $ (2.15)

Then, by Markov's inequality, (2.15) and Lemma 1.5, we have

$ \begin{eqnarray} I_5 &\ll& \sum\limits_{n=1}^{\infty}n^{-1}\int_1^\infty x^{-2/q} \sum\limits_{i=1}^{n}E|a_{ni}X_i|^2I(|a_{ni}X_i|\leq x^{1/q}) \mathrm{d}x<\infty.\nonumber \end{eqnarray} $

Corollary 2.1  Let $\{X, X_n, n\geq 1\}$ be a sequence of identically distributed $\varphi$-mixing random variables with $EX=0$ and $\displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Suppose that $r>1, 1\leq p < 2, q\geq 1$. Then (1.1) implies

$ \begin{eqnarray} \sum\limits_{n=1}^\infty n^{r-2-q/p}E(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^kX_i|-\epsilon n^{1/p} )_+^q<\infty, \, \, \, \, \, \, \, \, \, \forall \epsilon>0.\nonumber\end{eqnarray} $

Proof  Take $\beta=r-2$ and $a_{ni}=n^{-1/p}$ for $1\leq i\leq n, \, \, n\geq 1, $ and apply Theorem 2.1 with the exponent $1/p$ in (2.1) and $\alpha=1-1/p$ in (2.2); then $s=1+(\alpha+\beta+1)p=rp$, so (2.3) coincides with (1.1). It is obvious that $a_{ni}$ satisfies (2.1) and (2.2). Thus, by (1.1) and Theorem 2.1, we have

$ \begin{eqnarray}&& \sum\limits_{n=1}^\infty n^{r-2-q/p}E(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^kX_i|-\epsilon n^{1/p} )_+^q \nonumber\\ &=& \sum\limits_{n=1}^\infty n^{r-2}E(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^ka_{ni}X_i|-\epsilon )_+^q <\infty. \end{eqnarray} $ (2.16)

Corollary 2.2   Let $\{X, X_n, n\geq 1\}$ be a sequence of identically distributed $\varphi$-mixing random variables with $EX=0$ and $\displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Suppose that $ 1\leq p < 2, 1\leq q < 2$. Then

$ \begin{eqnarray}\left\{\begin{aligned}&E|X|^{q}<\infty, &\mbox{if}\, \, p<q<2, \\ &E|X|^{p}\log(1+|X|)<\infty, &\mbox{if}\, \, q=p, \\ &E|X|^{p}<\infty, &\mbox{if}\, \, 1\leq q<p\end{aligned} \right. \end{eqnarray} $ (2.17)

implies $\displaystyle \sum_{n=1}^\infty n^{-1-q/p}E(\max_{1\leq k\leq n}|\sum_{i=1}^kX_i|-\epsilon n^{1/p})_+^q < \infty, \forall \epsilon>0.$

Proof   Take $\beta=-1, a_{ni}=n^{-1/p}$ for $1\leq i\leq n, \, \, n\geq 1, $ and $r=1/p, \, \, \alpha=1-1/p$ in Theorem 2.2. It is obvious that $a_{ni}$ satisfies (2.1) and (2.2). Thus, by (2.17) and Theorem 2.2, we have

$ \begin{eqnarray} &&\sum\limits_{n=1}^\infty n^{-1-q/p}E(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^kX_i|-\epsilon n^{1/p} )_+^q =\sum\limits_{n=1}^\infty n^{-1}E(\max\limits_{1\leq k\leq n}|\sum\limits_{i=1}^ka_{ni}X_i|-\epsilon )_+^q\nonumber\\ &<&\infty. \end{eqnarray} $ (2.18)

Remark 2.2  When $1\leq p < 2$, Theorem 2.1 and Theorem 2.2 recover the results of Chen and Wang [4]. Moreover, they deal with more general weights, and thus generalize and extend those of Chen and Wang [4].

Theorem 2.3  Let $\{X, X_{n}, n\geq 1 \}$ be a sequence of identically distributed $\varphi$-mixing random variables with $EX=0$ and $\displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Let $\{a_{ni}, i\geq 1, n\geq 1 \}$ be an array of constants such that

$ \begin{eqnarray}\displaystyle \max\limits_{i\geq 1}|a_{ni}|\ll n^{-r} \, \, \, \, {\rm for} \, \, \, {\rm some} \, \, \, r>0\end{eqnarray} $ (2.19)

and

$ \begin{eqnarray} \displaystyle \sum\limits_{i=1}^\infty |a_{ni}|\ll n^{\alpha} \, \, \, \, {\rm for} \, \, \, {\rm some} \, \, \, \alpha\in [0, r).\end{eqnarray} $ (2.20)

Let $\beta>-1$ and $s=1+\displaystyle {(\alpha+\beta+1)}/r$. Then (2.3) implies

$ \begin{equation} \displaystyle\sum\limits_{n=1}^{\infty}n^\beta E(\sup\limits_{k\geq 1}|\sum\limits_{i=1}^{k}a_{ni}X_{i}|-\epsilon)^q_+<\infty\, \, \, {\rm for}\, \, \, {\rm all}\, \, \, \epsilon >0.\nonumber\end{equation} $

Proof  The proof is the same as that of Theorem 2.1, except that Lemma 1.6 is used instead of Lemma 1.5; we omit the details.

Theorem 2.4   Let $\{X, X_{n}, n\geq 1 \}$ be a sequence of identically distributed $\varphi$-mixing random variables with $EX=0$ and $\displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Let $\{a_{ni}, i\geq 1, n\geq 1 \}$ be an array of constants satisfying (2.19) and (2.20). If $\alpha>0$, then (2.13) implies

$ \begin{equation} \displaystyle\sum\limits_{n=1}^{\infty}n^{-1} E(\sup\limits_{k\geq 1}|\sum\limits_{i=1}^{k}a_{ni}X_{i}|-\epsilon)^q_+<\infty\, \, \, {\rm for}\, \, \, {\rm all}\, \, \, \epsilon >0.\nonumber\end{equation} $

Proof  The proof is the same as that of Theorem 2.2, except that Lemma 1.6 is used instead of Lemma 1.5; we omit the details.

Acknowledgement  The authors are deeply grateful to the anonymous referee and the Editor for their careful reading, valuable comments and correction of several errors, which have greatly improved the quality of the paper.

References
[1] Shao Qiman. A moment inequality and its application[J]. Acta. Math. Sin., 1988, 31: 736–747.
[2] Wang Xuejun, Hu Shuhe, Shen Yan, Ling Nengxiang. Strong law of large numbers and growth rate for a class of random variable sequences[J]. Statist. Probab. Lett., 2008, 78: 3330–3337. DOI:10.1016/j.spl.2008.07.010
[3] Kim T S, Ko M H. Complete moment convergence of moving average processes under dependence assumptions[J]. Statist. Probab. Lett., 2008, 78: 839–846. DOI:10.1016/j.spl.2007.09.009
[4] Chen Pingyan, Wang Dingcheng. Complete moment convergence for sequence of identically distributed $\varphi$-mixing random variables[J]. Acta. Math. Sinica, English Series, 2010, 26: 679–690. DOI:10.1007/s10114-010-7625-6
[5] Guo Mingle, He Rundong. Complete moment convergence of moving average processes under dependence assumptions[J]. Journal of Math., 2013, 33: 203–212.
[6] Hsu P L, Robbins H. Complete convergence and the law of large numbers[J]. Proc. Nat. Acad. Sci. USA, 1947, 33: 25–31. DOI:10.1073/pnas.33.2.25
[7] Erdős P. On a theorem of Hsu and Robbins[J]. Ann. Math. Statist., 1949, 20: 286–291. DOI:10.1214/aoms/1177730037
[8] Baum L E, Katz M. Convergence rates in the law of large numbers[J]. Trans. Amer. Math. Soc., 1965, 120: 108–123. DOI:10.1090/S0002-9947-1965-0198524-1
[9] Gut A. Complete convergence and Cesàro summation for i.i.d. random variables[J]. Probab. Theory Related Fields, 1993, 97: 169–178. DOI:10.1007/BF01199318
[10] Taylor R L, Patterson R F, Bozorgnia A. A strong law of large numbers for arrays of rowwise negatively dependent random variables[J]. Stochastic Anal. Appl., 2002, 20: 643–656. DOI:10.1081/SAP-120004118
[11] Cai Guanghui, Xu Bing. Complete convergence for weighted sums of $\rho$-mixing sequences and its application[J]. Journal of Math., 2006, 26: 419–422.
[12] Ahmed S E, Antonini R G, Volodin A. On the rate of complete convergence for weighted sums of arrays of Banach space valued random elements with application to moving-average processes[J]. Statist. Probab. Lett., 2002, 58: 185–194. DOI:10.1016/S0167-7152(02)00126-8
[13] Chow Y S. On the rate of moment convergence of sample sums and extremes[J]. Bull. Inst. Math. Acad. Sinica, 1988, 16: 177–201.
[14] Wang Dingcheng, Su Chun. Moment complete convergence for B-valued iid random elements sequence[J]. Acta Math. Appl. Sinica, 2004, 27: 440–448.
[15] Yang Shanchao. Almost sure convergence of weighted sums of mixing sequence[J]. J. Sys. Sci. Math. Scis., 1995, 7: 254–265.
[16] Guo Mingle, Zhu Dongjin. Equivalent conditions of complete moment convergence of weighted sums for $\rho^*$-mixing sequence of random variables[J]. Statist. Probab. Lett., 2013, 83: 13–20. DOI:10.1016/j.spl.2012.08.015