Let $\{X_n, \, n\geq 1\}$ be a sequence of random variables defined on a probability space $(\Omega, \mathscr{F}, P)$. Write $\mathscr{F}_{j}^{k}=\sigma\{X_i;\, j\leq i\leq k\}$ for $1\leq j\leq k\leq \infty$, and define the mixing coefficients $\varphi(m)=\sup_{k\geq 1}\sup\{|P(B\mid A)-P(B)|:\, A\in \mathscr{F}_{1}^{k},\, P(A)>0,\, B\in \mathscr{F}_{k+m}^{\infty}\}$, $m\geq 1$.
We call $\{X_n, \, n\geq 1\}$ a $\varphi$-mixing sequence if $\displaystyle\lim_{m\rightarrow\infty}\varphi(m)=0.$
It is obvious that $\varphi(m)=0$ for every $m\geq 1$ when the sequence is independent, so independent sequences are a special case of $\varphi$-mixing sequences. $\varphi$-mixing covers a wide range of dependent sequences and has valuable applications. Many authors have studied convergence properties of $\varphi$-mixing random variables; we refer the reader to Shao [1] for a moment inequality, Wang et al. [2] for the strong law of large numbers and growth rates, and Kim and Ko [3], Chen and Wang [4], and Guo and He [5] for complete moment convergence.
A sequence of random variables $\{X_n, n\geq 1\}$ is said to converge completely to a constant $a$ if $\sum\limits_{n=1}^{\infty}P(|X_n-a|>\varepsilon) < \infty$ for every $\varepsilon>0$. This notion was first introduced by Hsu and Robbins [6].
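For orientation, we recall the classical Hsu-Robbins-Erdős theorem, the prototype of such results: if $\{X, X_n, n\geq 1\}$ are i.i.d. random variables with partial sums $S_n=\sum_{i=1}^n X_i$, then for every $\varepsilon>0$,
$$\sum_{n=1}^{\infty}P\left(\left|S_n/n-EX\right|>\varepsilon\right) < \infty \quad\Longleftrightarrow\quad EX^2 < \infty.$$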
In view of the Borel-Cantelli lemma, complete convergence implies that $X_n\rightarrow a$ almost surely. Complete convergence is therefore an important tool for establishing the almost sure convergence of sums and weighted sums of random variables. The converse theorem was proved by Erdős [7]. This result has been generalized and extended in several directions; see Baum and Katz [8], Gut [9], Taylor et al. [10], and Cai and Xu [11]. In particular, Ahmed et al. [12] obtained the following result in a Banach space setting.
Theorem A Let $\{X_{ni};\, i\geq 1, n\geq 1 \}$ be an array of rowwise independent random elements in a separable real Banach space $(B, \parallel\cdot\parallel)$, and suppose that $P(\parallel X_{ni}\parallel>x)\leq CP(| X|>x)$ for some random variable $X$, some constant $C$, and all $n$, $i$ and $x>0$. Let $\{a_{ni}, i\geq 1, n\geq 1 \}$ be an array of constants such that $\displaystyle \sup_{i\geq 1}|a_{ni}|=O(n^{-r})$ for some $r>0$ and $ \displaystyle \sum_{i=1}^\infty |a_{ni}|=O(n^{\alpha})$ for some $\alpha\in [0, r)$. Let $\beta$ be such that $\alpha+\beta\neq-1$, fix $\delta$ such that $1+\alpha /r < \delta\leq 2$, and set $s=\max(1+(\alpha+\beta+1)/r, \, \delta)$. If $ E|X|^s < \infty$ and $S_n=\sum\limits_{i=1}^\infty a_{ni}X_{ni}\rightarrow 0$ in probability, then $\displaystyle \sum_{n=1}^\infty n^\beta P(\parallel S_n\parallel>\epsilon) < \infty$ for all $\epsilon >0$.
The concept of complete moment convergence was first introduced by Chow [13]. Wang and Su [14] extended and generalized Chow's result to Rademacher type $p$ $(1 < p < 2)$ Banach spaces. Recently, Chen and Wang [4] obtained the following complete $q$th moment convergence result for $\varphi$-mixing sequences.
Theorem B Let $\{X, X_n, n\geq 1\}$ be a sequence of identically distributed $\varphi$-mixing random variables and denote $S_n=\sum\limits_{i=1}^nX_i$, $n\geq 1$. Suppose that $r>1$, $0 < p < 2$ and $q>0$. Then the following statements are equivalent:
where, here and in what follows, $x_+=x$ if $x \geq 0$ and $x_+=0$ if $x < 0$, $x_+^q$ means $(x_+)^q$, and $b=EX$ if $rp\geq 1$, $b = 0$ if $0 < rp < 1$.
The main purpose of this paper is to revisit the above results for weighted sums of $\varphi$-mixing sequences. The result of Ahmed et al. [12] is extended to the $\varphi$-mixing case, and the result of Chen and Wang [4] is extended to weighted sums.
For the proofs of the main results, we restate a few definitions and lemmas for easy reference. Throughout this paper, the symbol $C$ denotes a positive constant which may differ from one appearance to the next, $I(A)$ denotes the indicator function of the set $A$, and $[x]$ denotes the largest integer not exceeding $x$. For a finite set $B$, the symbol $\sharp B$ denotes the number of elements of $B$. We write $a_n\ll b_n$ if there exists a constant $C>0$ such that $a_n\leq C b_n$ for all sufficiently large $n$.
Definition 1.1 A real-valued function $l(x)$, positive and measurable on $[A, \infty)$ for some $A>0$, is said to be slowly varying if $\displaystyle \lim_{x\rightarrow\infty} \dfrac{l(x\lambda)}{l(x)}=1$ for each $\lambda>0$.
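For example, $l(x)=\log(1+x)$ is slowly varying, since for each fixed $\lambda>0$
$$\lim_{x\rightarrow\infty}\frac{\log(1+\lambda x)}{\log(1+x)}=\lim_{x\rightarrow\infty}\frac{\log\lambda+\log x+o(1)}{\log x+o(1)}=1,$$
whereas $l(x)=x^{\delta}$ with $\delta\neq 0$ is not slowly varying, because $l(\lambda x)/l(x)=\lambda^{\delta}$ for every $x$.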
By the properties of slowly varying functions and Fubini's theorem, we can easily prove the following lemma; a brief sketch of the argument is given after its statement.
Lemma 1.1 Let $X$ be a random variable and $l(x)>0$ be a slowly varying function. Then
(i) $ \displaystyle \sum_{n=1}^\infty n^{-1} E|X|^\alpha I(|X|>n^\gamma)\leq CE|X|^{\alpha}\log(1+|X|) $ for any $ \alpha\geq 0, \gamma>0$;
(ii) $ \displaystyle \sum_{n=1}^\infty n^\beta l(n) E|X|^\alpha I(|X|>n^\gamma)\leq CE|X|^{\alpha+(\beta+1)/\gamma}l(|X|^{1/\gamma}) $ for any $\beta>-1, \, \alpha\geq 0, \gamma>0$;
(iii) $\displaystyle \sum_{n=1}^\infty n^\beta l(n) E|X|^\alpha I(|X|\leq n^\gamma)\leq CE|X|^{\alpha+(\beta+1)/\gamma}l(|X|^{1/\gamma})$ for any $\beta < -1, \, \alpha\geq 0, \gamma>0.$
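To sketch (ii): since $I(|X|>n^\gamma)=I(n < |X|^{1/\gamma})$, interchanging expectation and summation (Fubini's theorem) and using the Karamata-type estimate $\sum_{n\leq x}n^\beta l(n)\leq Cx^{\beta+1}l(x)$ for $\beta>-1$ give
$$\sum_{n=1}^\infty n^\beta l(n) E|X|^\alpha I(|X|>n^\gamma)=E\Big[|X|^\alpha\sum_{n < |X|^{1/\gamma}}n^\beta l(n)\Big]\leq CE\big[|X|^{\alpha}|X|^{(\beta+1)/\gamma}l(|X|^{1/\gamma})\big].$$
Part (iii) follows in the same way from the tail estimate $\sum_{n\geq x}n^\beta l(n)\leq Cx^{\beta+1}l(x)$ for $\beta < -1$, and part (i) similarly, using $\sum_{n\leq x}n^{-1}\leq C\log(1+x)$ and $\log(1+|X|^{1/\gamma})\leq C\log(1+|X|)$.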
The following two lemmas play an important role in the proofs of our main results; both are due to Shao [1].
Lemma 1.2 Let $\{X_i, i\geq 1\}$ be a $\varphi$-mixing sequence with mean zero and $E|X_i|^2 < \infty$ for all $ i\geq 1$. Then for all $n\geq 1$ and $k\geq 0$, we have
Lemma 1.3 Let $\{X_i, i\geq 1\}$ be a $\varphi$-mixing sequence. Suppose that there exists an array $\{C_{nk}, \, \, k\geq 0, n\geq 1\}$ of positive numbers such that $E(\sum_{i=k+1}^{k+m}X_i)^2\leq C_{nk}$ for any $k\geq 0, n\geq 1, m\leq n$. Then for any $q\geq 2$, there exists $C=C(q, \varphi(\cdot))$ such that
Lemma 1.4 (see [15]) Let $\{X_i, i\geq 1\}$ be a $\varphi$-mixing sequence with $EX_i=0, \, \, EX_i^2 < \infty$ and $ \displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Then $ E|\sum\limits_{i=1}^nX_i|^2\leq C\sum\limits_{i=1}^nEX_i^2.$
By Lemma 1.3 and Lemma 1.4, we deduce the following lemma.
Lemma 1.5 Under the conditions of Lemma 1.4, for any $q\geq 2$, there exists $C=C(q, \varphi(\cdot))$ such that $\displaystyle E\sup_{1\leq j\leq n}|\sum_{i=1}^{j}X_i|^q\leq C\{(\sum_{i=1}^n EX_i^2)^{q/2}+\sum_{i=1}^n E|X_i|^q\}.$
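Indeed, a shifted block $\{X_{k+i}, i\geq 1\}$ is again $\varphi$-mixing with the same mixing coefficients, so Lemma 1.4 yields $E(\sum_{i=k+1}^{k+m}X_i)^2\leq C\sum_{i=k+1}^{k+m}EX_i^2$ for every $k\geq 0$ and $m\leq n$; one may therefore take
$$C_{nk}=C\sum_{i=k+1}^{k+n}EX_i^2$$
in Lemma 1.3, and Lemma 1.5 follows.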
By the monotone convergence theorem and Lemma 1.5, we obtain the following lemma.
Lemma 1.6 Under the conditions of Lemma 1.4, for any $q\geq 2$, there exists $C=C(q, \varphi(\cdot))$ such that $\displaystyle E\sup_{j\geq 1}|\sum_{i=1}^{j}X_i|^q\leq C\{(\sum_{i=1}^\infty EX_i^2)^{q/2}+\sum_{i=1}^\infty E|X_i|^q\}.$
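To see this, note that $\max_{1\leq j\leq n}|\sum_{i=1}^{j}X_i|^q$ increases to $\sup_{j\geq 1}|\sum_{i=1}^{j}X_i|^q$ as $n\rightarrow\infty$, so the monotone convergence theorem and Lemma 1.5 give
$$E\sup_{j\geq 1}\Big|\sum_{i=1}^{j}X_i\Big|^q=\lim_{n\rightarrow\infty}E\max_{1\leq j\leq n}\Big|\sum_{i=1}^{j}X_i\Big|^q\leq C\Big\{\Big(\sum_{i=1}^{\infty}EX_i^2\Big)^{q/2}+\sum_{i=1}^{\infty}E|X_i|^q\Big\}.$$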
Theorem 2.1 Let $\{X, X_{n}, n\geq 1 \}$ be a sequence of identically distributed $\varphi$-mixing random variables with $EX=0$ and $\displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Let $\{a_{ni}, 1\leq i\leq n, n\geq 1 \}$ be an array of constants such that
and
Let $\beta>-1$ and $s=1+\displaystyle {(\alpha+\beta+1)}/r$. If
then
Proof Without loss of generality, by (2.1) and (2.2), we may assume
It is obvious that
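In outline, this step uses the standard identity
$$E(Y-\epsilon)_+^q=\int_0^\infty P\big((Y-\epsilon)_+^q>x\big)\,\mathrm{d}x=\int_0^\infty P\big(Y>\epsilon+x^{1/q}\big)\,\mathrm{d}x,$$
valid for any random variable $Y$, $\epsilon\geq 0$ and $q>0$, presumably with $Y=\max_{1\leq k\leq n}|\sum_{i=1}^{k}a_{ni}X_i|$; splitting the integral at $x=1$ and summing with the weights $n^\beta$ produces the terms $I_1$ and $I_2$.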
Thus, it suffices to show that $I_1 < \infty$ and $I_2 < \infty$. We prove only $I_2 < \infty$; the proof of $I_1 < \infty$ is analogous. For all $n\geq 1$ and $1\leq i\leq n$, set $ X_{ni}=a_{ni}X_iI(|a_{ni}X_i|\leq x^{1/q}).$ Note that
For any $q\geq 1, $ by (2.5), we have
For $I_3, $ noting that $\displaystyle \int_1^\infty P\left(\left|a_{ni}X_i\right|> x^{1/q}\right)\mathrm{d}x\leq E|a_{ni}X_i|^qI(|a_{ni}X_i|>1), $ by Lemma 1.1, (2.3) and (2.6), we have
Next we deal with $I_4$. We first verify that
Since (2.3) implies $E|X|^{1+\alpha/r} < \infty, $ we have by $EX=0$ that
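We sketch the standard argument behind this step, assuming (as in Theorem A) that conditions (2.1) and (2.2) read $\sup_{1\leq i\leq n}|a_{ni}|=O(n^{-r})$ and $\sum_{i=1}^{n}|a_{ni}|=O(n^{\alpha})$. Since $Ea_{ni}X_i=0$, for $x\geq 1$,
$$\max_{1\leq j\leq n}\Big|\sum_{i=1}^{j}EX_{ni}\Big|\leq\sum_{i=1}^{n}E|a_{ni}X_i|I(|a_{ni}X_i|>1)\leq\sum_{i=1}^{n}|a_{ni}|\,E|X|I(|X|>C^{-1}n^{r}),$$
and since $n^{\alpha}\leq C|X|^{\alpha/r}$ on $\{|X|>C^{-1}n^{r}\}$, the right-hand side is at most $CE|X|^{1+\alpha/r}I(|X|>C^{-1}n^{r})\rightarrow 0$ as $n\rightarrow\infty$, by $E|X|^{1+\alpha/r} < \infty$.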
Thus, to prove $I_4 < \infty, $ we need only show that
By Lemma 1.5, Markov's inequality and $C_r$ inequality, for any $t\geq 2$, we have
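In outline, these tools applied to the centered summands $X_{ni}-EX_{ni}$ give, for any $t\geq 2$,
$$P\Big(\max_{1\leq j\leq n}\Big|\sum_{i=1}^{j}(X_{ni}-EX_{ni})\Big|>x^{1/q}\Big)\leq Cx^{-t/q}\Big\{\Big(\sum_{i=1}^{n}EX_{ni}^{2}\Big)^{t/2}+\sum_{i=1}^{n}E|X_{ni}|^{t}\Big\},$$
since $E|X_{ni}-EX_{ni}|^{t}\leq 2^{t}E|X_{ni}|^{t}$ by the $C_r$ inequality; integrating over $x\in[1, \infty)$ and summing over $n$ presumably produces the terms $I_6$ and $I_7$ treated below.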
For $I_6$, since $\beta>-1, $ we can choose $s' < 2$ such that $1+\alpha/r < s' < s.$ Taking $t$ sufficiently large that $-s't/(2q) < -1$ and $\beta+(\alpha-r(s'-1))t/2 < -1$, by (2.3), (2.5) and (2.6) we obtain
Finally, we deal with $I_7$. Set
Then $\cup_{j\geq 1}I_{nj}=N$, where $N$ is the set of positive integers. Note also that for all $k\geq 1, \, \, n\geq 1, \, \, t\geq 1, $
Hence, we have
Note that
For $I_8$, we choose $t$ sufficiently large that $-rt+rq-1 < -1$ and $\alpha+\beta-rt+r < -1$; by Lemma 1.1 and (2.9) we have
For $I_9$, noting that $rq-r-1>-1$ for any $q\geq s>1$ and $\alpha+\beta>-1$, by Lemma 1.1 and (2.9) we have
The proof of (2.4) is completed.
Remark 2.1 As in Remark 2.1 of Guo and Zhu [16], (2.4) implies
Without imposing any extra conditions, we not only extend and improve the result of Ahmed et al. [12] from the i.i.d. setting to the $\varphi$-mixing setting, but also obtain the complete moment convergence of maxima of weighted sums for $\varphi$-mixing sequences.
Theorem 2.2 Let $\{X, X_{n}, n\geq 1 \}$ be a sequence of identically distributed $\varphi$-mixing random variables with $EX=0$ and $\displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Let $\{a_{ni}, 1\leq i\leq n, n\geq 1 \}$ be an array of constants satisfying (2.1) and (2.2). If $\alpha>0$ and (2.13) holds, then (2.14) follows.
Proof We use the same notation and method as in the proof of Theorem 2.1, with $\beta=-1$, and indicate only the differences. Note that (2.13) implies $I_3 < \infty.$ It is also clear that (2.13) implies $E|X|^{1+\alpha/r} < \infty$, so (2.8) holds. Thus, to complete the proof of (2.14), it suffices to show that
In fact, noting that $\alpha+\beta+1>0$, $-2r+rq-1 < -1$, $\alpha-1-r < -1$ and (2.13), and taking $t=2$ in the proofs of (2.10), (2.11) and (2.12), we deduce that
Then, by Markov's inequality, (2.15) and Lemma 1.5, we have
Corollary 2.1 Let $\{X, X_n, n\geq 1\}$ be a sequence of identically distributed $\varphi$-mixing random variables with $EX=0$ and $\displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Suppose that $r>1$, $1\leq p < 2$, $q\geq 1$. Then (1.1) implies
Proof Take $\beta=r-2$, $a_{ni}=n^{-1/p}$ for $1\leq i\leq n$, $n\geq 1$, and $\alpha=1-1/p$ in Theorem 2.1 (the exponent playing the role of $r$ in (2.1) is $1/p$). It is obvious that $a_{ni}$ satisfies (2.1) and (2.2). Thus, by (1.1) and Theorem 2.1, we have
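As a quick check of the exponents (assuming, as in Theorem A, that (2.1) is the condition $\sup_{1\leq i\leq n}|a_{ni}|=O(n^{-r})$): since $\sup_{1\leq i\leq n}|a_{ni}|=n^{-1/p}$, the exponent playing the role of $r$ in Theorem 2.1 equals $1/p$, and hence
$$s=1+\frac{(1-1/p)+(r-2)+1}{1/p}=1+p\Big(r-\frac{1}{p}\Big)=rp,$$
so the moment condition (2.3) involves $E|X|^{rp}$, consistent with hypothesis (1.1).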
Corollary 2.2 Let $\{X, X_n, n\geq 1\}$ be a sequence of identically distributed $\varphi$-mixing random variables with $EX=0$ and $\displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Suppose that $ 1\leq p < 2, 1\leq q < 2$. Then
implies $\displaystyle \sum_{n=1}^\infty n^{-1-q/p}E(\max_{1\leq k\leq n}|\sum_{i=1}^kX_i|-\epsilon n^{1/p})_+^q < \infty$ for all $\epsilon>0.$
Proof Take $\beta=-1, a_{ni}=n^{-1/p}$ for $1\leq i\leq n, \, \, n\geq 1, $ and $r=1/p, \, \, \alpha=1-1/p$ in Theorem 2.2. It is obvious that $a_{ni}$ satisfies (2.1) and (2.2). Thus, by (2.17) and Theorem 2.2, we have
Remark 2.2 When $1\leq p < 2$, Theorem 2.1 and Theorem 2.2 recover the results of Chen and Wang [4]. Moreover, Theorem 2.1 and Theorem 2.2 allow more general weights, and thus generalize and extend the results of Chen and Wang [4].
Theorem 2.3 Let $\{X, X_{n}, n\geq 1 \}$ be a sequence of identically distributed $\varphi$-mixing random variables with $EX=0$ and $\displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Let $\{a_{ni}, i\geq 1, n\geq 1 \}$ be an array of constants such that
Let $\beta>-1$ and $s=1+\displaystyle {(\alpha+\beta+1)}/r$. Then (2.3) implies
Proof The proof is the same as that of Theorem 2.1, except that Lemma 1.6 is used instead of Lemma 1.5; the details are omitted.
Theorem 2.4 Let $\{X, X_{n}, n\geq 1 \}$ be a sequence of identically distributed $\varphi$-mixing random variables with $EX=0$ and $\displaystyle\sum_{n=1}^\infty\varphi^{1/2}(n) < \infty$. Let $\{a_{ni}, i\geq 1, n\geq 1 \}$ be an array of constants satisfying (2.19) and (2.20). If $\alpha>0$, then (2.13) implies
Proof The proof is the same as that of Theorem 2.2, except that Lemma 1.6 is used instead of Lemma 1.5; the details are omitted.
Acknowledgement The authors are deeply grateful to the anonymous referee and the Editor for their careful reading, valuable comments and corrections, which have greatly improved the quality of this paper.