Hsu and Robbins [1] introduced the concept of complete convergence of $\{X_n\}$. A sequence $\{X_n, n=1, 2, \cdots\}$ is said to converge completely to a constant $C$ if
$$\sum_{n=1}^{\infty}P\{|X_n-C|>\epsilon\}<\infty \quad \mbox{for all } \epsilon>0.$$
Moreover, they proved that the sequence of arithmetic means of independent identically distributed (i.i.d.) random variables converges completely to the expected value if the variance of the summands is finite. The converse theorem was proved by Erdős [2]. In view of the Borel-Cantelli lemma, complete convergence implies almost sure convergence; therefore complete convergence is a very important tool in establishing almost sure convergence. The result of Hsu-Robbins-Erdős is a fundamental theorem in probability theory and has been generalized and extended in several directions by many authors.
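To spell out the implication: if $\sum\limits_{n=1}^{\infty}P\{|X_n-C|>\epsilon\}<\infty$ for every $\epsilon>0$, then the Borel-Cantelli lemma gives
$$P\{|X_n-C|>\epsilon \ \mbox{infinitely often}\}=0 \quad \mbox{for every } \epsilon>0,$$
and letting $\epsilon\downarrow 0$ along a countable sequence yields $X_n\rightarrow C$ almost surely.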
We recall that the array $\{X_{ni}, i\geq 1, n\geq 1\}$ of random variables is said to be stochastically dominated by a random variable $X$ if there exists a positive constant $C$ such that $P\{|X_{ni}|> x\}\leq CP\{|X| > x\}$ for all $x\geq 0$, $i\geq 1$ and $n\geq 1$.
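For example, any array of identically distributed random variables is stochastically dominated by $X=X_{11}$ with $C=1$, since then
$$P\{|X_{ni}|>x\}=P\{|X|>x\} \quad \mbox{for all } x\geq 0, \ i\geq 1, \ n\geq 1.$$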
Volodin et al. [3] and Chen et al. [4] ($\beta>-1 $ and $\beta = -1$, respectively) obtained complete convergence for weighted sums of arrays of rowwise independent Banach-space-valued random elements.
Theorem 1.1 [3, 4] Suppose that $\beta\geq -1$. Let $\{X_{ni}, i\geq 1, n\geq 1 \}$ be an array of rowwise independent random elements in a real separable Banach space which are stochastically dominated by a random variable $X$. Let $\{a_{ni}, i\geq 1, n\geq 1 \}$ be an array of constants satisfying
and
for some $0<\theta\leq 2$ and $\mu$ such that $\theta+\mu/r<2$ and $1+\mu+\beta>0$. If $E|X|^{\theta+(1+\mu+\beta)/r}<\infty$ and $\sum\limits_{i=1}^{\infty}a_{ni}X_{ni}\rightarrow 0$ in probability, then
If $\beta<-1$, then (1.3) is immediate. Hence Theorem 1.1 is of interest only for $\beta\geq -1$.
Recently, Sung [5] extended Theorem 1.1 to negatively associated and negatively dependent random variables when $\theta=1$. Moreover, similar results for sequences of $\varphi$-mixing and $\rho^*$-mixing random variables were also established.
Theorem 1.2 [5] Suppose that $\beta\geq -1$. Let $\{X_{ni}, i\geq 1, n\geq 1 \}$ be an array of rowwise negatively associated random variables which are stochastically dominated by a random variable $X$. Let $\{a_{ni}, i\geq 1, n\geq 1 \}$ be an array of constants satisfying (1.1) and
If $EX_{ni}=0$ for all $i\geq 1, n\geq 1$ and
then
Guo and Zhu [6] extended Theorem 1.2 to complete moment convergence of the supremum of partial sums for arrays of negatively associated random variables when $\beta>-1$. However, the proof of Guo and Zhu [6] does not work for the case $\beta=-1$.
Theorem 1.3 [6] Under the conditions of Theorem 1.2, if $\beta>-1$, then
Wu [7] extended Theorem 1.1 to negatively dependent random variables when $\beta>-1$, and also considered the case $1+\mu+\beta=0$ ($\beta>-1$). However, the proof of Wu [7] does not work for the case $\beta=-1$.
Theorem 1.4 [7] Suppose that $\beta> -1$. Let $\{X_{ni}, i\geq 1, n\geq 1 \}$ be an array of rowwise negatively dependent random variables which are stochastically dominated by a random variable $X$. Let $\{a_{ni}, i\geq 1, n\geq 1 \}$ be an array of constants satisfying (1.1) and (1.2) for some $\theta$ and $\mu$ such that $\mu<2r$ and $0<\theta<\min\{2, 2-\mu/r\}.$ Furthermore, assume that $EX_{ni}=0$ for all $i\geq 1$ and $n \geq 1$ if $\theta+(1+\mu+\beta)/r\geq 1$. If
In this paper, we deal with more general weights and establish weaker sufficient conditions for complete moment convergence of weighted sums for arrays of negatively associated and negatively dependent random variables. Similar results for sequences of $\rho^*$-mixing random variables are also obtained. The results of Volodin et al. [3], Chen et al. [4], Sung [5], Wu [7], and Guo and Zhu [6] are improved and generalized.
For the proofs of the main results, we need to restate a few lemmas for easy reference. Throughout this paper, the symbol $C$ denotes a positive constant which is not necessarily the same in each appearance, and $I(A)$ denotes the indicator function of $A$. For a finite set $B$, the symbol $\sharp B$ denotes the number of elements of $B$. Let $a_n\ll b_n$ denote that there exists a constant $C>0$ such that $a_n\leq C b_n$ for sufficiently large $n$. Also, let $\log x$ denote $\ln\max(e, x)$.
Lemma 1.1 [5] Let the sequence $\{X_{n}, n\geq 1\}$ of random variables be stochastically dominated by a random variable $X$. Then for any $p>0$ and $x>0$,
$$E|X_n|^pI(|X_n|\leq x)\leq C\big(E|X|^pI(|X|\leq x)+x^pP\{|X|>x\}\big)$$
and
$$E|X_n|^pI(|X_n|>x)\leq CE|X|^pI(|X|>x).$$
The following lemma is well known, and its proof is standard.
Lemma 1.2 Let $X$ be a random variable. For any $\alpha>0, r>0$, the following statements hold:
(i) $\sum\limits_{n=1}^\infty n^\beta E|X|^\alpha I(|X|>n^r)\ll E|X|^{\alpha+\frac {\beta+1}r}$ for any $\beta >-1$;
(ii) $\sum\limits_{n=1}^\infty n^\beta E|X|^\alpha I(|X|\leq n^r)\ll E|X|^{\alpha+\frac {\beta+1}r}$ for any $\beta <-1$.
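A minimal sketch of (i): slicing $\{|X|>n^r\}$ into the shells $\{m^r<|X|\leq (m+1)^r\}$, $m\geq n$, and interchanging the order of summation gives
$$\sum_{n=1}^\infty n^\beta E|X|^\alpha I(|X|>n^r)=\sum_{m=1}^\infty E|X|^\alpha I(m^r<|X|\leq (m+1)^r)\sum_{n=1}^{m}n^\beta\ll \sum_{m=1}^\infty m^{\beta+1}E|X|^\alpha I(m^r<|X|\leq (m+1)^r)\ll E|X|^{\alpha+\frac {\beta+1}r},$$
since $\sum_{n=1}^{m}n^\beta\ll m^{\beta+1}$ for $\beta>-1$ and $m^{\beta+1}\leq |X|^{(\beta+1)/r}$ on the $m$-th shell. Part (ii) follows in the same way, using $\sum_{n=m+1}^{\infty}n^\beta\ll m^{\beta+1}$ for $\beta<-1$ and $|X|^{(\beta+1)/r}\geq (m+1)^{\beta+1}$ on the $m$-th shell.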
One of the most useful inequalities in probability theory is the Rosenthal-type inequality, which plays an important role in establishing complete convergence. Rosenthal-type inequalities for sequences of dependent random variables have been established by many authors.
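The typical route from a Rosenthal-type inequality to complete convergence is Markov's inequality: for any $\epsilon>0$ and $M>0$,
$$P\Big\{\max_{1\leq k\leq n}\Big|\sum_{i=1}^{k}X_i\Big|>\epsilon\Big\}\leq \epsilon^{-M}E\max_{1\leq k\leq n}\Big|\sum_{i=1}^{k}X_i\Big|^M,$$
after which the moment on the right-hand side is bounded by a Rosenthal-type inequality; this is exactly the pattern followed in the proofs below.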
The concept of negatively associated random variables was introduced by Alam and Saxena [8] and was carefully studied by Joag-Dev and Proschan [9]. A finite family of random variables $\{X_i, 1\leq i\leq n\}$ is said to be negatively associated if, for every pair of disjoint subsets $A$ and $B$ of $\{1, 2, \cdots, n\}$ and any real coordinatewise nondecreasing functions $f_1$ on $\mathbb{R}^A$ and $f_2$ on $\mathbb{R}^B$,
$$\mbox{Cov}\big(f_1(X_i, i\in A), f_2(X_j, j\in B)\big)\leq 0$$
whenever the covariance exists. An infinite family of random variables $\{X_i, -\infty<i<\infty\}$ is negatively associated if every finite subfamily is negatively associated.
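A classical example from Joag-Dev and Proschan [9]: if $(N_1, \cdots, N_k)$ has the multinomial distribution with parameters $n$ and $(p_1, \cdots, p_k)$, then $N_1, \cdots, N_k$ are negatively associated; consistent with this,
$$\mbox{Cov}(N_i, N_j)=-np_ip_j\leq 0 \quad \mbox{for } i\neq j.$$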
The following lemma is a Rosenthal-type inequality for negatively associated random variables.
Lemma 1.3 [10] Let $\{X_n, n\geq 1\}$ be a sequence of negatively associated random variables with $EX_n=0$ and $E|X_n|^p<\infty$ for any $n\geq 1$, $p\geq 1$. Then there exist constants $C_p>0$ and $D_p>0$ depending only on $p$ such that
$$E\max_{1\leq k\leq n}\Big|\sum_{i=1}^{k}X_i\Big|^p\leq C_p\sum_{i=1}^{n}E|X_i|^p+D_p\Big(\sum_{i=1}^{n}EX_i^2\Big)^{p/2}.$$
The concept of negatively dependent random variables was given by Lehmann [11]. A finite family of random variables $\{X_i, 1\leq i\leq n\}$ is said to be negatively dependent (or negatively orthant dependent) if for all real numbers $x_1, x_2, \cdots, x_n$,
$$P\{X_1\leq x_1, \cdots, X_n\leq x_n\}\leq \prod_{i=1}^{n}P\{X_i\leq x_i\} \quad \mbox{and} \quad P\{X_1>x_1, \cdots, X_n>x_n\}\leq \prod_{i=1}^{n}P\{X_i>x_i\}.$$
An infinite family of random variables is negatively dependent if every finite subfamily is negatively dependent.
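Negative association implies negative dependence: for the disjoint sets $A=\{1, \cdots, k\}$ and $B=\{k+1\}$, the functions $\prod_{i=1}^{k}I(x_i>t_i)$ and $I(x_{k+1}>t_{k+1})$ are coordinatewise nondecreasing, so the covariance inequality in the definition of negative association yields
$$P\{X_1>x_1, \cdots, X_{k+1}>x_{k+1}\}\leq P\{X_1>x_1, \cdots, X_k>x_k\}P\{X_{k+1}>x_{k+1}\},$$
and induction on $k$ gives the second of the two inequalities above; the first follows in the same way because the negatives of coordinatewise nonincreasing functions are nondecreasing.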
Thus negative association implies negative dependence, but the converse fails, so negative dependence is strictly weaker than negative association. The following lemma is a Rosenthal-type inequality for negatively dependent random variables.
Lemma 1.4 [12] Let $\{X_n, n\geq 1\}$ be a sequence of negatively dependent random variables with $EX_n=0$ and $E|X_n|^p<\infty$ for any $n\geq 1$, $p\geq 1$. Then there exist constants $C_p>0$ and $D_p>0$ depending only on $p$ such that
$$E\max_{1\leq k\leq n}\Big|\sum_{i=1}^{k}X_i\Big|^p\leq C_p\sum_{i=1}^{n}E|X_i|^p+D_p\Big(\sum_{i=1}^{n}EX_i^2\Big)^{p/2}.$$
Let $\{X_n, \, \, n\geq 1\}$ be a sequence of random variables defined on a probability space $(\Omega, \mathscr{F}, P)$. For any $S\subset \mathbb{ N}$, let $\mathscr{F}_S= \sigma(X_k, k\in S).$ Define the $\rho^*$-mixing coefficients by
$$\rho^*(k)=\sup\big\{|{\rm Corr}(f, g)|: f\in L_2(\mathscr{F}_S), \ g\in L_2(\mathscr{F}_T)\big\},$$
where the supremum is taken over all finite subsets $S, T$ of positive integers such that dist$(S, T)\geq k$. We call $\{X_n, \, \, n\geq 1\}$ a $\rho^*$-mixing sequence if there exists $k\geq 1$ such that $\rho^*(k)<1$.
Note that if $\{X_n, \, \, n\geq 1\}$ is a sequence of independent random variables, then $\rho^*(n)=0$ for all $n \geq 1$.
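Indeed, for an independent sequence, any $f\in L_2(\mathscr{F}_S)$ and $g\in L_2(\mathscr{F}_T)$ with disjoint $S$ and $T$ are independent, so
$${\rm Corr}(f, g)=0 \quad \mbox{whenever } S\cap T=\emptyset,$$
and the supremum defining $\rho^*(n)$ vanishes.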
The following lemma is a Rosenthal-type inequality for $\rho^*$-mixing random variables.
Lemma 1.5 [13, 14] Let $\{X_n, n\geq 1\}$ be a sequence of $\rho^*$-mixing random variables, and let $Y_n\in \sigma(X_n)$ with $EY_n=0$ and $E|Y_n|^p<\infty$ for all $n\geq 1$, where $p\geq 1$. Then there exist constants $C_p>0$ and $D_p>0$, depending only on $p$, $k$ and $\rho^*(k)$ with $\rho^*(k)<1$, such that
$$E\max_{1\leq k\leq n}\Big|\sum_{i=1}^{k}Y_i\Big|^p\leq C_p\sum_{i=1}^{n}E|Y_i|^p+D_p\Big(\sum_{i=1}^{n}EY_i^2\Big)^{p/2}.$$
Theorem 2.1 Suppose that $\beta\geq -1$. Let $\{X_{ni}, i\geq 1, n\geq 1 \}$ be an array of rowwise negatively associated random variables which are stochastically dominated by a random variable $X$ satisfying $E|X|^p<\infty$ for some $p>1$. Let $\{a_{ni}, i\geq 1, n\geq 1 \}$ be an array of constants satisfying (1.1) and
Furthermore, assume that
if $p\geq 2$. Let $EX_{ni}=0$ for all $i\geq 1$ and $n \geq 1$. Then
Proof Without loss of generality, we may assume that $a_{ni}>0, 1\leq i\leq n, n\geq 1$ (otherwise, use $a_{ni}^+$ and $a_{ni}^-$ instead of $a_{ni}$, and note that $a_{ni}=a_{ni}^+-a_{ni}^-$). From (1.1) and (2.1), without loss of generality, we may also assume that
For any $i\geq 1, n\geq 1$, let
$X_{ni}'=-a_{ni}^{-1}I(a_{ni}X_{ni}<-1)+ X_{ni}I(a_{ni}|X_{ni}|\leq 1)+a_{ni}^{-1}I(a_{ni}X_{ni}>1), \, \, X_{ni}''=X_{ni}- X_{ni}'.$
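Since $a_{ni}>0$, the first definition simply truncates $a_{ni}X_{ni}$ at level one:
$$a_{ni}X_{ni}'=\max\big(-1, \min(1, a_{ni}X_{ni})\big),$$
so $|a_{ni}X_{ni}'|\leq 1$, and $X_{ni}'$ is a nondecreasing function of $X_{ni}$; this is what preserves negative association after truncation.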
Noting that $EX_{ni}=0, \, |X_{ni}''|\leq |X_{ni}|I(a_{ni}|X_{ni}|>1)$ for any $ i\geq 1, \, n\geq 1$, we have
Therefore
Hence, in order to prove (2.3), it suffices to prove that $I_1<\infty$ and $I_2<\infty$. Take $\delta>0$ such that $p-\delta>\max(1, q)$. By Lemma 1.1, Lemma 1.2 and (2.4), we get that
Next, we will prove $I_1<\infty$. Noting that $p>1$, for any $M\geq p$, we obtain by Markov's inequality that
Obviously, $X_{ni}'$ is a nondecreasing function of $X_{ni}$. Therefore $\{a_{ni}X_{ni}'-Ea_{ni}X_{ni}', i\geq 1, n\geq 1\}$ is also an array of rowwise negatively associated mean-zero random variables.
Case 1 ($1<p< 2$). Taking $\delta>0$ such that $p+\delta<2$, we get by Lemma 1.1, Lemma 1.3, the $C_r$ inequality, (2.5) and (2.6) that
Set $I_{nj}=\{i: (n(j+1))^{-r}<a_{ni}\leq (nj)^{-r}\}$, $j=1, 2, \cdots$. Then $\cup_{j\geq 1}I_{nj}=\{1, 2, \cdots\}$. Note also that for all $k\geq 1, n\geq 1, M\geq q$,
Hence we have
Note that for any $p>1, \delta>0$,
By Lemma 1.2 and (2.8), we obtain that
By (2.8),
By (2.9), (2.10) and (2.11), for any $p>1, \delta>0$, we have
Combining with (2.7), we get that $I_1<\infty.$
Case 2 ($p\geq 2$). Taking sufficiently large $\delta>0$ such that $\beta-\alpha(p+\delta)/2<-1$, we get by Lemma 1.3, (2.6) and the $C_r$ inequality that
From the proofs of (2.7) and (2.12), we see that $I_{11}<\infty.$ Since $E|X|^p<\infty$ with $p\geq 2$ implies $EX^2<\infty$, by (2.2) we obtain that
Thus $I_1<\infty$.
Remark 2.1 As in Remark 2.3 of Guo and Zhu [6], (2.3) implies (1.7). Hence, when $\theta+(1+\mu+\beta)/r>1, $ Theorem 1.1 follows from Theorem 2.1 by taking $p=\theta+(1+\mu+\beta)/r, \, \, q=\theta, $ since
Hence conditions (1.1) and (2.1) are weaker than conditions (1.1) and (1.2). Theorem 2.1 not only extends the results of Volodin et al. [3] and Chen et al. [4] for independent random variables to the negatively associated case, but also provides a weaker sufficient condition for complete moment convergence of the supremum of partial sums for arrays of negatively associated random variables.
Remark 2.2 If $1+\mu+\beta>0$, Theorem 1.2 and Theorem 1.3 follow from Theorem 2.1 by taking $p=1+(1+\mu+\beta)/r$ and $q=1$. Theorem 2.1 thus extends the results of Sung [5] and Guo and Zhu [6]. Moreover, the method used to prove our main results differs from that of Sung [5] and applies efficiently to complete moment convergence for sequences of dependent random variables.
Note that conditions (1.1) and (2.1) together imply
The following theorem shows that if the moment condition of Theorem 2.1 is strengthened to $E|X|^p\log |X|<\infty$, then condition (2.1) can be replaced by the weaker condition (2.13).
Theorem 2.2 Suppose that $\beta\geq -1$. Let $\{X_{ni}, i\geq 1, n\geq 1 \}$ be an array of rowwise negatively associated random variables which are stochastically dominated by a random variable $X$ satisfying $E|X|^p\log|X|<\infty$ for some $p\geq 1$. Let $\{a_{ni}, i\geq 1, n\geq 1 \}$ be an array of constants satisfying (1.1) and (2.13). Furthermore, assume that (2.2) holds for some $\alpha>0$ if $p\geq 2$. Let $EX_{ni}=0$ for all $i\geq 1$ and $n \geq 1$. Then (2.3) holds.
Proof Using the same notation and method as in the proof of Theorem 2.1, we give only the parts that differ. Noting that $\sum\limits_{n=1}^k n^{-1}\ll \log k$ and $p\geq 1$, we have
Set $I_{nj}=\{i: (n(j+1))^{-r}<a_{ni}\leq (nj)^{-r}\}$, $j=1, 2, \cdots$. Note that for all $k\geq 1, n\geq 1, M\geq p$,
Hence we have $\displaystyle \sum_{j=k}^\infty(\sharp I_{nj})j^{-rM} \ll n^{-1-\beta+rp}k^{-r(M-p)}.$ Similar to the corresponding part of the proof of (2.12), for any $p\geq 1, \delta>0, $ we can obtain that
The rest of the proof is the same as that of Theorem 2.1 and is omitted.
Corollary 2.1 Suppose that $\beta\geq -1$. Let $\{X_{ni}, i\geq 1, n\geq 1 \}$ be an array of rowwise negatively associated random variables which are stochastically dominated by a random variable $X$. Let $\{a_{ni}, i\geq 1, n\geq 1 \}$ be an array of constants satisfying (1.1) and (1.2) for some $\theta$ and $\mu$ such that $\mu<2r$ and $1\leq\theta<\min\{2, 2-\mu/r\}.$ Furthermore, assume that $EX_{ni}=0$ for all $i\geq 1$ and $n \geq 1$. If
then (2.3) holds.
Proof If $1+\mu+\beta=0, $ we take $p=\theta$ in Theorem 2.2. If $1+\mu+\beta>0$, we take $p=\theta+(1+\mu+\beta)/r, \, \, q=\theta$ in Theorem 2.1. Hence (2.3) holds by Theorem 2.1 and Theorem 2.2.
Remark 2.3 Corollary 2.1 extends the results of Sung [5] and Guo and Zhu [6] for $\theta=1$ to $1\leq \theta< 2$.
The following theorems extend Theorem 1.1 to negatively dependent random variables. The proofs are the same as those of Theorem 2.1 and Theorem 2.2, except that Lemma 1.4 is used instead of Lemma 1.3.
Theorem 2.3 Suppose that $\beta\geq -1$. Let $\{X_{ni}, i\geq 1, n\geq 1 \}$ be an array of rowwise negatively dependent random variables which are stochastically dominated by a random variable $X$ satisfying $E|X|^p<\infty$ for some $p>1$. Let $\{a_{ni}, i\geq 1, n\geq 1 \}$ be an array of constants satisfying (1.1) and (2.1). Furthermore, assume that (2.2) holds for some $\alpha>0$ if $p\geq 2$. Let $EX_{ni}=0$ for all $i\geq 1$ and $n \geq 1$. Then
Theorem 2.4 Suppose that $\beta\geq -1$. Let $\{X_{ni}, i\geq 1, n\geq 1 \}$ be an array of rowwise negatively dependent random variables which are stochastically dominated by a random variable $X$ satisfying $E|X|^p\log|X|<\infty$ for some $p\geq 1$. Let $\{a_{ni}, i\geq 1, n\geq 1 \}$ be an array of constants satisfying (1.1) and (2.13). Furthermore, assume that (2.2) holds for some $\alpha>0$ if $p\geq 2$. Let $EX_{ni}=0$ for all $i\geq 1$ and $n \geq 1$. Then (2.16) holds.
Remark 2.4 If $1+\mu+\beta=0$, we take $p=\theta$ in Theorem 2.4; if $1+\mu+\beta>0$, we take $p=\theta+(1+\mu+\beta)/r$ and $q=\theta$ in Theorem 2.3. Therefore Theorem 1.4 follows from Theorem 2.3 and Theorem 2.4. However, Theorem 1.4 does not deal with the case $\beta=-1$, whereas our result covers it.
If the array $\{X_{ni}, i\geq 1, n\geq 1\}$ in Theorem 2.1 and Theorem 2.2 is replaced by a sequence $\{X_n, n\geq 1\}$, then we can extend Theorem 1.1 to $\rho^*$-mixing random variables.
Theorem 2.5 Suppose that $\beta\geq -1$. Let $\{X_{i}, i\geq 1\}$ be a sequence of $\rho^*$-mixing random variables which are stochastically dominated by a random variable $X$ satisfying $E|X|^p<\infty$ for some $p>1$. Let $\{a_{ni}, i\geq 1, n\geq 1\}$ be an array of constants satisfying (1.1) and (2.1). Furthermore, assume that (2.2) holds for some $\alpha>0$ if $p\geq 2$. Let $EX_{i}=0$ for all $i\geq 1$. Then
Proof For any $i\geq 1, n\geq 1$, let $\displaystyle X_{ni}= X_{i}I(|a_{ni}X_{i}|\leq 1).$ Note that
The rest of the proof is the same as that of Theorem 2.1 except that we use Lemma 1.5 instead of Lemma 1.3 and it is omitted.
Theorem 2.6 Suppose that $\beta\geq -1$. Let $\{X_{i}, i\geq 1\}$ be a sequence of $\rho^*$-mixing random variables which are stochastically dominated by a random variable $X$ satisfying $E|X|^p\log|X|<\infty$ for some $p\geq 1$. Let $\{a_{ni}, i\geq 1, n\geq 1\}$ be an array of constants satisfying (1.1) and (2.13). Furthermore, assume that (2.2) holds for some $\alpha>0$ if $p\geq 2$. Let $EX_{i}=0$ for all $i\geq 1$. Then (2.17) holds.
Proof The rest of the proof is the same as that of Theorem 2.2 except that we use Lemma 1.5 instead of Lemma 1.3, and it is omitted.
Remark 2.5 As in Remark 3.7 of Sung [5], Theorem 2.5 and Theorem 2.6 cannot be extended to an array $\{X_{ni}, i\geq 1, n\geq 1\}$ of rowwise $\rho^*$-mixing random variables by the method of proof of Theorem 2.1 and Theorem 2.2.