We first introduce several concepts of dependence for random variables. The concept of negatively associated (abbreviated to NA in the following) random variables was introduced by Joag-Dev and Proschan [1].
Definition 1.1 A finite family of random variables $\{X_{i};1\leq i\leq n\}$ is said to be NA if for every pair of disjoint subsets $A, B \subset \{1, 2, \ldots, n\}$,
$$\mathrm{Cov}\left(f(X_{i}; i\in A), g(X_{j}; j\in B)\right)\leq 0,$$
whenever $f$ and $g$ are coordinatewise nondecreasing functions such that this covariance exists. An infinite family of random variables is NA if every finite subfamily is NA.
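A simple example may help fix ideas: for any random variable $X$ with finite variance, the pair $(X, -X)$ is NA. Indeed, for $A=\{1\}$, $B=\{2\}$ and coordinatewise nondecreasing $f, g$, the function $x\mapsto g(-x)$ is nonincreasing, so the classical Chebyshev correlation inequality gives
$$\mathrm{Cov}\left(f(X), g(-X)\right)\leq 0.$$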
Definition 1.2 (see [2]) A function $ \phi: \mathbb{R}^n\to \mathbb{R}$ is called superadditive if $\phi (x\vee y)+\phi (x\wedge y)\geq \phi (x)+\phi (y)$ for all $x, y\in \mathbb{R}^n$, where $\vee $ is for componentwise maximum and $\wedge $ is for componentwise minimum.
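For instance, if $f:\mathbb{R}\to \mathbb{R}$ is convex, then $\phi(x)=f(x_{1}+\cdots+x_{n})$ is superadditive. Writing $s(x)=x_{1}+\cdots+x_{n}$, we have $s(x\vee y)+s(x\wedge y)=s(x)+s(y)$ and $s(x\vee y)\geq \max\left(s(x), s(y)\right)$, so the pair $\left(s(x\vee y), s(x\wedge y)\right)$ majorizes $\left(s(x), s(y)\right)$, and convex functions satisfy $f(u)+f(v)\geq f(u^{'})+f(v^{'})$ whenever $(u, v)$ majorizes $(u^{'}, v^{'})$.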
The concept of negatively superadditive-dependent (abbreviated to NSD in the following) random variables was introduced by Hu [3] as follows.
Definition 1.3 A random vector $X=(X_{1}, X_{2}, \ldots, X_{n})$ is said to be NSD if
$$E\phi(X_{1}, X_{2}, \ldots, X_{n})\leq E\phi(X_{1}^{\ast}, X_{2}^{\ast}, \ldots, X_{n}^{\ast}),$$
where $X_{1}^{\ast}, X_{2}^{\ast}, \ldots, X_{n}^{\ast}$ are independent such that $X_{i}$ and $X_{i}^{\ast}$ have the same distribution for each $i$ and $\phi$ is a superadditive function such that the expectations exist. A sequence of random variables $\{X_{n};n\geq 1\}$ is said to be NSD if for any $n \geq 1$, $(X_{1}, X_{2}, \ldots, X_{n})$ is NSD.
Hu [3] gave an example illustrating that NSD does not imply NA. Christofides and Vaggelatou [4] indicated that NA implies NSD.
Hu et al. [5] introduced the concept of $m$-negatively associated random variables as follows.
Definition 1.4 Let $m\geq 1$ be a fixed integer. A sequence of random variables $\{X_{n};n\geq 1\}$ is said to be $m$-negatively associated (abbreviated to $m$-$\mathrm{NA}$ in the following) if for any $n \geq 2$ and $i_{1}, \ldots, i_{n}$ such that $|i_{k}-i_{j}|\geq m$ for all $1\leq k \neq j \leq n$, we have that $X_{i_{1}}, \ldots, X_{i_{n}}$ are NA.
The concept of $m$-$\mathrm{NA}$ random variables is a natural extension of NA random variables (the case $m=1$).
Similarly, we can define $m$-NSD random variables.
Definition 1.5 Let $m\geq 1$ be a fixed integer. A sequence of random variables $\{X_{n};n\geq 1\}$ is said to be $m$-negatively superadditive-dependent (abbreviated to $m$-NSD in the following) if for any $n \geq 2$ and $i_{1}, \ldots, i_{n}$ such that $|i_{k}-i_{j}|\geq m$ for all $1\leq k \neq j \leq n$, we have that $(X_{i_{1}}, \ldots, X_{i_{n}})$ is NSD.
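For instance, to see that $m$-NSD is strictly weaker than NSD, one can check the following construction: let $\{\xi_{k};k\geq 1\}$ be a sequence of NA random variables with $0<\mathrm{Var}(\xi_{1})<\infty$ and put $X_{2k-1}=X_{2k}=\xi_{k}$. Any indices with pairwise gaps at least $2$ select at most one member of each pair $(X_{2k-1}, X_{2k})$, hence a vector of distinct $\xi_{k}$'s, which is NA and therefore NSD by Christofides and Vaggelatou [4]; thus $\{X_{n};n\geq 1\}$ is $2$-NSD. However, $(X_{1}, X_{2})$ is not NSD: the function $\phi(x_{1}, x_{2})=x_{1}x_{2}$ is superadditive, and
$$E\phi(X_{1}, X_{2})=E\xi_{1}^{2}>(E\xi_{1})^{2}=E\phi(X_{1}^{\ast}, X_{2}^{\ast}).$$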
Hsu and Robbins [6] introduced the concept of complete convergence of a sequence of random variables. Hu et al. [7] proposed the following general complete convergence of rowwise independent arrays of random variables.
Theorem A Let $\{X_{ni};1\leq i \leq k_{n}, n\geq 1\}$ be an array of rowwise independent random variables and $\{c_{n}\}$ be a sequence of positive real numbers. Suppose that for every $\varepsilon>0$ and some $\delta>0$,
(i) $\sum\limits_{n=1}^\infty c_{n}\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\varepsilon)<\infty$;
(ii) there exists $j\geq 1$ such that $\sum\limits_{n=1}^\infty c_{n}\left(\sum\limits_{i=1}^{k_{n}}EX_{ni}^2I\{|X_{ni}|\leq \delta\}\right)^j<\infty$;
(iii) $\sum\limits_{i=1}^{k_{n}}EX_{ni}I\{|X_{ni}|\leq \delta\}\to 0\;\;\textrm{as}\; n \to \infty$.
Then
$$\sum\limits_{n=1}^\infty c_{n}P\left(\left|\sum\limits_{i=1}^{k_{n}}X_{ni}\right|>\varepsilon\right)<\infty.$$
In this paper, we let $ \{k_{n}, n\geq 1\}$ be a sequence of positive integers such that $\lim\limits_{n \to \infty} k_{n}=\infty$.
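As an illustration of Theorem A, consider the classical Hsu–Robbins setting: let $\{X, X_{n};n\geq 1\}$ be i.i.d. random variables with $EX=0$ and $EX^{2}<\infty$, and take $k_{n}=n$, $c_{n}=1$ and $X_{ni}=X_{i}/n$. Condition (i) reduces to $\sum_{n=1}^\infty nP(|X|>n\varepsilon)<\infty$, which is equivalent to $EX^{2}<\infty$; condition (ii) holds with $j=2$ since $\sum_{i=1}^{n}E(X_{i}/n)^{2}I\{|X_{i}|\leq n\delta\}\leq EX^{2}/n$; and condition (iii) holds since $\sum_{i=1}^{n}E(X_{i}/n)I\{|X_{i}|\leq n\delta\}=EXI\{|X|\leq n\delta\}\to EX=0$ by the dominated convergence theorem. Theorem A then yields
$$\sum\limits_{n=1}^\infty P\left(|X_{1}+\cdots+X_{n}|>n\varepsilon\right)<\infty \quad\text{for every } \varepsilon>0,$$
which is the Hsu–Robbins complete convergence theorem.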
The proof in Hu et al. [7] mistakenly relies on the claim that the assumptions of Theorem A imply convergence in probability of the corresponding partial sums. Hu and Volodin [8] and Hu et al. [9] presented counterexamples to this proof and noted that whether Theorem A itself holds remained an open problem. Since then, many authors have attempted to solve this problem. Hu et al. [9] and Kuczmaszewska [10] gave partial solutions to this question. Sung et al. [11] completely solved the problem by using a symmetrization procedure, and Kruglov et al. [12] obtained the complete convergence for maximum partial sums by using a submartingale approach.
Recently, Chen et al. [13] extended Theorem A to the case of arrays of rowwise NA random variables and obtained the complete convergence for maximum partial sums. Hu et al. [5] obtained complete convergence for maximum partial sums, similar to Theorem A, for arrays of rowwise $m$-NA random variables. Qiu et al. [14] obtained a similar result for arrays of rowwise ND random variables. Wang et al. [15] extended and improved Theorem A for arrays of rowwise NSD random variables. Qiu [16] obtained a similar result for weighted sums of NA random variables. The main purpose of this article is to generalize and improve Theorem A to the case of arrays of rowwise $m$-$\mathrm{NSD}$ random variables.
Theorem 2.1 Let $\{X_{ni}; 1\leq i \leq k_{n}, n\geq 1\}$ be an array of rowwise $m$-$\mathrm{NSD}$ random variables and $\{c_{n}\}$ be a sequence of positive real numbers. Assume that for every $\varepsilon>0$ and some $\delta>0$,
(i) $\sum\limits_{n=1}^\infty c_{n}\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\varepsilon)<\infty$;
(ii) there exists $j\geq 1$ such that $\sum\limits_{n=1}^\infty c_{n}\left(\sum\limits_{i=1}^{k_{n}}\mathrm{Var}(X_{ni}I\{|X_{ni}|\leq \delta\})\right)^j<\infty$.
Then
$$\sum\limits_{n=1}^\infty c_{n}P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^{l}\left(X_{ni}-EX_{ni}I\{|X_{ni}|\leq \delta\}\right)\right|>\varepsilon\right)<\infty.$$
From Theorem 2.1, we immediately obtain the following corollary.
Corollary 2.1 Let $\{X_{ni}; 1\leq i \leq k_{n}, n\geq 1\}$ be an array of rowwise $m$-$\mathrm{NSD}$ random variables. If conditions (i) and (ii) of Theorem 2.1 and
(iii) $\sum\limits_{i=1}^{k_{n}}EX_{ni}I\{|X_{ni}|\leq \delta\}\to 0\;\;\textrm{as}\; n \to \infty$
are satisfied, then
$$\sum\limits_{n=1}^\infty c_{n}P\left(\left|\sum\limits_{i=1}^{k_{n}}X_{ni}\right|>\varepsilon\right)<\infty \quad\text{for all } \varepsilon>0.$$
Theorem 2.2 Let $\{X_{ni}; 1\leq i \leq k_{n}, n\geq 1\}$ be an array of rowwise $m$-$\mathrm{NSD}$ random variables with $EX_{ni}=0$ and $EX_{ni}^2<\infty$ for $1\leq i \leq k_{n}$, $n\geq 1$. Let $\{c_{n}\}$ be a sequence of positive real numbers. Assume that for every $\varepsilon>0$, condition (i) of Theorem 2.1 holds and
(ii) there exists $j\geq 1$ such that $\sum\limits_{n=1}^\infty c_{n}\left(\sum\limits_{i=1}^{k_{n}}EX_{ni}^2\right)^j<\infty$.
Then
$$\sum\limits_{n=1}^\infty c_{n}P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^{l}X_{ni}\right|>\varepsilon\right)<\infty.$$
Remark 1 Corollary 2.1 shows that the main results of Hu et al. [7] and Sung et al. [11] remain true for $m$-$\mathrm{NSD}$ random variables. We generalize the corresponding complete convergence theorems from the independent case to $m$-$\mathrm{NSD}$ arrays without adding any extra conditions.
Remark 2 In Theorem 2.2, we only need conditions (i) and (ii) of Corollary 2.1; condition (iii) of Corollary 2.1 is not needed. Therefore Theorem 2.2 extends and improves the corresponding results of Hu et al. [7] and Sung et al. [11]. In addition, our results also extend the corresponding results of Chen et al. [13], Hu et al. [5], Qiu et al. [14] and Wang et al. [15]. When $m=1$, Theorem 2.1 and Theorem 2.2 reduce to Theorem 3.3 and Theorem 3.2 of Wang et al. [15], respectively. We mention that Theorem 2.1 of this paper not only extends the results of Wang et al. [15] but also has a simpler proof. More precisely, we divide the sum into only two parts in our proof instead of four parts as in Wang et al. [15].
Throughout this paper, $C$ denotes a positive constant which may differ from one place to another.
In order to prove our results, we need the following lemmas.
Lemma 3.1 (cf. Wang et al. [15], Lemma 2.4) Let $\{X_{n};n\geq 1\}$ be a sequence of $\mathrm{NSD}$ random variables with $EX_{n}=0$ and $EX_{n}^2<\infty$, $n\geq 1$. Let $S_{n}=\sum\limits_{i=1}^{n}X_{i}$, $B_{n}=\sum\limits_{i=1}^{n}EX_{i}^2$.
Then for all $x>0, a>0$,
Lemma 3.2 Let $\{X_{n};n\geq 1\}$ be a sequence of $m$-$\mathrm{NSD}$ random variables with $EX_{n}=0$ and $EX_{n}^2<\infty$, $n\geq 1$. Let $S_{n}=\sum\limits_{i=1}^{n}X_{i}$, $B_{n}=\sum\limits_{i=1}^{n}EX_{i}^2$.
Then for all $n\geq m, x>0, a>0, $
Proof Let $r=[\frac{n}{m}]$. Set $Y_{i}=X_{i}$ for $1\leq i\leq n$, $Y_{i}=0$ for $i>n$,
and $S_{mk+j}^{'}=\sum\limits_{i=0}^{k}Y_{mi+j}$ for $0\leq k\leq r$ and $1\leq j \leq m$.
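In outline, the purpose of this construction is as follows: every partial sum $S_{k}$ splits along the residue classes modulo $m$, so that
$$\max_{1\leq k\leq n}|S_{k}|\leq \sum_{j=1}^{m}\max_{0\leq k\leq r}\left|S_{mk+j}^{'}\right| \quad\text{and hence}\quad P\left(\max_{1\leq k\leq n}|S_{k}|\geq x\right)\leq \sum_{j=1}^{m}P\left(\max_{0\leq k\leq r}\left|S_{mk+j}^{'}\right|\geq \frac{x}{m}\right).$$
Moreover, for each fixed $j$ the indices $mi+j$, $i\geq 0$, are at mutual distance at least $m$, so $\{Y_{mi+j};i\geq 0\}$ is a mean zero NSD sequence with finite variances.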
By Lemma 3.1, we have
Considering $-X_{n}$ instead of $X_{n}$ in the argument above, we similarly obtain
Therefore
Lemma 3.3 Let $\{X_{n};n\geq 1\}$ be a sequence of $m$-$\mathrm{NSD}$ random variables with $EX_{n}=0$ and $EX_{n}^2<\infty$, $n\geq 1$. Let $S_{n}=\sum\limits_{i=1}^{n}X_{i}, \;B_{n}=\sum\limits_{i=1}^{n}EX_{i}^2$. Then for all $n\geq 1, x>0, a>0$,
Proof By the fact that $e^{-x}\leq 1/(1+x)\leq 1/x$ for $x>0$, the exponential bound of Lemma 3.2 can be converted into a polynomial bound, for all $n\geq 1$, $x>0$, $a>0$.
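To indicate the typical computation, suppose, for illustration, that Lemma 3.2 supplies an exponential term of the Shao type $\exp\{t-t\ln(1+u)\}$ with $t=\frac{x}{ma}$ and $u=\frac{ax}{mB_{n}}$; this form is an assumption made here only to sketch the step. Then
$$\exp\left\{t-t\ln(1+u)\right\}=\left(\frac{e}{1+u}\right)^{t}\leq\left(\frac{e}{u}\right)^{t}=\left(\frac{emB_{n}}{ax}\right)^{x/(ma)},$$
and it is bounds of this polynomial type that condition (ii), with its $j$-th power, is designed to exploit in the proofs of Theorems 2.1 and 2.2.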
On the other hand,
Therefore by Lemma 3.2, the conclusion holds.
Proof of Theorem 2.1 Let $Y_{ni}=\delta I\{X_{ni}>\delta\}+ X_{ni}I\{|X_{ni}|\leq \delta\}-\delta I\{X_{ni}<-\delta\}$ and $Y_{ni}^{'}=\delta I\{X_{ni}>\delta\}-\delta I\{X_{ni}<-\delta\}$ for $1\leq i \leq k_{n}$, $n\geq 1$. Since each $Y_{ni}$ is a nondecreasing function of $X_{ni}$ and the NSD property is preserved under coordinatewise nondecreasing transformations (see Hu [3]), $\{Y_{ni}; 1\leq i \leq k_{n}, n\geq 1\}$ is an array of rowwise $m$-$\mathrm{NSD}$ random variables. Note that
$$P\left(\bigcup\limits_{i=1}^{k_{n}}\{X_{ni}\neq Y_{ni}\}\right)\leq \sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta).$$
Hence, by condition (i), it is sufficient to prove that
$$\sum\limits_{n=1}^\infty c_{n}P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^{l}(Y_{ni}-EY_{ni})\right|>\varepsilon/2\right)<\infty.$$
Set
$$N_{1}=\left\{n:\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)>1\right\}, \qquad N_{2}=\left\{n:\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)\leq 1\right\}.$$
Note that, by condition (i),
$$\sum\limits_{n\in N_{1}} c_{n}P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l (Y_{ni}-EY_{ni})\right|>\varepsilon/2\right)\leq \sum\limits_{n\in N_{1}} c_{n}\leq \sum\limits_{n\in N_{1}} c_{n}\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)<\infty.$$
Hence, it remains to prove that $\sum\limits_{n\in N_{2}} c_{n}P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l (Y_{ni}-EY_{ni})\right|>\varepsilon/2\right)<\infty $. By Lemma 3.3, we get
For $n\in N_{2}$, we have
Denote $B_{n}=\sum\limits_{i=1}^{k_{n}}\mathrm{Var}(Y_{ni})$.
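A standard way to relate $B_{n}$ to condition (ii), sketched here using only the decomposition $Y_{ni}=X_{ni}I\{|X_{ni}|\leq \delta\}+Y_{ni}^{'}$ and the elementary inequality $(u+v)^{2}\leq 2u^{2}+2v^{2}$, is
$$\mathrm{Var}(Y_{ni})\leq 2\mathrm{Var}\left(X_{ni}I\{|X_{ni}|\leq \delta\}\right)+2E(Y_{ni}^{'})^{2}=2\mathrm{Var}\left(X_{ni}I\{|X_{ni}|\leq \delta\}\right)+2\delta^{2}P(|X_{ni}|>\delta).$$
After summing over $i$ and raising to the power $j$, the first term is controlled by condition (ii), while for $n\in N_{2}$ the second term satisfies $\left(\sum_{i=1}^{k_{n}}P(|X_{ni}|>\delta)\right)^{j}\leq \sum_{i=1}^{k_{n}}P(|X_{ni}|>\delta)$, which is summable against $c_{n}$ by condition (i).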
When $n\in N_{2}$, we have $\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)\leq 1$. Taking $a=\frac{\varepsilon}{24mj}$, we have
Proof of Theorem 2.2 Let $a=\frac{\varepsilon}{12mj}$. By Lemma 3.3, we have