数学杂志 (Journal of Mathematics)  2017, Vol. 37, Issue (5): 889-897
COMPLETE CONVERGENCE FOR ARRAYS OF ROWWISE M-NSD RANDOM VARIABLES
FENG Feng-xiang1,2, WANG Ding-cheng1, WU Qun-ying2    
1. School of Mathematical Science, University of Electronic Science and Technology of China, Chengdu 611731, China;
2. College of Science, Guilin University of Technology, Guilin 541004, China
Abstract: In this article, we study complete convergence theorems for arrays of rowwise m-negatively superadditive-dependent (m-NSD) random variables. By using a Kolmogorov-type exponential inequality for m-NSD random variables, we obtain complete convergence theorems for arrays of rowwise m-NSD random variables, which generalize the complete convergence theorems previously obtained by Hu et al. (1998) and Sung et al. (2005) from the independent case to m-NSD arrays. Our results also extend the corresponding results of Chen et al. (2008), Hu et al. (2009), Qiu et al. (2011) and Wang et al. (2014).
Key words: Kolmogorov-type exponential inequality; complete convergence; m-NSD random variables
1 Introduction

We first introduce some concepts of dependent random variables. The concept of negatively associated (abbreviated to NA in the following) random variables was introduced by Joag-Dev and Proschan [1].

Definition 1.1  A finite family of random variables $\{X_{i};1\leq i\leq n\}$ is said to be NA if for every pair of disjoint subsets $A, B \subset \{1, 2, \cdot \cdot \cdot, n\}, $

$\begin{aligned} \mathrm{Cov}\left(f(X_{i}, i\in A), g(X_{j}, j\in B)\right)\leq 0, \end{aligned}$

whenever $f$ and $g$ are coordinatewise nondecreasing functions such that this covariance exists. An infinite family of random variables is NA if every finite subfamily is NA.
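A standard concrete instance from [1] is the multinomial distribution: its cell counts are NA. The defining covariance inequality, with $f$ and $g$ taken as coordinate projections, can be checked numerically; the following is a minimal Python sketch (the simulation setup and sample sizes are our own illustration, not part of the paper):

```python
import random

random.seed(3)
N, n = 20000, 12  # N samples of a multinomial with n trials, 3 equal cells

def multinomial(n):
    """Draw one multinomial count vector with 3 equally likely cells."""
    counts = [0, 0, 0]
    for _ in range(n):
        u = random.random()
        counts[0 if u < 1 / 3 else 1 if u < 2 / 3 else 2] += 1
    return counts

samples = [multinomial(n) for _ in range(N)]
x1 = [s[0] for s in samples]
x2 = [s[1] for s in samples]
m1, m2 = sum(x1) / N, sum(x2) / N
cov = sum((a - m1) * (b - m2) for a, b in zip(x1, x2)) / N
# Exact covariance is -n*p1*p2 = -12/9, so the empirical value is negative,
# consistent with the NA property.
assert cov < 0
```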

Definition 1.2  (see [2]) A function $ \phi: \mathbb{R}^n\to \mathbb{R}$ is called superadditive if $\phi (x\vee y)+\phi (x\wedge y)\geq \phi (x)+\phi (y)$ for all $x, y\in \mathbb{R}^n$, where $\vee $ is for componentwise maximum and $\wedge $ is for componentwise minimum.
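To make Definition 1.2 concrete, the inequality can be tested pointwise on sampled vectors. In the Python sketch below (the two example functions are our own illustrations), the product $\phi(x)=x_{1}x_{2}$ is supermodular and hence superadditive in this sense, while $\phi(x)=(x_{1}-x_{2})^{2}$ is not:

```python
import itertools
import random

def is_superadditive(phi, points):
    """Check phi(x v y) + phi(x ^ y) >= phi(x) + phi(y) on all pairs of points."""
    for x, y in itertools.combinations(points, 2):
        hi = tuple(max(a, b) for a, b in zip(x, y))  # componentwise maximum
        lo = tuple(min(a, b) for a, b in zip(x, y))  # componentwise minimum
        if phi(hi) + phi(lo) < phi(x) + phi(y) - 1e-12:  # tolerance for rounding
            return False
    return True

random.seed(0)
pts = [(random.uniform(-2, 2), random.uniform(-2, 2)) for _ in range(200)]

# The product x1*x2 satisfies the inequality at every pair.
assert is_superadditive(lambda v: v[0] * v[1], pts)
# The squared difference (x1-x2)**2 violates it on pairs whose coordinates
# are ordered oppositely, so the check fails.
assert not is_superadditive(lambda v: (v[0] - v[1]) ** 2, pts)
```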

The concept of negatively superadditive-dependent (abbreviated to NSD in the following) random variables was introduced by Hu [3] as follows.

Definition 1.3  A random vector $X=(X_{1}, X_{2}, \cdot \cdot \cdot, X_{n})$ is said to be NSD if

$\begin{aligned} E\phi(X_{1}, X_{2}, \cdot \cdot \cdot, X_{n})\leq E\phi(X_{1}^{\ast}, X_{2}^{\ast}, \cdot \cdot \cdot, X_{n}^{\ast}), \end{aligned}$

where $X_{1}^{\ast}, X_{2}^{\ast}, \cdot \cdot \cdot, X_{n}^{\ast}$ are independent such that $ X_{i}$ and $X_{i}^{\ast}$ have the same distribution for each $i$ and $\phi$ is a superadditive function such that the expectations exist. A sequence of random variables $\{X_{n};n\geq 1\}$ is said to be NSD if for any $n \geq 1$, $(X_{1}, X_{2}, \cdot \cdot \cdot, X_{n})$ is NSD.

Hu [3] gave an example illustrating that NSD does not imply NA. Christofides and Vaggelatou [4] indicated that NA implies NSD.

Hu et al. [5] introduced the concept of $m$-negatively associated random variables as follows.

Definition 1.4  Let $m\geq 1$ be a fixed integer. A sequence of random variables $\{X_{n};n\geq 1\}$ is said to be $m$-negatively associated (abbreviated to $m$-$\mathrm{NA}$ in the following) if for any $n \geq 2$ and $ i_{1}, \cdot \cdot \cdot, i_{n}$ such that $|i_{k}-i_{j}|\geq m$ for all $1\leq k \neq j \leq n$, we have that $ X_{i_{1}}, \cdot \cdot \cdot, X_{i_{n}}$ are NA.

The concept of $m$-$\mathrm{NA}$ random variables is a natural extension of NA random variables (the case $m=1$).

Similarly, we can define $m$-NSD random variables.

Definition 1.5  Let $m\geq 1$ be a fixed integer. A sequence of random variables $\{X_{n};n\geq 1\}$ is said to be $m$-negatively superadditive-dependent (abbreviated to $m$-NSD in the following) if for any $n \geq 2$ and $ i_{1}, \cdot \cdot \cdot, i_{n}$ such that $|i_{k}-i_{j}|\geq m$ for all $1\leq k \neq j \leq n$, we have that $ (X_{i_{1}}, \cdot \cdot \cdot, X_{i_{n}})$ is NSD.

Hsu and Robbins [6] introduced the concept of complete convergence of a sequence of random variables. Hu et al. [7] proposed the following general complete convergence of rowwise independent arrays of random variables.

Theorem A  Let $\{X_{ni};1\leq i \leq k_{n}, n\geq 1\}$ be an array of rowwise independent random variables and $\{c_{n}\}$ be a sequence of positive real numbers. Suppose that for every $\varepsilon>0$ and some $\delta>0$,

(ⅰ) $\sum\limits_{n=1}^\infty c_{n}\sum\limits_{i=1}^{k_{n}}P\{|X_{ni}|>\varepsilon\}<\infty$;

(ⅱ) there exists $j\geq 1$ such that $\begin{aligned} \sum\limits_{n=1}^\infty c_{n}\left(\sum\limits_{i=1}^{k_{n}}EX_{ni}^2I\{|X_{ni}|\leq \delta\}\right)^j<\infty; \end{aligned}$

(ⅲ) $\sum\limits_{i=1}^{k_{n}}EX_{ni}I\{|X_{ni}|\leq \delta\}\to 0\;\;\textrm{as}\; n \to \infty$.

Then

$\begin{aligned} \sum\limits_{n=1}^\infty c_{n}P\left(\left|\sum\limits_{i=1}^{k_{n}}X_{ni}\right|>\varepsilon\right)<\infty\;\;\textrm{for all}\;\varepsilon>0. \end{aligned}$
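For orientation, Theorem A recovers the classical Hsu-Robbins theorem [6]; this reduction is our own illustration and is not part of [7]. Take $k_{n}=n$, $c_{n}=1$ and $X_{ni}=Y_{i}/n$ for i.i.d. $\{Y_{i}\}$ with $EY_{1}=0$ and $EY_{1}^2<\infty$. Then the three conditions read:

```latex
% (i): equivalent to EY_1^2 < \infty by the standard tail-sum criterion
\sum_{n=1}^\infty \sum_{i=1}^{n} P\{|Y_i|/n>\varepsilon\}
  =\sum_{n=1}^\infty n\,P\{|Y_1|>n\varepsilon\}<\infty ;
% (ii) with j=2:
\sum_{n=1}^\infty\Big(\sum_{i=1}^{n}E(Y_i/n)^2 I\{|Y_i|\le n\delta\}\Big)^{2}
  \le (EY_1^2)^2\sum_{n=1}^\infty n^{-2}<\infty ;
% (iii): by dominated convergence,
\sum_{i=1}^{n}E(Y_i/n)\, I\{|Y_i|\le n\delta\}
  =E\,Y_1 I\{|Y_1|\le n\delta\}\to EY_1=0 .
```

Theorem A then yields $\sum\limits_{n=1}^\infty P\{|S_{n}|/n>\varepsilon\}<\infty$ for all $\varepsilon>0$, which is the Hsu-Robbins theorem.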

In this paper, we let $ \{k_{n}, n\geq 1\}$ be a sequence of positive integers such that $\lim\limits_{n \to \infty} k_{n}=\infty$.

The proof of Hu et al. [7] mistakenly relies on the claim that the assumptions of Theorem A imply convergence in probability of the corresponding partial sums. Hu and Volodin [8] and Hu et al. [9] presented counterexamples to this proof and noted that whether Theorem A itself was true remained open. Since then, many authors have attempted to solve this problem. Hu et al. [9] and Kuczmaszewska [10] gave partial solutions. Sung et al. [11] completely solved the problem by using a symmetrization procedure, and Kruglov et al. [12] obtained complete convergence for maximum partial sums by using a submartingale approach.

Recently, Chen et al. [13] extended Theorem A to arrays of rowwise NA random variables and obtained complete convergence for maximum partial sums. Hu et al. [5] obtained complete convergence for maximum partial sums similar to Theorem A for arrays of rowwise $m$-NA random variables. Qiu et al. [14] obtained a similar result for arrays of rowwise ND random variables. Wang et al. [15] extended and improved Theorem A for NSD arrays. Qiu [16] obtained a similar result for weighted sums of NA random variables. The main purpose of this article is to generalize and improve Theorem A to the case of arrays of rowwise $m$-$\mathrm{NSD}$ random variables.

2 Main Results

Theorem 2.1  Let $\{X_{ni}; 1\leq i \leq k_{n}, n\geq 1\}$ be an array of rowwise $m$-$\mathrm{NSD}$ random variables and $\{c_{n}\}$ be a sequence of positive real numbers. Assume that for every $\varepsilon>0$ and some $\delta>0$,

(ⅰ) $\sum\limits_{n=1}^\infty c_{n}\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\varepsilon)<\infty$;

(ⅱ) there exists $j\geq 1$ such that $\begin{aligned} \sum\limits_{n=1}^\infty c_{n}\left(\sum_{i=1}^{k_{n}}\mathrm{Var}(X_{ni}I\{|X_{ni}|\leq \delta\})\right)^j<\infty. \end{aligned}$

Then

$\begin{aligned} \sum\limits_{n=1}^\infty c_{n}P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l\left(X_{ni}-EX_{ni}I\{|X_{ni}|\leq \delta\}\right)\right|>\varepsilon\right)<\infty\;\;\textrm{for all}\;\;\varepsilon>0. \end{aligned}$

From Theorem 2.1, we can obviously obtain the following corollary.

Corollary 2.1  Let $\{X_{ni}; 1\leq i \leq k_{n}, n\geq 1\}$ be an array of rowwise $m$-$\mathrm{NSD}$ random variables. If conditions (ⅰ) and (ⅱ) of Theorem 2.1 and

(ⅲ) $\sum\limits_{i=1}^{k_{n}}EX_{ni}I\{|X_{ni}|\leq \delta\}\to 0\;\;\textrm{as}\; n \to \infty$ are satisfied, then

$\begin{aligned} \sum\limits_{n=1}^\infty c_{n}P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l X_{ni}\right|>\varepsilon\right)<\infty\;\;\textrm{for all}\;\;\varepsilon>0. \end{aligned}$

Theorem 2.2  Let $\{X_{ni}; 1\leq i \leq k_{n}, n\geq 1\}$ be an array of rowwise $m$-$\mathrm{NSD}$ random variables with $EX_{ni}=0$ and $EX_{ni}^2<\infty$ for $1\leq i \leq k_{n}, n\geq 1$. Let $\{c_{n}\}$ be a sequence of positive real numbers. Assume that for every $\varepsilon>0$ and some $\delta>0$,

(ⅰ) $\sum\limits_{n=1}^\infty c_{n}\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\varepsilon)<\infty$;

(ⅱ) there exists $j\geq 1$ such that $\begin{aligned} \sum_{n=1}^\infty c_{n}\left(\sum_{i=1}^{k_{n}}EX_{ni}^2\right)^j<\infty. \end{aligned}$

Then

$\begin{aligned} \sum\limits_{n=1}^\infty c_{n}P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l X_{ni}\right|>\varepsilon\right)<\infty\;\;\textrm{for all}\;\;\varepsilon>0. \end{aligned}$

Remark 1  Corollary 2.1 shows that the main results of Hu et al. [7] and Sung et al. [11] remain true for $m$-$\mathrm{NSD}$ random variables. We generalize the corresponding complete convergence theorems from the independent case to $m$-$\mathrm{NSD}$ arrays without adding any extra conditions.

Remark 2  In Theorem 2.2, we only need conditions (ⅰ) and (ⅱ) of Corollary 2.1; condition (ⅲ) is not needed. Therefore Theorem 2.2 extends and improves the corresponding results of Hu et al. [7] and Sung et al. [11]. In addition, our results also extend the corresponding results of Chen et al. [13], Hu et al. [5], Qiu et al. [14] and Wang et al. [15]. When $m=1$, Theorem 2.1 and Theorem 2.2 yield Theorem 3.3 and Theorem 3.2 of Wang et al. [15], respectively. We mention that Theorem 2.1 of this paper not only extends the results of Wang et al. [15] but also admits a simpler proof: we divide the sum into two parts instead of four, as was done by Wang et al. [15].

Throughout this paper, $C$ denotes a positive constant which may differ from one place to another.

3 Proofs of Main Results

In order to prove our results, we need the following lemmas.

Lemma 3.1  (cf. Wang et al. [15], Lemma 2.4) Let $\{X_{n};n\geq 1\}$ be a sequence of $\mathrm{NSD}$ random variables with $EX_{n}=0$ and $EX_{n}^2<\infty$, $n\geq 1$. Let

$S_{n}=\sum\limits_{i=1}^{n}X_{i}, \;B_{n}=\sum\limits_{i=1}^{n}EX_{i}^2.$

Then for all $x>0, a>0$,

$ P(\max\limits_{1\leq k \leq n} S_{k}\geq x)\leq P\left(\max\limits_{1\leq k \leq n}|X_{k}|>a\right)+2\exp\left(-\frac{x^2}{8B_{n}}\right) +2\left(\frac{B_{n}}{4(xa+B_{n})}\right)^{\frac{x}{12a}}, \\ P\left(\max\limits_{1\leq k \leq n}|S_{k}|\geq x\right)\leq 2P\left(\max\limits_{1\leq k \leq n}|X_{k}|>a\right)+4\exp\left(-\frac{x^2}{8B_{n}}\right) +4\left(\frac{B_{n}}{4(xa+B_{n})}\right)^{\frac{x}{12a}}. $
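Independent mean-zero random variables form an NSD sequence (the defining inequality of Definition 1.3 holds with equality), so the second inequality of Lemma 3.1 can be sanity-checked by simulation. Below is a rough Python sketch with parameters of our own choosing; the bound is deliberately loose here, and the check only confirms that it is not violated:

```python
import math
import random

random.seed(4)
n, trials = 20, 4000
x, a = 12.0, 1.0   # threshold x and truncation level a of the lemma
B = n / 3.0        # B_n = sum of variances; Var(Uniform(-1, 1)) = 1/3

hits = 0
for _ in range(trials):
    s, peak = 0.0, 0.0
    for _ in range(n):
        s += random.uniform(-1.0, 1.0)
        peak = max(peak, abs(s))
    if peak >= x:
        hits += 1
empirical = hits / trials  # estimate of P(max_k |S_k| >= x)

# Second inequality of Lemma 3.1; P(max_k |X_k| > a) = 0 since |X_k| <= 1 = a.
bound = 4 * math.exp(-x * x / (8 * B)) + 4 * (B / (4 * (x * a + B))) ** (x / (12 * a))
assert empirical <= bound
```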

Lemma 3.2  Let $\{X_{n};n\geq 1\}$ be a sequence of $m$-$\mathrm{NSD}$ random variables with $EX_{n}=0$ and $EX_{n}^2<\infty$, $n\geq 1$. Let

$S_{n}=\sum\limits_{i=1}^{n}X_{i}, \;B_{n}=\sum\limits_{i=1}^{n}EX_{i}^2.$

Then for all $n\geq m, x>0, a>0, $

$ P\left(\max\limits_{1\leq k \leq n} S_{k}\geq x\right)\leq m P\left(\max\limits_{1\leq k \leq n}|X_{k}|>a\right)+2m\exp\left(-\frac{x^2}{8m^2B_{n}}\right)\\ \quad \quad \quad \quad \quad \quad \quad \quad +2m\left(\frac{B_{n}}{4(xa/m+B_{n})}\right)^{\frac{x}{12ma}}, \\ P\left(\max\limits_{1\leq k \leq n}|S_{k}|\geq x\right)\leq 2m P\left(\max\limits_{1\leq k \leq n}|X_{k}|>a\right)+4m\exp\left(-\frac{x^2}{8m^2B_{n}}\right)\\ \quad \quad \quad \quad \quad \quad \quad \quad +4m\left(\frac{B_{n}}{4(xa/m+B_{n})}\right)^{\frac{x}{12ma}}. $

Proof  Let $r=[\frac{n}{m}]$. Set

$ Y_{i} = \left\{ \begin{array}{ll} X_{i}, &\textrm{if $1\leq i \leq n$},\\ 0, & \textrm{if $i>n$} \end{array} \right. $

and $S_{mk+j}^{'}=\sum\limits_{i=0}^{k}Y_{mi+j}$ for $1\leq j \leq m$.

$\begin{aligned} \textrm{Since}\; \left\{\max\limits_{1\leq k \leq n} S_{k}\geq x\right\}\subset \left\{\max\limits_{0\leq k \leq r}S_{mk+1}^{'}\geq \frac{x}{m}\right\}\cup \cdot \cdot \cdot \cup \left\{\max\limits_{0\leq k \leq r}S_{mk+m}^{'}\geq \frac{x}{m}\right\}, \end{aligned}$

by Lemma 3.1, we have

$ \quad \quad P\left(\max\limits_{1\leq k \leq n} S_{k}\geq x\right)\leq \sum\limits_{j=1}^{m}P \left(\max\limits_{0\leq k \leq r}S_{mk+j}^{'}\geq \frac{x}{m}\right)\\ \leq \sum\limits_{j=1}^{m}P\left(\max\limits_{0\leq i \leq r}|Y_{mi+j}|>a\right)+\sum\limits_{j=1}^{m}\left[2\exp\left(-\frac{x^2}{8m^2\sum\limits_{i=0}^{r}EY_{mi+j}^2}\right) +2\left(\frac{\sum\limits_{i=0}^{r}EY_{mi+j}^2}{4(xa/m+\sum\limits_{i=0}^{r}EY_{mi+j}^2)}\right)^{\frac{x}{12ma}}\right]\\ \leq m P\left(\max\limits_{1\leq k \leq n}|X_{k}|>a\right)+2m\exp\left(-\frac{x^2}{8m^2B_{n}}\right) +2m\left(\frac{B_{n}}{4(xa/m+B_{n})}\right)^{\frac{x}{12ma}}, $

where the last step uses $\sum\limits_{i=0}^{r}EY_{mi+j}^2\leq \sum\limits_{i=1}^{n}EX_{i}^2=B_{n}$ and the monotonicity of both bounds in this quantity.

Considering $\{-X_{n}\}$ (which is also $m$-$\mathrm{NSD}$) instead of $\{X_{n}\}$ in the argument above, we get in the same way

$\begin{aligned} P(\max\limits_{1\leq k \leq n}(-S_{k})\geq x)\leq m P\left(\max\limits_{1\leq k \leq n}|X_{k}|>a\right)+2m\exp\left(-\frac{x^2}{8m^2B_{n}}\right) +2m\left(\frac{B_{n}}{4(xa/m+B_{n})}\right)^{\frac{x}{12ma}}. \end{aligned}$

Therefore

$\begin{aligned} P\left(\max\limits_{1\leq k \leq n}|S_{k}|\geq x\right)\leq 2m P\left(\max\limits_{1\leq k \leq n}|X_{k}|>a\right)+4m\exp\left(-\frac{x^2}{8m^2B_{n}}\right) +4m\left(\frac{B_{n}}{4(xa/m+B_{n})}\right)^{\frac{x}{12ma}}. \end{aligned}$
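The blocking step above rests on the set inclusion $\{\max_{1\leq k \leq n} S_{k}\geq x\}\subset \bigcup_{j=1}^{m}\{\max_{0\leq k \leq r}S'_{mk+j}\geq x/m\}$; equivalently, whenever $\max_{k} S_{k}>0$, some residue-class running maximum is at least $(\max_{k} S_{k})/m$. A small numerical sanity check in Python (the data and parameter choices are our own):

```python
import random

def residue_class_maxima(xs, m):
    """Running maxima of partial sums along each residue class
    x_j, x_{m+j}, x_{2m+j}, ... for j = 0, ..., m-1 (0-based indices)."""
    maxima = []
    for j in range(m):
        best, s = float("-inf"), 0.0
        for i in range(j, len(xs), m):
            s += xs[i]
            best = max(best, s)
        maxima.append(best)
    return maxima

random.seed(1)
for _ in range(200):
    m = random.choice([1, 2, 3, 5])
    xs = [random.gauss(0, 1) for _ in range(40)]
    # max_k S_k over the full sequence
    s, full_max = 0.0, float("-inf")
    for v in xs:
        s += v
        full_max = max(full_max, s)
    # The inclusion says: if max_k S_k >= x (> 0), then some residue-class
    # running maximum is >= x / m.  Test it at the realized level x = full_max.
    if full_max > 0:
        assert max(residue_class_maxima(xs, m)) >= full_max / m - 1e-9
```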

Lemma 3.3  Let $\{X_{n};n\geq 1\}$ be a sequence of $m$-$\mathrm{NSD}$ random variables with $EX_{n}=0$ and $EX_{n}^2<\infty$, $n\geq 1$. Let $S_{n}=\sum\limits_{i=1}^{n}X_{i}, \;B_{n}=\sum\limits_{i=1}^{n}EX_{i}^2$. Then for all $n\geq 1, x>0, a>0$,

$\begin{aligned} P\left(\max\limits_{1\leq k \leq n} |S_{k}|\geq x\right)\leq 2m P\left(\max\limits_{1\leq k \leq n}|X_{k}|>a\right)+8m\left(\frac{2mB_{n}}{3xa}\right)^{\frac{x}{12ma}}. \end{aligned}$

Proof  By the fact that $e^{-x}\leq 1/(1+x)\leq 1/x$ for $x>0$, we have, for all $n\geq 1, x>0, a>0$,

$\begin{aligned} \exp\left(-\frac{x^2}{8m^2B_{n}}\right)=\left[\exp\left(-\frac{3xa}{2mB_{n}}\right)\right]^{\frac{x}{12ma}}\leq \left(\frac{2mB_{n}}{3xa}\right)^{\frac{x}{12ma}}. \end{aligned}$

On the other hand,

$\begin{aligned} \left(\frac{B_{n}}{4(xa/m+B_{n})}\right)^{\frac{x}{12ma}}\leq \left(\frac{B_{n}}{4xa/m}\right)^{\frac{x}{12ma}} \leq \left(\frac{2mB_{n}}{3xa}\right)^{\frac{x}{12ma}}. \end{aligned}$

Therefore by Lemma 3.2, the conclusion holds.
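Both estimates used in this proof are elementary and easy to verify numerically; the following Python sketch checks them over a moderate parameter range (the ranges are our own choice):

```python
import math
import random

random.seed(2)
# e^{-t} <= 1/(1+t) <= 1/t for t > 0, the fact the proof starts from.
for _ in range(1000):
    t = random.uniform(1e-6, 50.0)
    assert math.exp(-t) <= 1.0 / (1.0 + t) <= 1.0 / t

# The two terms of Lemma 3.2, each dominated by the single bound of Lemma 3.3.
for _ in range(500):
    m = random.randint(1, 4)
    x, a, B = (random.uniform(0.5, 5.0) for _ in range(3))
    rhs = (2 * m * B / (3 * x * a)) ** (x / (12 * m * a))
    assert math.exp(-x * x / (8 * m * m * B)) <= rhs + 1e-12
    assert (B / (4 * (x * a / m + B))) ** (x / (12 * m * a)) <= rhs + 1e-12
```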

Proof of Theorem 2.1  Let $Y_{ni}=\delta I\{X_{ni}>\delta\}+ X_{ni}I\{|X_{ni}|\leq \delta\}-\delta I\{X_{ni}<-\delta\}$ and $Y_{ni}^{'}=\delta I\{X_{ni}>\delta\}-\delta I\{X_{ni}<-\delta\}$ for $1\leq i \leq k_{n}$, $n\geq 1$. Since $Y_{ni}$ is a nondecreasing function of $X_{ni}$, $\{Y_{ni}; 1\leq i \leq k_{n}, n\geq 1\}$ is an array of rowwise $m$-$\mathrm{NSD}$ random variables. Note that

$ \quad \quad P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l\left(X_{ni}-EX_{ni}I\{|X_{ni}|\leq \delta\}\right)\right|>\varepsilon\right)\\ \leq \sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)+P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l\left(X_{ni}I\{|X_{ni}|\leq \delta\}-EX_{ni}I\{|X_{ni}|\leq \delta\}\right)\right|>\varepsilon\right)\\ =\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)+P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l(Y_{ni}-EY_{ni}-Y_{ni}^{'}+EY_{ni}^{'})\right|>\varepsilon\right)\\ \leq \sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)+P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l(Y_{ni}^{'}-EY_{ni}^{'})\right|>\varepsilon/2\right)\\ +P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l (Y_{ni}-EY_{ni})\right|>\varepsilon/2\right)\\ \leq \sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)+C\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)+P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l(Y_{ni}-EY_{ni})\right|>\varepsilon/2\right). $

Hence, by condition (ⅰ), it is sufficient to prove that

$\begin{aligned} \sum\limits_{n=1}^\infty c_{n}P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l(Y_{ni}-EY_{ni})\right|>\varepsilon/2\right)<\infty. \end{aligned}$

Fix any $a>0$ and set

$\begin{aligned} d=\min\left\{1, \frac{a}{6\delta}\right\}, N_{1}=\left\{n: \sum\limits_{i=1}^{k_{n}}P\left(|X_{ni}|>\min\left\{\delta, \frac{a}{6}\right\}\right)>d\right\} \;\;\textrm{and}\;N_{2}=\mathbb{N}\setminus N_{1}. \end{aligned}$

Note that

$ \sum\limits_{n\in N_{1}} c_{n}P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l (Y_{ni}-EY_{ni})\right|>\varepsilon/2\right) \leq \sum\limits_{n\in N_{1}} c_{n}\\ \leq \frac{1}{d}\sum\limits_{n=1}^{\infty}c_{n}\sum\limits_{i=1}^{k_{n}}P\left(|X_{ni}|>\min\left\{\delta, \frac{a}{6}\right\}\right)<\infty. $

Hence, it remains to prove that $\sum\limits_{n\in N_{2}} c_{n}P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l (Y_{ni}-EY_{ni})\right|>\varepsilon/2\right)<\infty $. By Lemma 3.3, we get

$ \sum\limits_{n\in N_{2}} c_{n}P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l (Y_{ni}-EY_{ni})\right|>\varepsilon/2\right)\\ \leq 2m \sum\limits_{n\in N_{2}} c_{n}P\left(\max\limits_{1\leq i \leq k_{n}}|Y_{ni}-EY_{ni}|>a\right) +8m\sum\limits_{n\in N_{2}} c_{n} \left(\frac{4m\sum\limits_{i=1}^{k_{n}}\mathrm{Var}(Y_{ni})}{3a\varepsilon}\right)^{\frac{\varepsilon}{24ma}}. $

Note that

$ P\left(\max\limits_{1\leq i \leq k_{n}}|Y_{ni}-EY_{ni}|>a\right)\leq P\left(\max\limits_{1\leq i \leq k_{n}}\left|X_{ni}I\{|X_{ni}|\leq \delta\}-EX_{ni}I\{|X_{ni}|\leq \delta\}\right|>a/2\right)\\ \;\;\quad \quad \quad \quad \quad \quad \quad \quad \quad \quad \quad \quad +P\left(\max\limits_{1\leq i \leq k_{n}}|Y_{ni}^{'}-EY_{ni}^{'}|>a/2 \right). $

For $n\in N_{2}$, we have

$ \max\limits_{1\leq i \leq k_{n}}|EX_{ni}I\{|X_{ni}|\leq \delta\}|\leq \max\limits_{1\leq i \leq k_{n}}E|X_{ni}|I\{|X_{ni}|\leq \delta\}\\ \quad \quad \quad \quad \quad \quad \quad \quad \quad \quad \leq \max\limits_{1\leq i \leq k_{n}}\left(E|X_{ni}|I\{|X_{ni}|\leq a/6\}+E|X_{ni}|I\{a/6<|X_{ni}|\leq \delta\}\right)\\ \quad \quad \quad \quad \quad \quad \quad \quad \quad \quad \leq a/6+\delta \sum\limits_{i=1}^{k_{n}}P\left(|X_{ni}|>\min\left\{\delta, \frac{a}{6}\right\}\right)\leq a/6+\delta d \leq a/3. $

Therefore

$\quad \quad \sum\limits_{n\in N_{2}} c_{n}P\left(\max\limits_{1\leq i \leq k_{n}}|Y_{ni}-EY_{ni}|>a\right)\\ \leq \sum\limits_{n=1}^{\infty}c_{n}P\left(\max\limits_{1\leq i \leq k_{n}}|X_{ni}|I\{|X_{ni}|\leq \delta\}>a/6\right)+ \sum\limits_{n=1}^{\infty}c_{n}P\left(\max\limits_{1\leq i \leq k_{n}}|Y_{ni}^{'}-EY_{ni}^{'}|>a/2 \right)\\ \leq \sum\limits_{n=1}^{\infty}c_{n}\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>a/6)+C\sum\limits_{n=1}^{\infty}c_{n}\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)<\infty. $

Denote $B_{n}=\sum\limits_{i=1}^{k_{n}}\mathrm{Var}(Y_{ni})$.

$\begin{aligned} B_{n}=\sum\limits_{i=1}^{k_{n}}[\mathrm{Var}(X_{ni}I\{|X_{ni}|\leq \delta\})+\mathrm{Var}(\delta I\{X_{ni}>\delta\}-\delta I\{X_{ni}<-\delta\})\\ +2\mathrm{cov}(X_{ni}I\{|X_{ni}|\leq \delta\}, \delta I\{X_{ni}>\delta\}-\delta I\{X_{ni}<-\delta\})]\\ \leq\sum\limits_{i=1}^{k_{n}}\mathrm{Var}(X_{ni}I\{|X_{ni}|\leq \delta\})+9\delta ^{2}\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta). \end{aligned}$

When $n\in N_{2}$, we have $\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)\leq 1$. Letting $a=\frac{\varepsilon}{24mj}$, so that $\frac{\varepsilon}{24ma}=j$, we have

$ \quad \quad \sum\limits_{n\in N_{2}} c_{n}\left(\frac{4mB_{n}}{3a\varepsilon}\right)^{\frac{\varepsilon}{24ma}}=\sum\limits_{n\in N_{2}} c_{n}\left(\frac{4m}{3a\varepsilon}\right)^j(B_{n})^j\\ \leq C\sum\limits_{n\in N_{2}}c_{n}\left(\sum\limits_{i=1}^{k_{n}}\mathrm{Var}(X_{ni}I\{|X_{ni}|\leq \delta\})+9\delta ^{2}\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)\right)^j\\ \leq C\sum\limits_{n\in N_{2}}c_{n}\left(\sum\limits_{i=1}^{k_{n}}\mathrm{Var}(X_{ni}I\{|X_{ni}|\leq \delta\})\right)^j+C\sum\limits_{n\in N_{2}}c_{n}\left(\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)\right)^j\\ \leq C\sum\limits_{n=1}^{\infty}c_{n}\left(\sum\limits_{i=1}^{k_{n}}\mathrm{Var}(X_{ni}I\{|X_{ni}|\leq \delta\})\right)^j+C\sum\limits_{n=1}^{\infty}c_{n}\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>\delta)<\infty. $

Proof of Theorem 2.2  Let $a=\frac{\varepsilon}{12mj}$, so that $\frac{\varepsilon}{12ma}=j$. By Lemma 3.3, we have

$ \quad \quad \sum\limits_{n=1}^\infty c_{n}P\left(\max\limits_{1\leq l \leq k_{n}}\left|\sum\limits_{i=1}^l X_{ni}\right|>\varepsilon\right)\\ \leq\sum\limits_{n=1}^\infty c_{n}2m P\left(\max\limits_{1\leq i \leq k_{n}}|X_{ni}|>a\right)+\sum\limits_{n=1}^\infty c_{n}8m\left(\frac{2m\sum\limits_{i=1}^{k_{n}}EX_{ni}^2}{3a\varepsilon}\right)^{\frac{\varepsilon}{12ma}}\\ \leq C\sum\limits_{n=1}^\infty c_{n}\sum\limits_{i=1}^{k_{n}}P(|X_{ni}|>a)+C\sum\limits_{n=1}^\infty c_{n}\left(\sum\limits_{i=1}^{k_{n}}EX_{ni}^2\right)^j<\infty. $
References
[1] Joag-Dev K, Proschan F. Negative association of random variables with applications[J]. Ann. Stat., 1983, 11: 286–295. DOI:10.1214/aos/1176346079
[2] Kemperman J H B. On the FKG-inequalities for measures on a partially ordered space[J]. Proc. Akad. Wetenschappen, Ser. A., 1977, 80: 313–331.
[3] Hu T Z. Negatively superadditive dependence of random variables with applications[J]. Chinese J. Appl. Prob. Stat., 2000, 16: 133–144.
[4] Christofides T C, Vaggelatou E. A connection between supermodular ordering and positive/negative association[J]. J. Multi. Anal., 2004, 88: 138–151. DOI:10.1016/S0047-259X(03)00064-2
[5] Hu T C, Chiang C Y, Taylor R L. On complete convergence for arrays of rowwise m-negatively associated random variables[J]. Nonl. Anal., 2009, 71: 1075–1081. DOI:10.1016/j.na.2009.01.104
[6] Hsu P, Robbins H. Complete convergence and the law of large numbers[J]. Proc. Natl. Acad. Sci. USA., 1947, 33: 25–31. DOI:10.1073/pnas.33.2.25
[7] Hu T C, Szynal D, Volodin A. A note on complete convergence for arrays[J]. Stat. Prob. Lett., 1998, 38: 27–31. DOI:10.1016/S0167-7152(98)00150-3
[8] Hu T C, Volodin A. Addendum to "A note on complete convergence for arrays"[J]. Stat. Prob. Lett., 2000, 47: 209–211. DOI:10.1016/S0167-7152(99)00209-6
[9] Hu T C, Ordóñez Cabrera M, Sung S H, Volodin A. Complete convergence for arrays of rowwise independent random variables[J]. Commun. Korean Math. Soc., 2003, 18: 375–383. DOI:10.4134/CKMS.2003.18.2.375
[10] Kuczmaszewska A. On some conditions for complete convergence for arrays[J]. Stat. Prob. Lett., 2004, 66: 399–405. DOI:10.1016/j.spl.2003.11.010
[11] Sung S H, Hu T C, Volodin A I. More on complete convergence for arrays[J]. Stat. Prob. Lett., 2005, 71: 303–311. DOI:10.1016/j.spl.2004.11.006
[12] Kruglov V M, Volodin A I, Hu T C. On complete convergence for arrays[J]. Stat. Prob. Lett., 2006, 76: 1631–1640. DOI:10.1016/j.spl.2006.04.006
[13] Chen P Y, Hu T C, Liu X, Volodin A. On complete convergence for arrays of row-wise negatively associated random variables[J]. Theory Probab. Appl., 2008, 52(2): 323–328. DOI:10.1137/S0040585X97983079
[14] Qiu Dehua, Chang Kuangchao, Antonini R G, Volodin A. On the strong rates of convergence for arrays of rowwise negatively dependent random variables[J]. Stoch. Anal. Appl., 2011, 29: 375–385. DOI:10.1080/07362994.2011.548683
[15] Wang Xuejun, Deng Xin, Zheng Lulu, Hu Shuhe. Complete convergence for arrays of rowwise negatively superadditive-dependent random variables and its applications[J]. Statistics, 2014, 48(4): 834–850.
[16] Qiu Dehua. Complete convergence for arrays of rowwise NA random variables[J]. J. Math., 2013, 33(1): 138–146.