Let $\{X_n; n\ge 1\}$ be a sequence of random variables defined on a probability space $(\Omega ,F,P)$. For positive integers $n\le m$, let $F_{n}^{m}$ denote the $\sigma $-algebra generated by the random variables $X_n, X_{n+1},\cdots ,X_m$. For a nonempty set $S\subset \mathbb{N}$, define $F_S=\sigma (X_i; i\in S)$. Given two $\sigma $-algebras $\psi ,\zeta $ in $F$, note that
$$\tilde{\varphi }(\psi ,\zeta )=\sup \left\{ \left| P(B\mid A)-P(B) \right|:A\in \psi ,\ P(A)>0,\ B\in \zeta \right\},$$
and define the $\tilde{\varphi }$-mixing coefficients by
$$\tilde{\varphi }(n)=\sup \left\{ \tilde{\varphi }(F_S,F_T):\ \text{finite subsets } S,T\subset \mathbb{N} \text{ with } \operatorname{dist}(S,T)\ge n \right\},\quad n\ge 0.$$
Obviously, $0\le \tilde{\varphi }(n+1)\le \tilde{\varphi }(n)\le 1$ for $n\ge 0$, and $\tilde{\varphi }(0)=1$.
Definition 1.1 A sequence of random variables $\{{{X}_{n}};n\ge 1\}$ is said to be a $\tilde{\varphi }$-mixing sequence of random variables if there exists $k\in \mathbb{N}$ such that $\tilde{\varphi }(k)<1$.
Note that if $\{{{X}_{n}};n\ge 1\}$ is a sequence of independent random variables, then $\tilde{\varphi }(n)=0$ for all $n\ge 1$.
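Indeed, if the $X_i$ are independent, then for any disjoint nonempty index sets $S,T\subset \mathbb{N}$, the $\sigma $-algebras $F_S$ and $F_T$ are independent, so for any $A\in F_S$ with $P(A)>0$ and any $B\in F_T$,
$$P(B\mid A)=\frac{P(AB)}{P(A)}=\frac{P(A)P(B)}{P(A)}=P(B),$$
whence $\left| P(B\mid A)-P(B) \right|=0$ for every admissible pair and thus $\tilde{\varphi }(n)=0$ for all $n\ge 1$.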
The concept of $\tilde{\varphi }$-mixing was introduced by Wu and Lin [1], and since then a number of publications have been devoted to $\tilde{\varphi }$-mixing random variables. For example, we refer to Wu and Lin [1] for the complete convergence and strong law of large numbers in the identically distributed case, Wang and Hu et al. [2] for the convergence properties of partial sums, Wang et al. [3] for the strong law of large numbers and growth rate, Jiang and Wu [4] for weak convergence and complete convergence, and Shen and Wang et al. [5] for the strong convergence properties of sums of products.
The main purpose of this paper is to study the complete convergence and strong law of large numbers for weighted sums of $\tilde{\varphi }$-mixing random variables without the assumption of identical distribution. The results obtained not only generalize and extend the corresponding results for independent and identically distributed random variables to the case of $\tilde{\varphi }$-mixing random variables, but also improve the almost sure convergence result of Wu and Lin [1] under a mild condition on the weights.
Throughout this paper, $C$ denotes a positive constant whose value may change from one appearance to the next, and $a_n=O(b_n)$ means $a_n\le C b_n$. We assume that $\phi (x)$ is a positive increasing function on $(0,\infty )$ such that $\phi (x)\uparrow \infty $ as $x\to \infty $, and that $\varphi (x)$ is the inverse function of $\phi (x)$. Since $\phi (x)\uparrow \infty $ as $x\to \infty $, it follows that $\varphi (x)\uparrow \infty $ as $x\to \infty $. For convenience, we set $\phi (0)=\varphi (0)=0$.
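For instance, an illustrative (and by no means the only admissible) choice is $\phi (x)=x^{p}$ for some $p>0$, whose inverse is $\varphi (x)=x^{1/p}$; in this case
$$E\left[ \phi \left( \left| X \right| \right) \right]<\infty \iff E{{\left| X \right|}^{p}}<\infty ,$$
so the assumption $E\left[ \phi \left( \left| X \right| \right) \right]<\infty $ is simply a moment condition on $X$.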
To obtain our results, the following lemmas are needed.
Lemma 1.1 Let $\{X_n; n\ge 1\}$ be a sequence of $\tilde{\varphi }$-mixing random variables with $EX_n=0$ and $E{{\left| X_n \right|}^{r}}<\infty $ for some $r\ge 1$ and all $n\ge 1$. Then there exists a constant $C=C(r,\tilde{\varphi }(k))$ depending only on $r$ and $\tilde{\varphi }(k)$ such that for any $n\ge 1$,
$$E{{\left| \sum\limits_{i=1}^{n}{X_i} \right|}^{r}}\le C\sum\limits_{i=1}^{n}{E{{\left| X_i \right|}^{r}}},\qquad 1\le r\le 2, \eqno(1.3)$$
$$E{{\left| \sum\limits_{i=1}^{n}{X_i} \right|}^{r}}\le C\left\{ \sum\limits_{i=1}^{n}{E{{\left| X_i \right|}^{r}}}+{{\left( \sum\limits_{i=1}^{n}{EX_i^{2}} \right)}^{r/2}} \right\},\qquad r>2. \eqno(1.4)$$
Proof The proof can be found in Wu [6], so we omit it.
Lemma 1.2 Assume that the inverse function $\varphi (x)$ of $\phi (x)$ satisfies
If $E\left[ \phi \left( \left| X \right| \right) \right]<\infty $, then
Proof The proof is similar to that of Lemma 1 of Sung [7], so we omit it.
The following lemma is well known; see, for example, [6].
Lemma 1.3 Let $\{X_n; n\ge 1\}$ be a sequence of random variables. If there exists a random variable $X$ such that
$$P\left( \left| X_n \right|>x \right)\le CP\left( \left| X \right|>x \right)$$
for all $n\ge 1$ and all $x\ge 0$, then the sequence $\{X_n; n\ge 1\}$ is said to be stochastically dominated by the random variable $X$. In this case, for all $\beta >0$ and $t>0$,
$$E{{\left| X_n \right|}^{\beta }}I\left( \left| X_n \right|\le t \right)\le C\left\{ E{{\left| X \right|}^{\beta }}I\left( \left| X \right|\le t \right)+{{t}^{\beta }}P\left( \left| X \right|>t \right) \right\},$$
$$E{{\left| X_n \right|}^{\beta }}I\left( \left| X_n \right|>t \right)\le CE{{\left| X \right|}^{\beta }}I\left( \left| X \right|>t \right).$$
Theorem 2.1 Let $\{{{X}_{n}};n\ge 1\}$ be a sequence of $\tilde{\varphi }$-mixing random variables which is stochastically dominated by a random variable $X$. Suppose that $EX=0$, $E{{\left| X \right|}^{r}}<\infty $ for $1<r\le 2$ and $E\left[ \phi \left( \left| X \right| \right) \right]<\infty $. Assume that the inverse function $\varphi (x)$ of $\phi (x)$ satisfies (1.5). Let $\{{{a}_{ni}};n\ge 1,i\ge 1\}$ be an array of constants such that
(1) $\max\limits_{1\le i\le n}\left| {{a}_{ni}} \right|=O\left( \frac{1}{\varphi (n)} \right)$;
(2) $\sum\limits_{i=1}^{n}{|a_{ni}|^{r}}=O\left( {{\log }^{-1-\alpha }}n \right) \quad \textrm{ for } 1<r\le 2 \quad \textrm{ and some } \alpha >r$. Then
for all $\varepsilon >0$.
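To illustrate conditions (1) and (2), one admissible choice (assuming the corresponding $\phi $ also satisfies (1.5)) is $\phi (x)=x^{r}$, so that $\varphi (n)=n^{1/r}$, together with the weights
$$a_{ni}=n^{-1/r}{{\left( \log n \right)}^{-(1+\alpha )/r}},\qquad 1\le i\le n,\ n\ge 2.$$
Then $\max\limits_{1\le i\le n}\left| a_{ni} \right|=n^{-1/r}{{\left( \log n \right)}^{-(1+\alpha )/r}}=O\left( \frac{1}{\varphi (n)} \right)$ and
$$\sum\limits_{i=1}^{n}{{{\left| a_{ni} \right|}^{r}}}=n\cdot n^{-1}{{\left( \log n \right)}^{-1-\alpha }}={{\log }^{-1-\alpha }}n,$$
so both conditions hold.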
Proof For $n\ge 1$, define
It is easy to check that
Hence
First, we prove that
It follows from $E\left[ \phi \left( \left| X \right| \right) \right]<\infty $, Lemma 1.2, Lemma 1.3 and Kronecker's lemma that
From $EX=0$, $\max\limits_{1\le i\le n}\left| {{a}_{ni}} \right|=O\left( \frac{1}{\varphi (n)} \right)$, Lemma 1.3, (2.4) and $\varphi (n)\uparrow \infty $, we get that
as $n\to \infty $, which implies that (2.3) holds. It then follows from (2.2) and (2.3) that
for $n$ large enough.
Hence, to prove (2.1), it suffices to show that
It follows from Markov's inequality, (1.3) of Lemma 1.1, Lemma 1.3, $E{{\left| X \right|}^{r}}<\infty $, $\alpha >r$ and the condition $\sum\limits_{i=1}^{n}{\left| a_{ni} \right|^{r}}=O\left( {{\log }^{-1-\alpha }}n \right)$ that
It follows from $E\left[ \phi \left( \left| X \right| \right) \right]<\infty $ and Lemma 1.3 that
The proof of Theorem 2.1 is completed.
Theorem 2.2 Let $\{{{X}_{n}};n\ge 1\}$ be a sequence of $\tilde{\varphi }$-mixing random variables which is stochastically dominated by a random variable $X$. Suppose that $EX=0$, $E{{\left| X \right|}^{r}}<\infty $ for $r>2$ and $E\left[ \phi \left( \left| X \right| \right) \right]<\infty $. Assume that the inverse function $\varphi (x)$ of $\phi (x)$ satisfies (1.5). Let $\{{{a}_{ni}};n\ge 1,i\ge 1\}$ be an array of constants such that
(2) $\sum\limits_{i=1}^{n}{|a_{ni}|^{r}}=O\left( {{\log }^{-1-\alpha }}n \right) \quad \textrm{ for } r>2 \quad \textrm{ and some } \alpha >r$. Then (2.1) holds true.
Proof The proof is similar to that of Theorem 2.1. We only need to prove that
The proof of Theorem 2.2 is completed.