数学杂志 2014, Vol. 34, Issue 6: 1015-1024
A STRONG CHERNOFF BOUND DERIVED FROM EQUITABLE COLORINGS OF GRAPHS
WANG Tao1, LIU Ming-ju2, LI De-ming3    
1. Dept. of Foundation, North China Institute of Science and Technology, Sanhe 065201, China;
2. LMIB and Department of Mathematics, Beihang University, Beijing 100191, China;
3. Department of Mathematics, Capital Normal University, Beijing 100048, China
Abstract: In this paper, we present a strong Chernoff bound by using the existence of small-sized equitable colorings of graphs. The case considered here is that of sums of random variables with dependence. Our result improves the known results as far as we know.
Key words: coloring     dependence graph     Chernoff bounds    
A Strong Chernoff Bound Derived from Equitable Colorings
WANG Tao1, LIU Ming-ju2, LI De-ming3
1. Department of Foundation, North China Institute of Science and Technology, Sanhe, Hebei 065201, China;
2. School of Mathematics and Systems Science, Beihang University, Beijing 100191, China;
3. School of Mathematical Sciences, Capital Normal University, Beijing 100048, China
Abstract: In this paper we study the Chernoff bound problem for dependent random variables. By constructing a graph from the dependent variables and applying results on equitable colorings, we obtain a stronger Chernoff bound, extending the bound of the Chernoff inequality to sums of dependent random variables.
Key words: equitable coloring    dependence graph    Chernoff bound
1 Introduction

In 1952, Chernoff [3] introduced a technique that gives sharp upper bounds on the tail of the distribution of sums of independent binary random variables. Since then, Chernoff bounds have become fundamental tools for bounding the tail probabilities of sums of bounded, independent random variables, and the bounds take many different forms under distinct settings. As the bound is so important, we state it as a theorem.

Theorem A  Let $X=\{X_1, X_2, \cdots, X_n\}$ be a set of $n$ mutually independent binary random variables. Let $S=\sum\limits_{i=1}^nX_i$ and $\mu=E[S]$. For any $\varepsilon>0$,

$ \Pr(S\le \mu(1-\varepsilon))\le e^{-\frac{\mu\varepsilon^2}{2}} $ (1.1)

and

$ \Pr(S\ge \mu(1+\varepsilon))\le (e^{\varepsilon}(1+\varepsilon)^{-(1+\varepsilon)})^{\mu}. $ (1.2)
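For a concrete sense of these bounds, the following minimal Python sketch (ours, with arbitrarily chosen $n$, $p$ and $\varepsilon$) evaluates the upper tail bound (1.2) and compares it with the exact binomial tail probability.

```python
import math

def chernoff_upper(mu, eps):
    # Bound (1.2): Pr(S >= mu(1+eps)) <= (e^eps * (1+eps)^{-(1+eps)})^mu.
    return (math.exp(eps) * (1 + eps) ** (-(1 + eps))) ** mu

def binom_upper_tail(n, p, k):
    # Exact Pr(BIN(n, p) >= k), summing the binomial pmf.
    return sum(math.comb(n, i) * p ** i * (1 - p) ** (n - i)
               for i in range(k, n + 1))

n, p, eps = 100, 0.1, 0.5            # illustrative values; mu = np = 10
mu = n * p
print(chernoff_upper(mu, eps))                             # the bound, about 0.34
print(binom_upper_tail(n, p, math.ceil(mu * (1 + eps))))   # exact tail, noticeably smaller
```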

By Chernoff's technique, one can obtain better and more efficiently evaluated bounds on the tail probabilities. The following theorem can be found in Alon and Spencer's book [1].

Theorem B  Let $S$ be a random variable with Poisson distribution and $E(S)=\mu$. For any $\varepsilon>0$, we have the following inequalities:

$ \Pr(S\le \mu(1-\varepsilon))\le e^{-\frac{\mu\varepsilon^2}{2}} $ (1.3)

and

$ \Pr(S\ge \mu(1+\varepsilon))\le (e^{\varepsilon}(1+\varepsilon)^{-(1+\varepsilon)})^{\mu}. $ (1.4)

Theorem B yields good applications whenever a good bound on the tail probability of $S$ is available. For instance, Schmidt, Siegel and Srinivasan [9] obtained a bound on the tail probability of $S$ when the $X_i$'s are $k$-wise independent for $k$ smaller than a certain function of $n$, $\varepsilon$ and $\mu$; $k$-wise independence means that any $k$-subset of $X$ consists of mutually independent random variables. An important idea for deriving bounds on the tail probability of $S$ when the $X_i$'s are not mutually independent involves the theory of martingales, via the so-called Azuma-Hoeffding inequality.

A dependence graph $G(V, E)$ for a set of random variables $X=\{X_1, X_2, \cdots, X_n\}$ is a graph with vertex set $V=\{X_1, X_2, \cdots, X_n\}$ and edge set $E$ such that for each edge $X_iX_j$, the variables $X_i$ and $X_j$ are not mutually independent. For any non-negative integer $d$, we say that $X$ exhibits $d$-bounded dependence if $\{X_1, X_2, \cdots, X_n\}$ has a dependence graph with maximum vertex degree $d$. The notion of a dependence graph is used widely in random graphs and randomized algorithms; for more detailed information, refer to [1, 5, 6, 10]. Recall that the notion of a dependence graph of random variables also appears in the hypothesis of the Lovász local lemma.
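As a small illustration (ours, with a hypothetical edge set chosen only for this example), a dependence graph can be stored as an adjacency map, and $d$-bounded dependence read off as the maximum vertex degree:

```python
# A hypothetical dependence graph on X_0, ..., X_4, stored as an adjacency map;
# an edge means the two variables are not mutually independent.
dependence = {
    0: {1},
    1: {0, 2},
    2: {1},
    3: {4},
    4: {3},
}

# The set exhibits d-bounded dependence for d = maximum vertex degree.
d = max(len(neighbors) for neighbors in dependence.values())
print(d)  # 2 here, so Lemma 1 below guarantees an equitable (d + 1)-coloring
```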

Let $g^{+}(\mu, \varepsilon)=e^{\mu \varepsilon}\cdot (1+\varepsilon)^{-\mu(1+\varepsilon)}$ and $g^{-}(\mu, \varepsilon)=e^{-\mu\varepsilon^2/2}$.

Pemmaraju showed the following results in [7, 8].

Theorem C  Let $X=\{X_1, X_2, \cdots, X_n\}$ be a set of $n$ identically distributed binary random variables with a $d$-bounded dependence graph. Let $S=\sum\limits_{i=1}^{n}X_i$ and $\mu=E(S)$. For any $\varepsilon$ with $0<\varepsilon\le 1$, the following holds:

$ \Pr(S\ge \mu(1+\varepsilon))\le \frac{4(d+1)}{e}\cdot g^{+}(\mu, \varepsilon)^{\frac{1}{d+1}} $ (1.5)

and

$ \Pr(S\le \mu(1-\varepsilon))\le\frac{4(d+1)}{e}\cdot g^{-}(\mu, \varepsilon)^{\frac{1}{d+1}}. $ (1.6)
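For later comparison with Theorem 1, a direct transcription of $g^{+}$, $g^{-}$ and the bounds of Theorem C (a sketch of ours) is:

```python
import math

def g_plus(mu, eps):
    # g+(mu, eps) = e^{mu*eps} * (1+eps)^{-mu(1+eps)}
    return math.exp(mu * eps) * (1 + eps) ** (-mu * (1 + eps))

def g_minus(mu, eps):
    # g-(mu, eps) = e^{-mu*eps^2/2}
    return math.exp(-mu * eps ** 2 / 2)

def theorem_c_upper(mu, eps, d):
    # Bound (1.5): Pr(S >= mu(1+eps)) <= 4(d+1)/e * g+(mu, eps)^{1/(d+1)}.
    return 4 * (d + 1) / math.e * g_plus(mu, eps) ** (1 / (d + 1))

def theorem_c_lower(mu, eps, d):
    # Bound (1.6): Pr(S <= mu(1-eps)) <= 4(d+1)/e * g-(mu, eps)^{1/(d+1)}.
    return 4 * (d + 1) / math.e * g_minus(mu, eps) ** (1 / (d + 1))

# For small mu the right-hand side may exceed 1, making the bound trivial.
print(theorem_c_upper(100, 0.5, 2))
```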

An equitable $t$-coloring of a graph $G$ is a proper $t$-coloring in which any two color classes differ in size by at most one. The notion of equitable colorings of graphs is used in the proof of the above result. Lemma 1 was conjectured by P. Erdős and proved by Hajnal and Szemerédi in [4]. Lemma 2 was proved by Bollobás and Guy in [2].

Lemma 1  A graph $G$ with maximum degree $\Delta$ has an equitable $(\Delta+1)$-coloring.

Lemma 2  A tree $T$ with $n$ vertices is equitably 3-colorable if $n\ge 3\Delta(T)- 8$ or $n = 3\Delta(T) - 10$.

Here, as in [7], we also use Lemma 1 to derive bounds on the tail probabilities of sums of dependent binary random variables. The structure of the paper is as follows. In Section 2, we give two results on the bounds on the tail probabilities of sums of independent binary random variables. Section 3 contains the main result and its proof. An application of Lemma 2 then yields a good bound for tree-structured dependence graphs.

2 Preliminaries

Lemmas 5 and 6 below are two results on the bounds on the tail probabilities of sums of independent binary random variables. Lemma 5 is stronger than the Chernoff bounds in some sense and independent of $n$. Lemma 6 is stronger than the Chernoff bounds for some $k$ and $\lambda$ and easy to compute. Both will be used in the next section.

Lemma 3  Let $\lambda$ be a non-negative real number and $k$ be a non-negative integer such that $(k-\lambda)^2> k$. If $x\ge k$ and $x>\lambda> 0$, then we have

$ \begin{equation} F(x)=\frac{x+1-\lambda}{x+1-k}\cdot \left(\frac{x}{x+1}\right)^x\cdot \left(\frac{x+1-\lambda}{x-\lambda}\right)^{(x-k)}> 1.\end{equation} $ (2.1)

Proof  Let $A(x)=(x+1-\lambda)/(x+1-k)$, $B(x)=(\frac{x}{x+1})^x$ and $C(x)=(\frac{x+1-\lambda}{x-\lambda})^{(x-k)}$. And then

$ \begin{eqnarray}&& A'(x)=\left(1+\frac{k-\lambda}{x+1-k}\right)'=\frac{\lambda-k}{(x+1-k)^2}, \end{eqnarray} $ (2.2)
$ \begin{eqnarray} B'(x)=B(x)\cdot \left\{\ln\frac{x}{1+x}+\frac{1}{1+x}\right\}\end{eqnarray} $ (2.3)

and

$ \begin{equation} C'(x)=C(x)\cdot \left\{ \ln\frac{x+1-\lambda}{x-\lambda}-\frac{x-k}{x+1-\lambda}\cdot \frac{1}{x-\lambda}\right\}. \end{equation} $ (2.4)

So, we have the following:

$ \begin{eqnarray} F'(x)&=& B(x)C(x)\left\{A'(x)+A(x)\left[\ln\frac{x}{1+x}+\frac{1}{1+x}+\ln\left(1+\frac{1}{x-\lambda}\right)-\frac{x-k}{(x+1-\lambda)(x-\lambda)}\right]\right\}\nonumber\\ &=&B(x)C(x)\left\{A'(x)+A(x)\left[\ln\left(1+\frac{\lambda}{(x+1)(x-\lambda)}\right)+\frac{1}{x+1}-\frac{x-k}{(x+1-\lambda)(x-\lambda)}\right]\right\} \end{eqnarray} $ (2.5)

by $\ln\frac{x}{1+x}+\ln(1+\frac{1}{x-\lambda})=\ln(1+\frac{\lambda}{(x+1)(x-\lambda)})$.

Let

$ \begin{eqnarray} G(x)=(x+1-k)^2\cdot \frac{F'(x)}{B(x)C(x)} = (x+1-k)^2\left\{A'(x)+A(x)\left[\ln\left(1+\frac{\lambda}{(x+1)(x-\lambda)}\right)+\frac{1}{x+1}-\frac{x-k}{(x+1-\lambda)(x-\lambda)}\right]\right\}. \end{eqnarray} $ (2.6)

Since $\ln(1+y)\le y$ for all $y\ge 0$, we have

$ \begin{eqnarray} G(x)&\le& (x+1-k)^2\left\{\frac{\lambda-k}{(x+1-k)^2} +\frac{x+1-\lambda}{x+1-k}\left[\frac{\lambda}{(x+1)(x-\lambda)}+\frac{1}{x+1}-\frac{x-k}{(x+1-\lambda)(x-\lambda)}\right]\right\}\nonumber\\ &=&(x+1-k)^2\left\{\frac{\lambda-k}{(x+1-k)^2} +\frac{x+1-\lambda}{x+1-k}\left[\frac{x}{(x+1)(x-\lambda)}-\frac{x-k}{(x+1-\lambda)(x-\lambda)}\right]\right\}\nonumber\\ &=&\frac{(\lambda-k)(x+1)(x-\lambda)+x(x+1-k)(x+1-\lambda)-(x-k)(x+1-k)(x+1)}{(x+1)(x-\lambda)}\nonumber\\ &=&\frac{(k-(k-\lambda)^2)x+(\lambda+1)k-k^2-\lambda^2}{(x+1)(x-\lambda)}. \end{eqnarray} $ (2.7)

By $k< (\lambda-k)^2$, we have the following:

$ \begin{equation} (\lambda+1)k-k^2-\lambda^2< (\lambda+1)k-(k+2k\lambda)=-k\lambda\le 0.\end{equation} $ (2.8)

For $0<\lambda< x$, we have

$ \begin{equation}(k-(k-\lambda)^2)x+(\lambda+1)k-k^2-\lambda^2< 0.\end{equation} $ (2.9)

Since $(x+1)(x-\lambda)>0$ for $x>\lambda$, it follows that $G(x)<0$, and hence $F'(x)< 0$.

Note that $\lim\limits_{x\rightarrow +\infty}A(x)=1$, $\lim\limits_{x\rightarrow +\infty}B(x)=e^{-1}$ and $\lim\limits_{x\rightarrow +\infty}C(x)=e$, so $\lim\limits_{x\rightarrow +\infty}F(x)=1$. Since $F$ is strictly decreasing and tends to 1, we have $F(x)> 1$.
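The monotonicity and the limit can also be checked numerically; the following sketch (ours, with arbitrary parameters satisfying $(k-\lambda)^2>k$) evaluates $F$ from (2.1):

```python
import math

def F(x, k, lam):
    # F(x) from (2.1); assumes x >= k, x > lam > 0 and (k - lam)^2 > k.
    A = (x + 1 - lam) / (x + 1 - k)
    B = (x / (x + 1)) ** x
    C = ((x + 1 - lam) / (x - lam)) ** (x - k)
    return A * B * C

k, lam = 9, 4.0                      # (9 - 4)^2 = 25 > 9
for x in [9, 20, 100, 1000, 10000]:
    print(x, F(x, k, lam))           # decreases monotonically towards 1, stays above 1
```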

Lemma 4  Let ${\rm BIN}(n, \lambda/n)$ be the sum of $n$ independent Bernoulli variables, each of which is equal to 1 with probability $\lambda/n$ and 0 otherwise. Let $k$ be a non-negative integer such that $(k-\lambda)^2> k$. Then we have

$ \begin{equation} {\rm BIN}(k; n, \lambda/n)< \frac{\lambda^ke^{-\lambda}}{k!}, \end{equation} $ (2.10)

where ${\rm BIN}(k; n, \lambda/n)=\Pr({\rm BIN}(n, \lambda/n)=k)$.

Proof  By Lemma 3, we have

$ \begin{eqnarray} \frac{{\rm BIN}(k; n+1, \lambda/(n+1))}{{\rm BIN}(k; n, \lambda/n)}&=& \frac{\binom{n+1}{k}}{\binom{n}{k}}\cdot \left(\frac{\lambda/(n+1)}{\lambda/n}\right)^k\cdot \frac{(1-\lambda/(n+1))^{n+1-k}}{(1-\lambda/n)^{n-k}}\nonumber\\ &=& \frac{n+1-\lambda}{n+1-k}\cdot \left(\frac{n}{n+1}\right)^n\cdot \left(\frac{n+1-\lambda}{n-\lambda}\right)^{n-k}\nonumber\\ &=& F(n)> 1. \end{eqnarray} $ (2.11)

Note that $\lim\limits_{n\rightarrow+\infty}{\rm BIN}(k; n, \lambda/n)=\frac{\lambda^ke^{-\lambda}}{k!}$. Since the ratio in (2.11) exceeds 1, the sequence ${\rm BIN}(k; n, \lambda/n)$ is strictly increasing in $n$, and the conclusion follows.
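Numerically, Lemma 4 says that the binomial probabilities increase strictly in $n$ towards their Poisson limit; a quick check (ours, with illustrative parameters) is:

```python
import math

def bin_pmf(k, n, p):
    # Binomial pmf: BIN(k; n, p).
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    # Poisson pmf: lam^k e^{-lam} / k!.
    return lam ** k * math.exp(-lam) / math.factorial(k)

k, lam = 9, 4.0                       # (k - lam)^2 = 25 > 9 = k
for n in [20, 50, 200, 1000]:
    print(n, bin_pmf(k, n, lam / n))  # increases with n ...
print(poisson_pmf(k, lam))            # ... but stays below the Poisson limit
```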

Lemma 5  Let ${\rm BIN}(n, \lambda/n)$ be the sum of $n$ independent Bernoulli variables, each of which is equal to 1 with probability $\lambda/n$ and 0 otherwise. Let $(k-\lambda)^2>k$, where $k$ is a non-negative integer and $0<\lambda<n$. Then we have the following:

1. If $k<\lambda$, then

$\begin{eqnarray*} \Pr({\rm BIN}(n, \lambda/n)\le k)<\sum\limits_{i=0}^{k}\frac{\lambda^ie^{-\lambda}}{i!}.\end{eqnarray*} $ (2.12)

2. If $k>\lambda$, then

$\begin{eqnarray*} \Pr({\rm BIN}(n, \lambda/n)\ge k)<\sum\limits_{i=k}^{+\infty}\frac{\lambda^ie^{-\lambda}}{i!}.\end{eqnarray*} $ (2.13)

Proof  It follows immediately from Lemma 4.

Lemma 6  Let ${\rm BIN}(n, \lambda/n)$ be the sum of $n$ independent Bernoulli variables, each of which is equal to 1 with probability $\lambda/n$ and 0 otherwise. Let $(k-\lambda)^2>k$, where $k$ is a non-negative integer and $0<\lambda<n$. Then we have the following:

1. If $k<\lambda$, then

$ \Pr({\rm BIN}(n, \lambda/n)\le k)<\frac{\lambda^{k+1}e^{-\lambda}}{(\lambda-k)k!}, $ (2.14)

2. If $k>\lambda$, then

$ \Pr({\rm BIN}(n, \lambda/n)\ge k)<\frac{k+1}{(k+1-\lambda)}\cdot \frac{\lambda^ke^{-\lambda}}{k!}. $ (2.15)

Proof  1. If $n>\lambda>k\ge m$, then

$ \frac{{\rm BIN}(m-1; n, \lambda/n)}{{\rm BIN}(m; n, \lambda/n)}=\frac{(n-\lambda)m}{(n-m+1)\lambda}\le \frac{m}{\lambda}\le \frac{k}{\lambda}. $ (2.16)

By (2.16), for $0\le i\le k$,

$ {\rm BIN}(i; n, \lambda/n)\le \left(\frac{k}{\lambda}\right)^{k-i}\cdot {\rm BIN}(k; n, \lambda/n) $

and

$ 1+\frac{k}{\lambda}+\cdots +\left(\frac{k}{\lambda}\right)^{k}=\frac{1-(k/\lambda)^{k+1}}{1-k/\lambda}<\frac{\lambda}{\lambda-k}. $ (2.17)

And so

$ \Pr({\rm BIN}(n, \lambda/n)\le k)\le\frac{\lambda}{\lambda-k}{\rm BIN}(k; n, \lambda/n)< \frac{ \lambda^{k+1}e^{-\lambda}}{(\lambda-k)\cdot k!}. $

2. If $0<\lambda<k\le m$, then

$ \frac{{\rm BIN}(m+1; n, \lambda/n)}{{\rm BIN}(m; n, \lambda/n)}=\frac{(n-m)\lambda}{(n-\lambda)(m+1)}\le \frac{\lambda}{m+1}\le \frac{\lambda}{k+1}. $ (2.18)

By (2.18), for $k\le i\le n$,

$ {\rm BIN}(i; n, \lambda/n)\le \left(\frac{\lambda}{k+1}\right)^{i-k}\cdot {\rm BIN}(k; n, \lambda/n) $

and

$ 1+\frac{\lambda}{k+1}+\cdots +\left(\frac{\lambda}{k+1}\right)^{n-k}=\frac{1-(\lambda/(k+1))^{n-k+1}}{1-\lambda/(k+1)}<\frac{k+1}{k+1-\lambda}. $ (2.19)

And then,

$ \Pr({\rm BIN}(n, \lambda/n)\ge k)\le\frac{k+1}{k+1-\lambda}{\rm BIN}(k; n, \lambda/n)< \frac{(k+1) \lambda^{k}e^{-\lambda}}{(k+1-\lambda)\cdot k!}. $

Clearly, the results of Lemmas 5 and 6 are stronger than those of Theorems A and B and easier to handle.
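As an illustration of the computational convenience, the closed-form bounds of Lemma 6 can be evaluated directly; a minimal sketch (ours, with illustrative parameters satisfying $(k-\lambda)^2>k$):

```python
import math

def lemma6_lower(lam, k):
    # Bound (2.14): Pr(BIN(n, lam/n) <= k) < lam^{k+1} e^{-lam} / ((lam - k) k!).
    assert 0 <= k < lam and (k - lam) ** 2 > k
    return lam ** (k + 1) * math.exp(-lam) / ((lam - k) * math.factorial(k))

def lemma6_upper(lam, k):
    # Bound (2.15): Pr(BIN(n, lam/n) >= k) < (k+1)/(k+1-lam) * lam^k e^{-lam} / k!.
    assert k > lam > 0 and (k - lam) ** 2 > k
    return (k + 1) / (k + 1 - lam) * lam ** k * math.exp(-lam) / math.factorial(k)

print(lemma6_upper(4.0, 9))   # upper-tail bound, valid for every n
print(lemma6_lower(9.0, 4))   # lower-tail bound; (4 - 9)^2 = 25 > 4
```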

3 Main Results

Let $X=\{X_1, X_2, \cdots, X_n\}$ be a set of $n$ binary random variables and $G(V, E)$ be the dependence graph associated with $X$. Assume that $G$ admits an equitable $t$-coloring, and let $C_1, C_2, \cdots, C_t$ be the color classes of the coloring. Then we have two different ways to impose restrictions on the elements of $X$, namely Models 1 and 2.

Model 1  The sum of the expected values of the $X_i$'s is a constant. For any color class $C_i$, $1\le i\le t$, the elements of $C_i$ are identically distributed.

Model 2  The sum of the expected values of the $X_i$'s is a constant and $E(X_i)=E(X_j)$ for $1\le i<j\le n$.

Lemma 7  Let $X=\{X_1, X_2, \cdots, X_n\}$ be a set of $n$ binary random variables with dependence graph $G(V, E)$. Suppose that $G$ can be colored equitably with at most $t$ colors. For $(k-t-\mu)^2>kt-t^2$ and $k-t>\mu>0$, we have the upper tail bound

$\begin{eqnarray*} \Pr(S\ge k)<\sum\limits_{i\ge \frac{k}{t}-1}^{+\infty}\frac{\mu^ie^{-\mu/t}}{t^{(i-1)}i!};\end{eqnarray*} $ (3.1)

For $(k+t-\mu)^2>kt+t^2$ and $k+t<\mu<\lceil n/t\rceil$, we have the lower tail bound

$\begin{eqnarray*} \Pr(S\le k)<\sum\limits_{i=0 }^{\frac{k}{t}+1}\frac{\mu^ie^{-\mu/t}}{t^{(i-1)}i!}.\end{eqnarray*} $ (3.2)

Proof  We treat the two models separately.

Model 1  Let $C_1, C_2, \cdots, C_t$ be the $t$ color classes in an equitable $t$-coloring of $G$. For each $j\in\{1, 2, \cdots, t\}$, let $E(\sum_{i\in C_j}X_i)=\mu/t$, so that $E(X_i)=\mu/(t{|C_j|})$ for $i\in C_j$. We now rewrite the event $S\ge k$ as follows.

$S\ge k$ is equivalent to $S\ge \frac{k}{n}\cdot n$, which is equivalent to

$\begin{eqnarray*} S\ge \frac{k}{n}\cdot\sum\limits_{i\in [t]}|C_i|, \qquad {\rm i.e.}, \qquad \sum\limits_{i\in [t]}\sum\limits_{j\in C_i}X_j\ge \sum\limits_{i\in [t]}\frac{k}{n}\cdot|C_i|.\end{eqnarray*} $ (3.3)

Since

$\begin{eqnarray*} \sum\limits_{i\in [t]}\sum\limits_{j\in C_i}X_j\ge \sum\limits_{i\in [t]}\frac{k}{n}\cdot|C_i|\Rightarrow \exists i\in [t]: \sum\limits_{j\in C_i}X_j\ge \frac{k}{n}\cdot|C_i|.\end{eqnarray*} $ (3.4)

Hence

$\begin{eqnarray*} \Pr(\sum\limits_{i\in [t]}\sum\limits_{j\in C_i}X_j\ge \sum\limits_{i\in [t]}\frac{k}{n}\cdot|C_i|)\le \Pr(\exists i\in [t]: \sum\limits_{j\in C_i}X_j\ge \frac{k}{n}\cdot|C_i|)\end{eqnarray*} $ (3.5)

Moreover, since $|C_i|\ge\lfloor n/t\rfloor>\frac{n}{t}-1$ and $k\le n$, we have $\frac{k}{n}\cdot|C_i|\ge\frac{k}{t}-1$, and therefore

$\begin{eqnarray*} \Pr(\sum\limits_{j\in C_i}X_j\ge \frac{k}{n}\cdot|C_i|)\le \Pr(\sum\limits_{j\in C_i}X_j\ge \frac{k}{t}-1).\end{eqnarray*} $ (3.6)

So

$\begin{eqnarray*} \Pr(\exists i\in [t]: \sum\limits_{j\in C_i}X_j\ge \frac{k}{n}\cdot|C_i|)\le t\Pr(\sum\limits_{j\in C_i}X_j\ge \frac{k}{t}-1)\end{eqnarray*} $ (3.7)

For $(k-t-\mu)^2>kt-t^2$ and $k-t>\mu>0$, by Lemma 5 (the variables within a single color class form an independent set in $G$ and are assumed mutually independent), we have

$\begin{eqnarray*} \Pr(\sum\limits_{j\in C_i}X_j\ge \frac{k}{t}-1)<\sum\limits_{i\ge \frac{k}{t}-1}\frac{\mu^ie^{-\mu/t}}{t^{i}i!}.\end{eqnarray*} $

So $\Pr(S\ge k)<\sum_{i\ge \frac{k}{t}-1}\frac{\mu^ie^{-\mu/t}}{t^{i-1}i!}$.

Model 2  Let $C_1, C_2, \cdots, C_t$ be the $t$ color classes in an equitable $t$-coloring of $G$. Let

$ S=\sum\limits_{i=1}^{n}X_i, ~~~E(X_i)=\mu/n $

for $1\le i\le n$.

We now rewrite the event $S\ge k$ as follows:

$S\ge k$ is equivalent to $S\ge \frac{k}{n}\cdot n$, which is equivalent to

$\begin{eqnarray*} S\ge \frac{k}{n}\cdot\sum\limits_{i\in [t]}|C_i|.\end{eqnarray*} $ (3.8)

Since

$\begin{eqnarray*} \sum\limits_{i\in [t]}\sum\limits_{j\in C_i}X_j\ge \sum\limits_{i\in [t]}\frac{k}{n}\cdot|C_i|\Rightarrow \exists i\in [t]: \sum\limits_{j\in C_i}X_j\ge \frac{k}{n}\cdot|C_i|.\end{eqnarray*} $ (3.9)

Hence

$\begin{eqnarray*} \Pr(\sum\limits_{i\in [t]}\sum\limits_{j\in C_i}X_j\ge \sum\limits_{i\in [t]}\frac{k}{n}\cdot|C_i|)\le \Pr(\exists i\in [t]: \sum\limits_{j\in C_i}X_j\ge \frac{k}{n}\cdot|C_i|).\end{eqnarray*} $ (3.10)

Let $m=\lfloor n/t\rfloor$; we have

$\begin{eqnarray*} \Pr(\sum\limits_{j\in C_i, |C_i|=m}X_j\ge \frac{k}{t}-1)\ge \Pr(\sum\limits_{j\in C_i}X_j\ge \frac{k}{n}|C_i|).\end{eqnarray*} $ (3.11)

So

$\begin{eqnarray*} \Pr(\exists i\in [t]: \sum\limits_{j\in C_i}X_j\ge \frac{k}{n}\cdot|C_i|)\le \sum\limits_{i\in [t]}\Pr(\sum\limits_{j\in C_i}X_j\ge \frac{k}{n}|C_i|)\le t\Pr(\sum\limits_{j\in C_i, |C_i|=m}X_j\ge \frac{k}{t}-1).\end{eqnarray*} $ (3.12)

Since $\sum\limits_{j\in C_i, |C_i|=m}X_j\sim b(m, \mu/n)$, for $k>\mu/t$, we have

$\begin{eqnarray*} \Pr(\sum\limits_{j\in C_i, |C_i|=m}X_j=k)\le \binom{m}{k}(\frac{\mu}{mt})^k(1-\frac{\mu}{mt})^{m-k}\end{eqnarray*} $ (3.13)

For $(k-t-\mu)^2>kt-t^2$ and $k-t>\mu>0$, by Lemma 5, we have

$\begin{eqnarray*} \Pr(\sum\limits_{j\in C_i, |C_i|=m}X_j\ge \frac{k}{t}-1)<\sum\limits_{i\ge \frac{k}{t}-1}\frac{\mu^ie^{-\mu/t}}{t^{i}i!}.\end{eqnarray*} $

So $\Pr(S\ge k)<\sum\limits_{i\ge \frac{k}{t}-1}\frac{\mu^ie^{-\mu/t}}{t^{i-1}i!}$.

This completes the proof of the upper tail bound. The proof of the lower tail bound is similar and is omitted.

Lemma 8  Let $X=\{X_1, X_2, \cdots, X_n\}$ be a set of $n$ binary random variables with dependence graph $G(V, E)$. Suppose that $G$ can be colored equitably with at most $t$ colors and that $n/t$ is an integer. Then for $(k-\mu)^2>kt$ and $k>\mu>0$, we have the upper tail bound

$\begin{eqnarray*} \Pr(S\ge k)<\sum\limits_{i\ge \frac{k}{t}}\frac{\mu^ie^{-\mu/t}}{t^{i-1}i!};\end{eqnarray*} $ (3.14)

For $(k-\mu)^2>kt$ and $k<\mu<n/t$, we have the lower tail bound

$\begin{eqnarray*} \Pr(S\le k)<\sum\limits_{i=0}^{\frac{k}{t}}\frac{\mu^ie^{-\mu/t}}{t^{i-1}i!}.\end{eqnarray*} $ (3.15)

Theorem 1  Let $X=\{X_1, X_2, \cdots, X_n\}$ be a set of $n$ binary random variables with dependence graph $G(V, E)$. Let $d$ be the maximum degree of $G$. If

$ (k-d-1-\mu)^2>(d+1)(k-d-1) $

and $k-d-1>\mu>0$, then we have the upper tail bound

$\begin{eqnarray*} \Pr(S\ge k)<\sum\limits_{i\ge \frac{k}{d+1}-1}^{+\infty}\frac{\mu^ie^{-\mu/(d+1)}}{(d+1)^{(i-1)}i!}.\end{eqnarray*} $ (3.16)

If $(k+d+1-\mu)^2>(d+1)(k+d+1)$ and $k+d+1<\mu<\lceil n/(d+1)\rceil$, then we have the lower tail bound

$\begin{eqnarray*} \Pr(S\le k)<\sum\limits_{i=0}^{\frac{k}{d+1}+1} \frac{\mu^ie^{-\mu/(d+1)}}{(d+1)^{(i-1)}i!}.\end{eqnarray*} $ (3.17)

In fact, if $n/(d+1)$ is an integer, then for $(k-\mu)^2>k(d+1)$ and $k>\mu>0$,

$\begin{eqnarray*} \Pr(S\ge k)<\sum\limits_{i\ge \frac{k}{d+1}}^{+\infty}\frac{\mu^ie^{-\mu/(d+1)}}{(d+1)^{(i-1)}i!};\end{eqnarray*} $ (3.18)

and for $(k-\mu)^2>k(d+1)$ and $k<\mu<n/(d+1)$,

$\begin{eqnarray*} \Pr(S\le k)<\sum\limits_{i=0 }^{\frac{k}{d+1}}\frac{\mu^ie^{-\mu/(d+1)}}{(d+1)^{(i-1)}i!}.\end{eqnarray*} $ (3.19)

Proof  By the definition of $d$-bounded dependence, $X$ has a dependence graph $G$ with maximum vertex degree $d$. By Lemma 1, $G$ has an equitable $(d+1)$-coloring. Replacing $t$ by $d+1$ in Lemmas 7 and 8 yields the desired bounds.

Note that Lemmas 5 and 6 are special cases of Theorem 1.
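Numerically, the series in Theorem 1 is $d+1$ times an upper tail of a Poisson distribution with mean $\mu/(d+1)$; the following sketch (ours; the truncation length and the parameter values are pragmatic illustrative choices) evaluates the bound (3.18).

```python
import math

def theorem1_upper(k, mu, d, terms=200):
    # Bound (3.18), for n/(d+1) an integer:
    #   Pr(S >= k) < sum_{i >= k/(d+1)} mu^i e^{-mu/(d+1)} / ((d+1)^{i-1} i!),
    # which equals (d+1) times the Poisson(mu/(d+1)) tail from ceil(k/(d+1)).
    t = d + 1
    assert k > mu > 0 and (k - mu) ** 2 > k * t
    lam = mu / t
    i0 = math.ceil(k / t)
    term = lam ** i0 * math.exp(-lam) / math.factorial(i0)  # Poisson pmf at i0
    total = 0.0
    for i in range(i0, i0 + terms):
        total += term
        term *= lam / (i + 1)   # pmf recurrence: p(i+1) = p(i) * lam / (i+1)
    return t * total

print(theorem1_upper(k=40, mu=10, d=1))   # illustrative: t = 2, lam = 5
```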

The strength of the equitable coloring technique in deriving tail probability bounds lies in the fact that it allows us to focus closely on the structure of the dependence graph. In particular, a small equitable chromatic number for a dependence graph leads to sharp tail probability bounds. Not much seems to be known about the equitable chromatic number of different graph classes; the connection between equitable colorings and tail probability bounds presented here motivates further study of these numbers.

The following result comes ready-made from Bollobás and Guy [2]. Lemma 2 essentially implies that if $\Delta(T) \le n/3$, then $T$ can be equitably 3-colored. This immediately translates to the following tail probability bounds on tree-structured dependence graphs.

Theorem 2  Let $X=\{X_1, X_2, \cdots, X_n\}$ be a set of $n$ identically distributed random variables that have a tree-structured dependence graph with maximum degree no greater than $n/3$. Then we have the following bounds on the tail probabilities of the sum $S=\sum\limits_{i=1}^{n}X_i$:

If $(k-3-\mu)^2>3(k-3)$, $k-3>\mu>0$, then

$\begin{eqnarray*} \Pr(S\ge k)<\sum\limits_{i\ge \frac{k}{3}-1}\frac{\mu^ie^{-\mu/3}}{3^{i-1}i!};\end{eqnarray*} $ (3.20)

If $(k+3-\mu)^2>3(k+3)$, $k+3<\mu<\lceil n/3\rceil$, then

$\begin{eqnarray*} \Pr(S\le k)<\sum\limits_{i=0 }^{\frac{k}{3}+1}\frac{\mu^ie^{-\mu/3}}{3^{i-1}i!}.\end{eqnarray*} $ (3.21)
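The bound of Theorem 2 can be evaluated with the sketch given after Theorem 1, since (3.20) is the series of Theorem 1 with $t=3$ and the summation shifted to start at $k/3-1$; the shifted call below (our illustrative values) also reproduces exactly the hypothesis $(k-3-\mu)^2>3(k-3)$.

```python
# Theorem 2 is the case t = 3 (i.e. d = 2); in (3.20) the series starts at
# k/3 - 1 rather than k/3, so shift k by t = 3 in the sketch above.
# Illustrative values (ours); reuses theorem1_upper from the previous sketch.
print(theorem1_upper(k=60 - 3, mu=12, d=2))   # bounds Pr(S >= 60) via (3.20)
```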
References
[1] Alon N, Spencer J H. The probabilistic method (3rd edition)[M]. New York: Wiley, 2007.
[2] Bollobás B, Guy R K. Equitable and proportional coloring of trees[J]. J. Combinatorial Theory, Series B, 1983, 34(2): 177–186. DOI:10.1016/0095-8956(83)90017-5
[3] Chernoff H. A measure of asymptotic efficiency for tests of a hypothesis based on the sum of observations[J]. Annals of Mathematical Statistics, 1952, 23(2): 493–507.
[4] Hajnal A, Szemerédi E. Proof of a conjecture of P. Erdős[A]. Erdős P, Rényi A, Sós V T, eds. Combinatorial Theory and Its Applications[C]. Vol. Ⅱ, Volume 4 of Colloquia Mathematica Societatis János Bolyai, North-Holland, 1970: 601-623.
[5] Motwani R, Raghavan P. Randomized algorithms[M]. Cambridge: Cambridge University Press, 1995.
[6] Panconesi A, Srinivasan A. Randomized distributed edge colouring via an extension of the Chernoff-Hoeffding bounds[J]. SIAM J. on Computing, 1997, 26(2): 350–368. DOI:10.1137/S0097539793250767
[7] Pemmaraju S V. Equitable coloring extends Chernoff-Hoeffding bounds[A]. Approximation, Randomization, and Combinatorial Optimization (APPROX-RANDOM 2001)[C]. LNCS 2129, 2001: 285-296.
[8] Pemmaraju S V. Equitable colorings, proportional colorings, and Chernoff-Hoeffding bounds[A]. Technical report, TR 01-05, Department of Computer Science, The University of Iowa, 2001.
[9] Schmidt J, Siegel A, Srinivasan A. Chernoff-Hoeffding bounds for applications with limited independence[J]. SIAM J. Discrete Mathematics, 1995, 8(2): 223–250. DOI:10.1137/S089548019223872X
[10] Spencer J. Ten lectures on the probabilistic method[M]. Philadelphia: SIAM, 1987.