See first: Reading Vershynin’s HDP I: Markov, Chernoff, Hoeffding

See next: Reading Vershynin’s HDP III: Subexponential random variables

We ask which class of random variables satisfies a Hoeffding-type bound; for $N=1$ such a bound reads

$${\rm P}[|X| > t] \leq e^{-ct^2}.$$

This inequality says that the tails of $X$ are sub-Gaussian.

Many random variables fall into this category: the normal, for instance, and all bounded random variables.

Some notable properties of the Gaussian

Some properties of the normal are shared by all sub-Gaussian random variables. Let us first look at how $\|X\|_{L^p}$ scales with $p$ for $X\sim\mathcal{N}(0, 1)$.

Exercise 2.5.1. (Moments of the normal distribution) Show that for each $p \geq 1$, the random variable $X\sim\mathcal{N}(0, 1)$ satisfies

$$\|X\|_{L^p} = \sqrt{2} \left(\frac{\Gamma\left(\frac{p+1}{2}\right)}{\Gamma(\frac{1}{2})}\right)^{1/p},$$

and

$$\|X\|_{L^p} = O(\sqrt{p}), \text{ as } p \to \infty.$$

Solution. We have

$$\begin{aligned} \|X\|_{L^p}^p {}={}& {\rm I\!E}[|X|^p] {}={} \int_{-\infty}^{\infty}|x|^p\frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}} {\rm d}x \\ {}={}& 2 \int_{0}^{\infty}x^p\frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}} {\rm d}x \\ {}={}& \frac{2}{\sqrt{2\pi}} \int_{0}^{\infty}x^pe^{-\frac{x^2}{2}} {\rm d}x \\ {}={}& \frac{\sqrt{2}}{\sqrt{\pi}} \int_{0}^{\infty}x^pe^{-\frac{x^2}{2}} {\rm d}x \end{aligned}$$

We now turn our attention to the integral

$$\begin{aligned} I {}={}& \int_{0}^{\infty}x^pe^{-\frac{x^2}{2}} {\rm d}x \\ {}={}& \int_{0}^{\infty} x x^{p-1}e^{-\frac{x^2}{2}} {\rm d}x \end{aligned}$$

We define $y = x^2/2$; then ${\rm d}y = x{\rm d}x$ and $x^{p-1} = (2y)^{(p-1)/2}$. We have

$$\begin{aligned} I {}={}& \int_{0}^{\infty} (2y)^{\frac{p-1}{2}}e^{-y} {\rm d}y \\ {}={}& 2^{\frac{p-1}{2}} \int_{0}^{\infty} y^{\left(\frac{p-1}{2} + 1\right) - 1}e^{-y} {\rm d}y \\ {}={}& 2^{\frac{p-1}{2}} \Gamma\left(\frac{p+1}{2}\right). \end{aligned}$$

As a result, and using the fact that $\sqrt{\pi} = \Gamma(1/2)$,

$$\begin{aligned} \|X\|_{L^p}^{p} = \frac{\sqrt{2}}{\Gamma(\frac{1}{2})} 2^{\frac{p-1}{2}} \Gamma\left(\frac{p+1}{2}\right), \end{aligned}$$

and the assertion follows by raising to the power $1/p$, noting that $\left(\sqrt{2}\, 2^{\frac{p-1}{2}}\right)^{1/p} = \left(2^{p/2}\right)^{1/p} = \sqrt{2}$. The asymptotic result follows by Stirling's approximation formula

$$\begin{aligned}\Gamma(z) \sim \sqrt{2\pi} z^{z-\frac{1}{2}}e^{-z}, \text{ as } z \to \infty.\end{aligned}$$

Indeed, Stirling's formula gives $\Gamma\left(\frac{p+1}{2}\right)^{1/p} \sim \sqrt{p/(2e)}$, so $\|X\|_{L^p} \sim \sqrt{p/e} = O(\sqrt{p})$. This proves the assertion. $\Box$
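Here is a quick numerical sanity check of both claims (a sketch using NumPy/SciPy, not part of [1]): it compares the closed-form expression against Monte Carlo estimates and prints the ratio $\|X\|_{L^p}/\sqrt{p}$, which should remain bounded.

```python
# Sanity check of Exercise 2.5.1: closed form vs Monte Carlo, and the
# O(sqrt(p)) growth of the L^p norms of a standard normal.
import numpy as np
from scipy.special import gamma

rng = np.random.default_rng(0)
samples = rng.standard_normal(10_000_000)

for p in [1, 2, 4, 8, 16]:
    exact = np.sqrt(2) * (gamma((p + 1) / 2) / gamma(0.5)) ** (1 / p)
    mc = np.mean(np.abs(samples) ** p) ** (1 / p)
    print(f"p={p:2d}  exact={exact:.4f}  monte carlo={mc:.4f}  "
          f"exact/sqrt(p)={exact / np.sqrt(p):.4f}")
```

(The Monte Carlo estimates of the higher moments are noisy; the point is only to confirm the formula and the $\sqrt{p}$ scaling.)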

As we will see below, this moment growth property, $\|X\|_{L^p} \lesssim \sqrt{p}$, fully characterises the class of sub-Gaussian random variables.

Exercise 2.5.1/b. (MGF of normal distribution) Show that the moment generating function (MGF) of the random variable $X\sim\mathcal{N}(0, 1)$ is

$$M_X(\lambda) = \exp(\lambda^2/2),$$

for $\lambda \in {\rm I\!R}$.

Solution. This can be seen by direct computation: complete the square in the exponent, $\lambda x - \frac{x^2}{2} = \frac{\lambda^2}{2} - \frac{(x-\lambda)^2}{2}$. By the definition of the mgf,

$$\begin{aligned}M_X(\lambda) = {\rm I\!E}[e^{\lambda X}] = \int_{-\infty}^{\infty}e^{\lambda x}\frac{1}{\sqrt{2\pi}}e^{-x^2/2}{\rm d}x = e^{\lambda^2/2}\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}e^{-(x-\lambda)^2/2}{\rm d}x = \exp(\lambda^2/2),\end{aligned}$$

since the last integral is that of the $\mathcal{N}(\lambda, 1)$ density, which equals one.
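A one-line Monte Carlo check of this formula (a sketch, not from [1]):

```python
# E[exp(lambda X)] for X ~ N(0,1) should match exp(lambda^2 / 2).
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(5_000_000)
for lam in [-2.0, -0.5, 0.5, 2.0]:
    print(f"lambda={lam:+.1f}  MC={np.mean(np.exp(lam * x)):.4f}  "
          f"exact={np.exp(lam**2 / 2):.4f}")
```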

Characterisation of sub-Gaussianity

The following proposition characterises the property of sub-Gaussianity (this is Proposition 2.5.2 in [1]).

Proposition 1 (Sub-Gaussian properties). Let $X$ be a random variable. Then the following properties are equivalent

  1. The tails of $X$ satisfy ${\rm P}[|X| \geq t] \leq 2 \exp(-t^2/K_1^2)$, for all $t\geq 0$, for some constant $K_1 > 0$
  2. The moments of $X$ satisfy $\|X\|_{L^p} \leq K_2 \sqrt{p}$, for all $p\geq 1$, for some constant $K_2$
  3. The mgf of $X^2$ satisfies ${\rm I\!E}[\exp(\lambda^2X^2)] \leq \exp(K_3^2 \lambda^2)$, for all $\lambda$ such that $|\lambda|\leq 1/K_3$
  4. The mgf of $X^2$ is bounded at some point, namely, ${\rm I\!E}[\exp(X^2/K_4^2)] \leq 2$

Moreover, if ${\rm I\!E}[X] = 0$ the above properties are also equivalent to

  5. The mgf of $X$ satisfies ${\rm I\!E}[\exp(\lambda X)] \leq \exp(K_5^2 \lambda^2)$ for all $\lambda \in {\rm I\!R}$

Lastly, there is an absolute constant $C$ such that, for any two of the constants $K_i$ and $K_j$, $i,j=1,\ldots, 5$, we have $K_j \leq C K_i$.

Note. The constants $K_1, \ldots, K_5$ depend on $X$; in fact, they depend on the sub-Gaussian norm of $X$, which we will introduce below.

Exercise 2.5.5. (Regarding Proposition 1, Item 3)

  1. Suppose $X\sim\mathcal{N}(0,1)$. Show that the function $\lambda \mapsto {\rm I\!E}[\exp(\lambda^2 X^2)]$ is finite only in some neighbourhood of zero.
  2. Suppose that some random variable $X$ satisfies ${\rm I\!E}[\exp(\lambda^2 X^2)] \leq \exp(K\lambda^2)$ for all $\lambda \in {\rm I\!R}$ for some constant $K$. Show that $X$ is an essentially bounded random variable.

Solution. 1. For $X\sim\mathcal{N}(0,1)$ we have

$$\begin{aligned}{\rm I\!E}[\exp(\lambda^2 X^2)] {}={}& \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{\lambda^2x^2}e^{-\frac{x^2}{2}}{\rm d}x = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{-\left(\frac{1}{2}-\lambda^2\right)x^2}{\rm d}x = \frac{1}{\sqrt{1-2\lambda^2}},\end{aligned}$$

and the integral converges only for $|\lambda| < 1/\sqrt{2}$.

2. It suffices to determine a value $N$ such that ${\rm P}[|X| \geq N] = 0$. Take $z>0$ and $\lambda \neq 0$. By Markov's inequality and the assumption,

$$\begin{aligned} {\rm P}[|X| \geq z] {}={}& {\rm P}[\exp(\lambda^2 X^2) \geq \exp(\lambda^2 z^2)] \\ {}\leq{}& \exp(-\lambda^2 z^2)\,{\rm I\!E}[\exp(\lambda^2 X^2)] \\ {}\leq{}& \exp(-\lambda^2 z^2)\exp(K\lambda^2) \\ {}={}& \exp[\lambda^2 (K-z^2)]. \end{aligned}$$

Choose $N > \sqrt{K}$. Then,

$$\begin{aligned}{\rm P}[|X| \geq N] \leq \exp(\lambda^2 (\underbrace{K-N^2}_{< 0})),\end{aligned}$$

and take $\lambda \to \infty$ to see that ${\rm P}[|X| \geq N] = 0$. $\Box$
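The closed form in part 1 is easy to probe numerically; the sketch below (assuming nothing beyond part 1 above) also illustrates why a Monte Carlo estimate becomes meaningless once $|\lambda| \geq 1/\sqrt{2}$, where the expectation is infinite.

```python
# Check E[exp(lambda^2 X^2)] = 1/sqrt(1 - 2 lambda^2) for X ~ N(0,1).
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(2_000_000)
for lam in [0.1, 0.3, 0.5, 0.75]:
    mc = np.mean(np.exp(lam**2 * x**2))
    if 2 * lam**2 < 1:
        exact = 1 / np.sqrt(1 - 2 * lam**2)
        print(f"lambda={lam:.2f}  MC={mc:.4f}  exact={exact:.4f}")
    else:
        # the sample mean is finite but drifts upward with the sample
        # size: the true expectation diverges
        print(f"lambda={lam:.2f}  MC={mc:.4f}  exact=inf")
```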

The sub-Gaussian norm

A random variable that satisfies any of the properties of Proposition 1 is called a sub-Gaussian random variable. We define the sub-Gaussian norm of $X$ to be the smallest value of $K_4$ in property 4 and we denote this as

$$\begin{aligned}\|X\|_{\psi_2} = \inf\{t > 0 : {\rm I\!E}\exp(X^2/t^2) \leq 2\}.\end{aligned}$$

More specifically, this is the Orlicz norm of $X$ generated by the function $\psi_2(u) = e^{u^2} - 1$, hence the notation (more on this later).

From the definition, it follows that ${\rm I\!E}[e^{X^2/\|X\|_{\psi_2}^2}] \leq 2$ (the infimum is attained), and if there is a $K>0$ such that ${\rm I\!E}[e^{X^2/K^2}] \leq 2,$ then $K \geq \|X\|_{\psi_2}.$

Note: We could define a norm $\|X\|' = \sup_{p \geq 1} \|X\|_{L^p}/\sqrt{p}$. This would be equal to $\|X\|_{\psi_2}$ times an absolute constant.
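As a worked example, using the closed form ${\rm I\!E}[\exp(X^2/t^2)] = (1-2/t^2)^{-1/2}$ from Exercise 2.5.5 above, the sub-Gaussian norm of $X\sim\mathcal{N}(0,1)$ can be computed by a simple bisection (a sketch, not part of [1]); the exact value is $\sqrt{8/3} \approx 1.633$, since $(1-2/t^2)^{-1/2} = 2$ gives $t^2 = 8/3$.

```python
# ||X||_{psi_2} for X ~ N(0,1) by bisection on t, using
# E[exp(X^2/t^2)] = 1/sqrt(1 - 2/t^2) for t > sqrt(2) (Exercise 2.5.5).
import numpy as np

def mgf_of_square(t: float) -> float:
    return 1 / np.sqrt(1 - 2 / t**2) if t > np.sqrt(2) else np.inf

lo, hi = np.sqrt(2) + 1e-12, 10.0   # mgf_of_square is decreasing in t
while hi - lo > 1e-10:
    mid = 0.5 * (lo + hi)
    if mgf_of_square(mid) > 2.0:
        lo = mid                    # constraint violated: t too small
    else:
        hi = mid                    # constraint satisfied: tighten from above
print(f"||X||_psi2 = {hi:.6f}  (exact: sqrt(8/3) = {np.sqrt(8/3):.6f})")
```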

Exercise 2.5.7. ($\|{}\cdot{}\|_{\psi_2}$ is a norm) Show that $\|{}\cdot{}\|_{\psi_2}$ is indeed a norm on the space of sub-Gaussian random variables.

Solution. It is easy to show nonnegativity and homogeneity, so we will move on to subadditivity. Let $X$ and $Y$ be two sub-Gaussian random variables. We have

$$\begin{aligned} \|X+Y\|_{\psi_2} {}={}& \inf\left\{t>0 {}:{} {\rm I\!E}\exp\frac{(X+Y)^2}{t^2} \leq 2\right\} \\ {}={}& \inf\left\{t=t_1+t_2, t_1, t_2>0 {}:{} {\rm I\!E}\exp\frac{(X+Y)^2}{(t_1 + t_2)^2} \leq 2\right\} \\ {}={}& \inf\left\{t=t_1+t_2, t_1, t_2>0 {}:{} {\rm I\!E}\exp\left(\frac{t_1 \frac{X}{t_1}+t_2 \frac{Y}{t_2}}{t_1 + t_2}\right)^2 \leq 2\right\} \\ {}={}& \inf\left\{t=t_1+t_2, t_1, t_2>0 {}:{} {\rm I\!E}\exp\left(\frac{t_1}{t_1+t_2} \frac{X}{t_1} + \frac{t_2}{t_1+t_2} \frac{Y}{t_2}\right)^2 \leq 2\right\}. \end{aligned}$$

We have managed to conjure up a convex combination. The function $g(s) = \exp(s^2)$ is convex (easy to check). By the convexity of $g$, we have

$$\begin{aligned} {\rm I\!E}\exp\left(\frac{t_1}{t_1+t_2} \frac{X}{t_1} + \frac{t_2}{t_1+t_2} \frac{Y}{t_2}\right)^2 \leq \frac{t_1}{t_1+t_2} {\rm I\!E}\exp (X/t_1)^2 {}+{} \frac{t_2}{t_1+t_2} {\rm I\!E}\exp (Y/t_2)^2, \end{aligned}$$

so,

$$\begin{aligned} \|X+Y\|_{\psi_2} {}\leq{}& \inf \Big\{ t=t_1+t_2, t_1, t_2>0 {}:{} \\ &\qquad \frac{t_1}{t_1+t_2} {\rm I\!E}\exp (X/t_1)^2 {}+{} \frac{t_2}{t_1+t_2} {\rm I\!E}\exp (Y/t_2)^2 \leq 2 \Big\}, \\ {}\leq{}& \inf \Big\{ t=t_1+t_2, t_1, t_2>0 {}:{} \\ &\qquad {\rm I\!E}\exp (X/t_1)^2 \leq 2, \text{ and } {\rm I\!E}\exp (Y/t_2)^2 \leq 2 \Big\} \\ {}={}& \inf \left\{ t_1 > 0 {}:{} {\rm I\!E}\exp (X/t_1)^2 \leq 2 \right\} \\ &\qquad{}+{} \inf \left\{ t_2 > 0 {}:{} {\rm I\!E}\exp (Y/t_2)^2 \leq 2 \right\} \\ {}={}& \|X\|_{\psi_2} + \|Y\|_{\psi_2}. \end{aligned}$$

Lastly, we need to show that if $\|X\|_{\psi_2} = 0$, then $X=0$ almost surely. From the definition of the sub-Gaussian norm and Proposition 1, there is a constant $C$ such that $\|X\|_{L^p} \leq C \|X\|_{\psi_2}\sqrt{p}$, for all $p\geq 1$. This means that if $\|X\|_{\psi_2} = 0$, then $\|X\|_{L^p}=0$, so $X=0$ a.s. $\Box$

Proposition 2 (Properties of $\|{}\cdot{}\|_{\psi_2}$). Let $X$ be a sub-Gaussian random variable with sub-Gaussian norm $\|X\|_{\psi_2}$. Then, there exist constants $C, c > 0$ such that

  1. ${\rm P}[|X| \geq t] \leq 2 \exp(-ct^2/\|X\|_{\psi_2}^2)$, for all $t\geq 0$
  2. $\|X\|_{L^p} \leq C \|X\|_{\psi_2} \sqrt{p}$ for all $p\geq 1$
  3. ${\rm I\!E}\exp(X^2 / \|X\|_{\psi_2}^2) \leq 2$
  4. If ${\rm I\!E}[X] = 0$, then ${\rm I\!E}\exp(\lambda X) \leq \exp(C\lambda^2 \|X\|_{\psi_2}^2)$, for all $\lambda \in {\rm I\!R}$

Regarding the first property, we have that

$$\begin{aligned}{\rm P}[|X| > t] {}={}& {\rm P}[X^2 > t^2] \\ {}={}& {\rm P}\left[\exp\left(\frac{X^2}{\|X\|_{\psi_2}^2}\right) > \exp\left(\frac{t^2}{\|X\|_{\psi_2}^2}\right)\right] \\ {}\leq{}& \exp\left(-\frac{t^2}{\|X\|_{\psi_2}^2}\right) {\rm I\!E}\left[\exp\frac{X^2}{\|X\|_{\psi_2}^2}\right] \\ {}\leq{}& 2 \exp\left(-\frac{t^2}{\|X\|_{\psi_2}^2}\right), \end{aligned}$$

which proves the first property (with $c=1$). Note that in the first inequality we used Markov's inequality, and in the second the defining property of the sub-Gaussian norm.

An alternative expression for the sub-Gaussian norm

An alternative expression for the sub-Gaussian norm can be produced from Proposition 1 and, in particular, the equivalence between items 2 and 4 (the chain below is written with equalities, but they hold up to absolute constants). We have

$$\begin{aligned} \|X\|_{\psi_2} {}={}& \inf \{t>0 {}:{} {\rm I\!E}[\exp(X^2/t^2)] \leq 2\} \\ {}={}& \inf\{t > 0 {}:{} \|X\|_{L^p} \leq Ct\sqrt{p}, \text{ for all } p \geq 1\}, \end{aligned}$$

for some absolute constant $C > 0$. We can now set $s=Ct > 0$, so $t=s/C$, and write

$$\begin{aligned} \|X\|_{\psi_2} {}={}& \frac{1}{C}\inf\{s > 0 {}:{} \|X\|_{L^p} \leq s\sqrt{p}, \text{ for all } p \geq 1\} \\ {}={}& \frac{1}{C}\inf\left\{s > 0 {}:{} \frac{\|X\|_{L^p}}{\sqrt{p}} \leq s, \text{ for all } p \geq 1\right\} \\ {}={}& \frac{1}{C}\sup_{p\geq 1} \frac{\|X\|_{L^p}}{\sqrt{p}}. \end{aligned}$$

Along the same lines, and using item 5, we can show that a zero-mean random variable has the following sub-Gaussian norm

$$\begin{aligned}\|X\|_{\psi_2} {}={}& \frac{1}{C}\inf\{K > 0 {}:{} {\rm I\!E}[\exp(\lambda X)] \leq \exp(K^2\lambda^2), \text{ for all } \lambda \in {\rm I\!R}\} \\ {}={}& \frac{1}{C}\inf\{K > 0 {}:{} \log {\rm I\!E}[\exp(\lambda X)] \leq K^2\lambda^2, \text{ for all } \lambda \in {\rm I\!R} \}.\end{aligned}$$

The natural logarithm of the mgf is known as the cumulant generating function, $\kappa_X$. We can write

$$\begin{aligned}\|X\|_{\psi_2} {}={}& \frac{1}{C}\inf\{K > 0 {}:{} \kappa_X(\lambda) \leq K^2\lambda^2, \text{ for all } \lambda \in {\rm I\!R}\} \\ {}={}& \frac{1}{C} \left(\sup_{\lambda\neq 0}\frac{\kappa_X(\lambda)}{\lambda^2}\right)^{1/2}. \end{aligned}$$

Cool! Let's put these in a box...

Proposition 3. ($\|{}\cdot{}\|_{\psi_2}$ norm) The sub-Gaussian norm of a sub-Gaussian random variable $X$ can be written as

$$\|X\|_{\psi_2} {}={} \frac{1}{C}\sup_{p\geq 1} \frac{\|X\|_{L^p}}{\sqrt{p}},$$

for an absolute constant $C$. If $X$ is a zero-mean variable, then

$$\|X\|_{\psi_2} {}={} \frac{1}{C} \left(\sup_{\lambda\neq 0}\frac{\kappa_X(\lambda)}{\lambda^2}\right)^{1/2}.$$
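For $X\sim\mathcal{N}(0,1)$ the first expression can be evaluated explicitly from Exercise 2.5.1. The following sketch (scanning a grid of $p$, so an illustration rather than a proof) suggests that the supremum of $\|X\|_{L^p}/\sqrt{p}$ sits at $p=1$, with value $\sqrt{2/\pi}\approx 0.80$, and that the ratio decreases towards $1/\sqrt{e}\approx 0.61$.

```python
# sup_{p>=1} ||X||_{L^p} / sqrt(p) for X ~ N(0,1), via the Gamma formula
# of Exercise 2.5.1 (computed with gammaln to avoid overflow).
import numpy as np
from scipy.special import gammaln

def lp_norm(p: float) -> float:
    return np.sqrt(2) * np.exp((gammaln((p + 1) / 2) - gammaln(0.5)) / p)

ps = np.linspace(1.0, 200.0, 2000)
ratios = np.array([lp_norm(p) / np.sqrt(p) for p in ps])
print(f"max ratio on grid: {ratios.max():.4f} at p = {ps[ratios.argmax()]:.1f}")
print(f"limit 1/sqrt(e)  : {1 / np.sqrt(np.e):.4f}")
```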

Maximum of sub-Gaussians

First attempt

We will first show an upper bound on the expectation of the maximum of finitely many sub-Gaussian random variables which is of the form $O(\log N)$.

Suppose that $X_1, \ldots, X_N$ are sub-Gaussian random variables. In this section we shall study $\max_i |X_i|$ and try to establish upper bounds. To that end, we will work with the mgfs of the $X_i$.

Firstly, we observe that

$$\begin{aligned}\exp(t {\rm I\!E}\max_{i=1,\ldots, N}|X_i|) {}={}& \exp ({\rm I\!E}[t \max_{i=1,\ldots, N}|X_i|]) \\ {}\overset{\text{Jensen}}{\leq}{}& {\rm I\!E} \exp(t \max_{i=1,\ldots, N}|X_i|) \\ {}={}& {\rm I\!E} \max_{i=1,\ldots, N} \exp(t|X_i|) \\ {}\leq{}& {\rm I\!E} \max_{i=1,\ldots, N} (e^{tX_i} + e^{-tX_i}) \\ {}\leq{}& {\rm I\!E} \sum_{i=1}^{N} (e^{tX_i} + e^{-tX_i}) \\ {}={}& \sum_{i=1}^{N} ({\rm I\!E}e^{tX_i} + {\rm I\!E}e^{-tX_i}). \end{aligned}$$

Now if we assume that ${\rm I\!E}[X_i] = 0$ for all $i$, then Proposition 1, item 5, gives

$$\begin{aligned}{\rm I\!E}e^{tX_i} \leq \exp(K_5^2 t^2),\end{aligned}$$

for all $t\in{\rm I\!R}$, where $K_5$ may be taken as the largest such constant over $i=1,\ldots,N$; the same bound holds for ${\rm I\!E}e^{-tX_i}$, since $-X_i$ is also zero-mean and sub-Gaussian. From the above inequality,

$$\begin{aligned}\exp(t {\rm I\!E}\max_{i=1,\ldots, N}|X_i|) \leq 2 \exp(K_5^2 t^2)N,\end{aligned}$$

therefore,

$$\begin{aligned}{\rm I\!E}\max_{i=1,\ldots, N}|X_i| \leq \frac{1}{t}\log(2N) + K_5^2 t,\end{aligned}$$

for all $t>0$. By taking $t = \log(2N)/K_5^2$, we obtain,

$$\begin{aligned}{\rm I\!E}\max_{i=1,\ldots, N}|X_i| \leq K_5^2 + \log(2N).\end{aligned}$$
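The sketch below (assuming i.i.d. standard normals, for which ${\rm I\!E}e^{tX} = e^{t^2/2}$, i.e. $K_5^2 = 1/2$) compares the expected maximum with this bound; the bound holds, but, as the next section shows, it overshoots by roughly a $\sqrt{\log N}$ factor.

```python
# Monte Carlo: E max_i |X_i| vs the bound K5^2 + log(2N) for N(0,1).
import numpy as np

rng = np.random.default_rng(3)
K5_sq = 0.5                          # E[exp(tX)] = exp(t^2/2) for N(0,1)
for N in [10, 100, 1000, 10000]:
    maxima = np.abs(rng.standard_normal((500, N))).max(axis=1)
    print(f"N={N:5d}  E max|X_i| ~ {maxima.mean():.3f}  "
          f"bound = {K5_sq + np.log(2 * N):.3f}")
```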

Vershynin’s result

We will show the following result, which is Exercise 2.5.10 in [1].

Exercise 2.5.10. Let $X_1, X_2, \ldots, X_N$ be a sequence of sub-Gaussian random variables, not necessarily independent, with $K_{\max} = \max_i \|X_i\|_{\psi_2}$. Show that

$${\rm I\!E}\max_{i} \frac{|X_i|}{\sqrt{1+\log i}} \leq C K_{\max}.$$

Then, deduce that for $N\geq 2$,

$${\rm I\!E}\max_{i=1,\ldots,N} |X_i|\leq C K_{\max} \sqrt{\log N}.$$

Solution. For notational convenience let us define $Y_i = |X_i| / \sqrt{1 + \log i}$; note that $Y_i \geq 0$, which justifies the integral identity below. The key idea is to express the expectation in terms of tail probabilities. We have

$$\begin{aligned} {\rm I\!E}\left[\max_{i=1,\ldots, N}Y_i\right] {}={}& \int_0^\infty {\rm P}\left[\max_{i=1,\ldots, N}Y_i \geq t\right] {\rm d}t \\ {}\leq{}& \int_0^\infty \sum_{i=1}^{N}{\rm P}\left[Y_i \geq t\right] {\rm d}t \\ {}={}& \sum_{i=1}^{N} \int_0^\infty {\rm P}\left[Y_i \geq t\right] {\rm d}t \\ {}={}& \sum_{i=1}^{N} \int_0^\infty {\rm P}\left[|X_i| \geq t\sqrt{1+\log i}\right] {\rm d}t \end{aligned}$$

Now define $u = t\sqrt{1+\log i}$, so

$$\begin{aligned} {\rm I\!E}\left[\max_{i=1,\ldots, N}Y_i\right] {}\leq{}& \sum_{i=1}^{N} \frac{1}{\sqrt{1+\log i}} \int_0^\infty {\rm P}\left[|X_i| \geq u\right] {\rm d}u \\ {}\leq{}& \sum_{i=1}^{N} \frac{1}{\sqrt{1+\log i}} \int_0^\infty 2\exp(-cu^2/\|X_i\|_{\psi_2}^2) {\rm d}u \\ {}\leq{}& \sum_{i=1}^{N} \frac{1}{\sqrt{1+\log i}} \int_0^\infty 2\exp(-cu^2/K_{\max}^2) {\rm d}u. \end{aligned}$$

As it stands, the last bound still grows with $N$, because the series $\sum_{i=1}^{N} 1/\sqrt{1+\log i}$ diverges; this crude union bound over all $t \geq 0$ is not enough on its own. The proof is concluded with a standard trick: bound ${\rm P}[\max_i Y_i \geq t]$ by $1$ for $t$ below a suitable threshold of order $K_{\max}$, and apply the union bound only above it, where the terms $\exp(-ct^2(1+\log i)/K_{\max}^2)$ decay fast enough in $i$ to be summable.
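A quick simulation (again with i.i.d. standard normals as a stand-in; the exercise itself requires no independence) suggests that $\sqrt{\log N}$ is indeed the right rate:

```python
# E max_i |X_i| / sqrt(log N) should remain bounded as N grows.
import numpy as np

rng = np.random.default_rng(4)
for N in [10, 100, 1000, 10000]:
    maxima = np.abs(rng.standard_normal((500, N))).max(axis=1)
    print(f"N={N:5d}  E max|X_i| / sqrt(log N) ~ "
          f"{maxima.mean() / np.sqrt(np.log(N)):.3f}")
```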

Sum of sub-Gaussians

Here we will show that the sum of independent sub-Gaussians is sub-Gaussian too.

Proposition 4. (Sum of sub-Gaussians) Let $X_1, \ldots, X_N$ be independent, zero-mean, sub-Gaussian random variables. Then,

$$\left\|\sum_{i=1}^N X_i\right\|_{\psi_2}^2 {}\leq{} C \sum_{i=1}^{N} \|X_i\|_{\psi_2}^2$$

where $C$ is an absolute constant.

Proof. By independence, the mgf of the sum factorises:

$$\begin{aligned} {\rm I\!E}\left[\exp\left(\lambda \sum_{i=1}^{N}X_i\right)\right] {}={}& \prod_{i=1}^{N} {\rm I\!E}[\exp(\lambda X_i)] \\ {}\leq{}& \prod_{i=1}^N \exp (C_0\lambda^2 \|X_i\|_{\psi_2}^2) \\ {}={}& \exp\left(\lambda^2 C_0 \sum_{i=1}^N \|X_i\|_{\psi_2}^2\right). \end{aligned}$$

Given that $\sum_i X_i$ is a zero-mean random variable, it follows from Proposition 1, item 5, that it is sub-Gaussian with $K_5^2 = C_0 \sum_{i=1}^N \|X_i\|_{\psi_2}^2$. Moreover, there is a constant $\bar{C}$ such that $\left\|\sum_i X_i\right\|_{\psi_2} \leq \bar{C}K_5$, so $\left\|\sum_i X_i\right\|_{\psi_2}^2 \leq \bar{C}^2 C_0 \sum_{i=1}^N \|X_i\|_{\psi_2}^2$. This completes the proof. $\Box$
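As an illustration (the assumption of this sketch: the $X_i$ are i.i.d. Rademacher signs, each with $\|X_i\|_{\psi_2} = 1/\sqrt{\log 2}$), we can compute $\|S_N\|_{\psi_2}$ for $S_N = X_1 + \cdots + X_N$ exactly, since $S_N$ has a shifted binomial distribution, and check that $\|S_N\|_{\psi_2}^2$ grows at most linearly in $N$, as Proposition 4 predicts.

```python
# ||S_N||_{psi_2}^2 / N for a sum of N i.i.d. Rademacher variables,
# computed exactly from the binomial pmf plus a bisection on t.
import numpy as np
from scipy.stats import binom

def psi2_norm_of_sum(N: int) -> float:
    k = np.arange(N + 1)
    probs = binom.pmf(k, N, 0.5)
    values = (N - 2 * k).astype(float)            # support of S_N
    def mgf_of_square(t: float) -> float:
        expo = np.minimum(values**2 / t**2, 700.0)  # guard against overflow
        return float(np.sum(probs * np.exp(expo)))
    lo, hi = 1e-6, 100.0 * np.sqrt(N)
    while hi - lo > 1e-8 * np.sqrt(N):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if mgf_of_square(mid) > 2.0 else (lo, mid)
    return hi

for N in [1, 4, 16, 64, 256]:
    print(f"N={N:3d}  ||S_N||_psi2^2 / N = {psi2_norm_of_sum(N)**2 / N:.4f}")
```

The printed ratio stays bounded (numerically it approaches $8/3$, the value for a standard normal), consistent with Proposition 4.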


Non-zero mean random variables

If $X$ is sub-Gaussian, then $X - {\rm I\!E}[X]$ is also sub-Gaussian. The following result gives its sub-Gaussian norm.

Proposition 5. (Centering) If $X$ is a sub-Gaussian random variable, then

$$\|X - {\rm I\!E}[X]\|_{\psi_2} \lesssim \|X\|_{\psi_2}.$$

Note: The notation $\|X - {\rm I\!E}[X]\|_{\psi_2} \lesssim \|X\|_{\psi_2}$ means that there is an absolute constant $C$ such that $\|X - {\rm I\!E}[X]\|_{\psi_2} \leq C \|X\|_{\psi_2}.$

Proof. We have

$$\begin{aligned} \|X - {\rm I\!E}[X]\|_{\psi_2} \leq \|X\|_{\psi_2} + \|{\rm I\!E}[X]\|_{\psi_2}, \end{aligned}$$

where the second term is the $\psi_2$-norm of a constant. Using the fact that $\|X\|_{\psi_2} \leq C \|X\|_{\infty}$, where $C = 1/\sqrt{\log 2}$, we have $\|{\rm I\!E}[X]\|_{\psi_2} \leq C |{\rm I\!E}[X]|$. As a result

$$\begin{aligned} \|{\rm I\!E}[X]\|_{\psi_2} \leq C |{\rm I\!E}[X]| \leq C {\rm I\!E}[|X|] = C \|X\|_{L^1}. \end{aligned}$$

We also have $\|X\|_{L^1} \lesssim \|X\|_{\psi_2}$ (Proposition 2, item 2, with $p=1$), so $\|{\rm I\!E}[X]\|_{\psi_2} \lesssim \|X\|_{L^1} \lesssim \|X\|_{\psi_2}$ and the result follows. $\Box$
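For a concrete, fully explicit instance (assumption: $X\sim\text{Bernoulli}(1/2)$), both norms admit closed forms, and the centred norm happens to be smaller here:

```python
# ||X||_psi2 vs ||X - E X||_psi2 for X ~ Bernoulli(1/2).
# E[exp(X^2/t^2)] = (1 + e^{1/t^2})/2 <= 2  gives  ||X||_psi2 = 1/sqrt(log 3);
# |X - 1/2| = 1/2 a.s.                      gives  1/(2 sqrt(log 2)).
import numpy as np

norm_x = 1 / np.sqrt(np.log(3))
norm_centered = 0.5 / np.sqrt(np.log(2))
print(f"||X||_psi2        = {norm_x:.4f}")
print(f"||X - E X||_psi2  = {norm_centered:.4f}")
print(f"ratio             = {norm_centered / norm_x:.4f}")
```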


Hoeffding’s inequality

Let us state Hoeffding's inequality for sub-Gaussian random variables.

Theorem 6. (Hoeffding's inequality for sub-Gaussian random variables) Let $X_1,\ldots, X_N$ be independent, zero-mean, sub-Gaussian random variables and let $a\in{\rm I\!R}^N$. Then, for every $t\geq 0$,

$${\rm P}\left[\left| \sum_{i=1}^{N}a_i X_i \right| \geq t\right] \leq 2 \exp\left(-\frac{ct^2}{K^2 \|a\|_2^2}\right),$$

where $K=\max_i \|X_i\|_{\psi_2}$.
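The theorem leaves the constant $c$ unspecified. For the special case of Rademacher $X_i$, the classical Hoeffding inequality gives the explicit bound $2\exp(-t^2/(2\|a\|_2^2))$, which the following sketch (with a hypothetical coefficient vector $a$) compares against empirical tail frequencies:

```python
# Empirical tails of |sum_i a_i X_i| for Rademacher X_i vs the classical
# Hoeffding bound 2 exp(-t^2 / (2 ||a||_2^2)).
import numpy as np

rng = np.random.default_rng(5)
a = np.array([1.0, 2.0, 3.0, 4.0])    # hypothetical coefficients
signs = rng.choice([-1.0, 1.0], size=(1_000_000, a.size))
sums = np.abs(signs @ a)
norm_a_sq = float(np.sum(a**2))
for t in [2.0, 5.0, 8.0]:
    empirical = np.mean(sums >= t)
    bound = 2 * np.exp(-t**2 / (2 * norm_a_sq))
    print(f"t={t:3.1f}  P[|sum| >= t] ~ {empirical:.5f}  bound = {bound:.5f}")
```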

References

  1. R. Vershynin, High-Dimensional Probability: An Introduction with Applications in Data Science, Cambridge University Press, 2018