The $L^p$ norm is formally defined as

$$\norm{x}_p = \left( \sum_i \vert x_i \vert ^p \right) ^ \frac{1}{p}$$

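To make the definition concrete, here is a minimal NumPy sketch (the function name `lp_norm` and the sample vector are my own, for illustration only):

```python
import numpy as np

def lp_norm(x, p):
    # (sum_i |x_i|^p)^(1/p), directly from the definition above
    return np.sum(np.abs(x) ** p) ** (1.0 / p)

x = np.array([3.0, -4.0, 1.0])
print(lp_norm(x, 2))         # Euclidean length of x
print(np.linalg.norm(x, 2))  # agrees with numpy's built-in norm
```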
The $L^p$ norm has several special cases that arise often in linear algebra, numerical analysis, and machine learning.

• The $L^1$ norm, commonly called the “taxicab” norm:

$$\norm{x}_1 = \sum_i \vert x_i \vert$$

• The $L^2$ norm, commonly called the “Euclidean” norm:

$$\norm{x}_2 = \sqrt{\sum_i \vert x_i \vert ^2}$$

• The $L^\infty$ norm, commonly called the “max” norm:

$$\norm{x}_\infty = \max_i \vert x_i \vert$$

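As a quick check that the three special cases agree with a standard library, here is a NumPy sketch (the sample vector is arbitrary):

```python
import numpy as np

x = np.array([3.0, -4.0, 1.0])

l1 = np.sum(np.abs(x))        # taxicab norm: sum of absolute values
l2 = np.sqrt(np.sum(x ** 2))  # Euclidean norm
linf = np.max(np.abs(x))      # max norm: largest absolute entry

# Each agrees with numpy's built-in norm for the corresponding order.
print(l1, np.linalg.norm(x, 1))
print(l2, np.linalg.norm(x, 2))
print(linf, np.linalg.norm(x, np.inf))
```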
It can be shown that this definition of the $L^\infty$ norm is equivalent to taking the limit as $p \to \infty$ of the $L^p$ norm:

$$\norm{x}_\infty = \lim_{p \to \infty} \norm{x}_p.$$

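The limit can also be observed numerically. In this sketch (vector chosen arbitrarily), $\norm{x}_p$ approaches $\max_i \vert x_i \vert = 4$ as $p$ grows:

```python
import numpy as np

x = np.array([3.0, -4.0, 1.0])
linf = np.max(np.abs(x))  # 4.0

for p in [1, 2, 4, 8, 16, 32, 64]:
    lp = np.sum(np.abs(x) ** p) ** (1.0 / p)
    print(f"p = {p:2d}: {lp:.6f}")
# The printed values decrease toward linf = 4.0.
```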
Proof. Let $\displaystyle\norm{x}_\infty = \max_i \vert x_i \vert$ and $\displaystyle\norm{x}_p = \left( \sum_i \vert x_i \vert ^p \right) ^ \frac{1}{p}$. We wish to show that $\displaystyle\lim_{p \to \infty} \norm{x}_p = \norm{x}_\infty$. Assume $x \neq 0$; otherwise both sides are $0$ and the claim is immediate.

We have that

$$\norm{x}_p = \left( \sum_i \vert x_i \vert ^p \right) ^ \frac{1}{p} = \norm{x}_\infty \left( \sum_i \left( \frac{\vert x_i \vert}{\norm{x}_\infty} \right) ^p \right) ^ \frac{1}{p} \leq \norm{x}_\infty \left( \sum_i 1 \right) ^ \frac{1}{p} = \norm{x}_\infty \, n ^ \frac{1}{p}.$$

We arrive at the last inequality because $\displaystyle\left(\frac{\vert x_i \vert}{\norm{x}_\infty}\right)^p \leq 1$ for every $i$ (because $\norm{x}_\infty \geq \vert x_i \vert$ for each $i$). Moreover, the ratio equals $1$ at any index attaining the maximum, so the sum inside the parentheses is at least $1$, giving $\norm{x}_p \geq \norm{x}_\infty$. Thus we have

$$\norm{x}_\infty \leq \norm{x}_p \leq \norm{x}_\infty \, n ^ \frac{1}{p},$$

so, taking a limit as $p \to \infty$, we have

$$\norm{x}_\infty \leq \lim_{p \to \infty} \norm{x}_p \leq \norm{x}_\infty \lim_{p \to \infty} n ^ \frac{1}{p} = \norm{x}_\infty.$$

In other words, we have $\lim_{p \to \infty} \norm{x}_p$ sandwiched between $\norm{x}_\infty$ and $\norm{x}_\infty$, implying equality. Therefore $\displaystyle\lim_{p \to \infty} \norm{x}_p = \norm{x}_\infty = \max_i \vert x_i \vert$.
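The two-sided bound used in the proof, $\norm{x}_\infty \leq \norm{x}_p \leq \norm{x}_\infty \, n^{1/p}$, can be spot-checked numerically (a sketch with a random vector; not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=10)
n = x.size
linf = np.max(np.abs(x))

for p in [1, 2, 5, 10, 50]:
    lp = np.sum(np.abs(x) ** p) ** (1.0 / p)
    # Sandwich bound from the proof (tiny tolerance for floating point).
    assert linf <= lp + 1e-12
    assert lp <= linf * n ** (1.0 / p) + 1e-12
print("bounds verified")
```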

Consider also this proof:

Proof. Let $\displaystyle\norm{x}_p = \left( \sum_i \vert x_i \vert ^p \right) ^ \frac{1}{p}$. Observe that for all $x$ we have that

$$\norm{x}_\infty = \left( \max_i \vert x_i \vert ^p \right) ^ \frac{1}{p} \leq \left( \sum_i \vert x_i \vert ^p \right) ^ \frac{1}{p} = \norm{x}_p$$

and that

$$\norm{x}_p = \left( \sum_i \vert x_i \vert ^p \right) ^ \frac{1}{p} \leq \left( n \max_i \vert x_i \vert ^p \right) ^ \frac{1}{p} = n ^ \frac{1}{p} \norm{x}_\infty.$$

Thus,

$$\norm{x}_\infty \leq \norm{x}_p \leq n ^ \frac{1}{p} \norm{x}_\infty,$$

but then,

$$\lim_{p \to \infty} \norm{x}_\infty \leq \lim_{p \to \infty} \norm{x}_p \leq \lim_{p \to \infty} n ^ \frac{1}{p} \norm{x}_\infty,$$

Thus

$$\norm{x}_\infty \leq \lim_{p \to \infty} \norm{x}_p \leq \norm{x}_\infty \lim_{p \to \infty} n ^ \frac{1}{p};$$

however, $\displaystyle\lim_{p \to \infty} n^\frac{1}{p} = 1$, so

$$\norm{x}_\infty \leq \lim_{p \to \infty} \norm{x}_p \leq \norm{x}_\infty,$$

that is, $\displaystyle\norm{x}_\infty = \lim_{p \to \infty} \norm{x}_p = \max_i \vert x_i \vert$.
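The key limit invoked above, $\lim_{p \to \infty} n^\frac{1}{p} = 1$ for fixed $n$, is easy to see numerically (a small sketch with $n = 10$, chosen arbitrarily):

```python
n = 10
for p in [1, 10, 100, 1000, 10000]:
    # n ** (1/p) = e^(ln(n)/p), and ln(n)/p -> 0 as p grows
    print(p, n ** (1.0 / p))
```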