Least Squares Closing Thoughts - Discrete

Thomas J. Kennedy

We now have a few more tools available to us. It is now possible to skip the initial $X^T X$ and $X^T Y$ matrix multiplications and build the normal equations directly from sums.

1 Another Example

Suppose we wanted to compute a discrete approximation for $f(x) = 2x^2$ where $x \ge 0$ using four points.

| $x$ | $f(x)$ |
|----:|-------:|
|   0 |      0 |
|   1 |      2 |
|   2 |      8 |
|   3 |     18 |
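As a quick sanity check, the table above can be reproduced in a couple of lines of Python (the names `xs` and `ys` are our own):

```python
# Sample f(x) = 2x^2 at the four points x = 0, 1, 2, 3.
xs = [0, 1, 2, 3]
ys = [2 * x**2 for x in xs]
print(ys)  # [0, 2, 8, 18]
```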

We will stick with a line. This leads to two basis functions:

$\pi_0 = 1$ and $\pi_1 = x$.

Using these two basis functions our approximation function takes the form:

\[\begin{align} \hat{\varphi} &= \sum_{i=0}^{1} c_i \pi_i \\ &= \sum_{i=0}^{1} c_i x^i \\ &= c_0 + c_1 x \end{align} \]

We could use the $[X^TX|X^TY]$ method… Let us instead build the system $A\vec{c} = \vec{b}$ and its augmented matrix $[A|\vec{b}]$ directly.

1.1 The Setup

Our $A$ matrix is defined as

$$ \left[\begin{array}{rr} \sum\limits_{i=0}^3 \pi_{0}(x_i) \pi_{0}(x_i) & \sum\limits_{i=0}^3 \pi_{0}(x_i) \pi_{1}(x_i) \\ \sum\limits_{i=0}^3 \pi_{1}(x_i) \pi_{0}(x_i) & \sum\limits_{i=0}^3 \pi_{1}(x_i) \pi_{1}(x_i) \\ \end{array}\right] $$

Since $\pi_0 = 1$ this simplifies to

$$ \left[\begin{array}{rr} \sum\limits_{i=0}^3 1 & \sum\limits_{i=0}^3 \pi_{1}(x_i) \\ \sum\limits_{i=0}^3 \pi_{1}(x_i) & \sum\limits_{i=0}^3 \pi_{1}(x_i) \pi_{1}(x_i) \\ \end{array}\right] $$

And… if we apply $\pi_1 = x$ we end up with

$$ \left[\begin{array}{rr} \sum\limits_{i=0}^3 1 & \sum\limits_{i=0}^3 x_i \\ \sum\limits_{i=0}^3 x_i & \sum\limits_{i=0}^3 (x_i)^2 \\ \end{array}\right] $$

Well… $\sum\limits_{i=0}^3 1 = 4$. Hooray! That is one small sum eliminated.

$$ \left[\begin{array}{rr} 4 & \sum\limits_{i=0}^3 x_i \\ \sum\limits_{i=0}^3 x_i & \sum\limits_{i=0}^3 (x_i)^2 \\ \end{array}\right] $$

If we plug in all the $x_i$ values and compute the sums, we have

$$ \left[\begin{array}{rr} 4 & 6 \\ 6 & 14 \\ \end{array}\right] $$
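These three sums are easy to verify in code. The sketch below builds $A$ entry by entry, exactly as the derivation does, with no $X^T X$ product (the variable names are our own):

```python
xs = [0, 1, 2, 3]

s1 = len(xs)                  # sum of pi_0(x_i) * pi_0(x_i), i.e. sum of 1
sx = sum(xs)                  # sum of pi_0(x_i) * pi_1(x_i), i.e. sum of x_i
sxx = sum(x * x for x in xs)  # sum of pi_1(x_i) * pi_1(x_i), i.e. sum of x_i^2

A = [[s1, sx],
     [sx, sxx]]
print(A)  # [[4, 6], [6, 14]]
```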

That leaves $\vec{b}$, which is defined as

$$ \left[\begin{array}{r} \sum\limits_{i=0}^3 \pi_{0}(x_i) f(x_i) \\ \sum\limits_{i=0}^3 \pi_{1}(x_i) f(x_i) \\ \end{array}\right] $$

which becomes

$$ \left[\begin{array}{r} \sum\limits_{i=0}^3 f(x_i) \\ \sum\limits_{i=0}^3 x_i \, f(x_i) \\ \end{array}\right] $$

Once we plug in our input points, we end up with

$$ \left[\begin{array}{r} 28 \\ 72 \\ \end{array}\right] $$
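The entries of $\vec{b}$ can be verified the same way (again, the variable names are our own):

```python
xs = [0, 1, 2, 3]
ys = [2 * x**2 for x in xs]              # f(x_i) values: 0, 2, 8, 18

b0 = sum(ys)                             # sum of f(x_i)
b1 = sum(x * y for x, y in zip(xs, ys))  # sum of x_i * f(x_i)
print([b0, b1])  # [28, 72]
```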

That allows us to construct an augmented matrix.

$$ \left[\begin{array}{rr|r} 4 & 6 & 28 \\ 6 & 14 & 72 \\ \end{array}\right] $$

1.2 Solving the System

Let us continue using Gaussian Elimination.

$$ \left[\begin{array}{rr|r} 4 & 6 & 28 \\ 6 & 14 & 72 \\ \end{array}\right] $$

Our first step is to scale both rows: $r_0 = \frac{1}{4}r_0$ and $r_1 = \frac{1}{6}r_1$.

$$ \left[\begin{array}{rr|r} 1 & \frac{3}{2} & 7 \\ 1 & \frac{7}{3} & 12 \\ \end{array}\right] $$

Next, we eliminate the leading entry of row 1 with $r_1 = r_1 - r_0$.

$$ \left[\begin{array}{rr|r} 1 & \frac{3}{2} & 7 \\ 0 & \frac{5}{6} & 5 \\ \end{array}\right] $$

Our next step is to scale row 1: $r_1 = \frac{6}{5}r_1$.

$$ \left[\begin{array}{rr|r} 1 & \frac{3}{2} & 7 \\ 0 & 1 & 6 \\ \end{array}\right] $$

The final step is to backsolve using $r_0 = r_0 - \frac{3}{2}r_1$.

$$ \left[\begin{array}{rr|r} 1 & 0 & -2 \\ 0 & 1 & 6 \\ \end{array}\right] $$
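The elimination steps above can be replayed exactly in Python using `fractions.Fraction`, which keeps the arithmetic exact (a sketch; the matrix name `M` is our own):

```python
from fractions import Fraction

# Augmented matrix [A | b] with exact rational arithmetic.
M = [[Fraction(4), Fraction(6), Fraction(28)],
     [Fraction(6), Fraction(14), Fraction(72)]]

# Scale both rows: r0 = (1/4) r0 and r1 = (1/6) r1.
M[0] = [v / 4 for v in M[0]]
M[1] = [v / 6 for v in M[1]]

# Eliminate: r1 = r1 - r0, then rescale r1 = (6/5) r1.
M[1] = [b - a for a, b in zip(M[0], M[1])]
M[1] = [Fraction(6, 5) * v for v in M[1]]

# Backsolve: r0 = r0 - (3/2) r1.
M[0] = [a - Fraction(3, 2) * b for a, b in zip(M[0], M[1])]

print(M[0][2], M[1][2])  # -2 6
```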


1.3 Final Result

Coefficients

$$c_0 = -2$$ $$c_1 = 6$$

Approximation Function ($\hat{\varphi}$)

\[\begin{align} \hat{\varphi} &= -2 \, x^0 + 6 \, x^1 \\ &= -2 + 6x \end{align} \]
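To close the loop, a short sketch (the function names are our own) compares the fitted line against $f$ at the four sample points:

```python
# Compare f(x) = 2x^2 with the least-squares line phi_hat(x) = -2 + 6x
# at the four sample points.
def f(x):
    return 2 * x**2

def phi_hat(x):
    return -2 + 6 * x

residuals = [f(x) - phi_hat(x) for x in [0, 1, 2, 3]]
print(residuals)       # [2, -2, -2, 2]
print(sum(residuals))  # 0
```

The residuals sum to zero, which is expected whenever the constant function $\pi_0 = 1$ is one of the basis functions.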