Since solving a system of linear equations is a basic skill that will be used for
interpolation and approximation, we will briefly discuss a commonly used
technique here. In fact, what we will be using is a slightly more general form.
Suppose we have an *n*×*n* coefficient matrix **A**, an
*n*×*h* "constant" term matrix **B**, and an
*n*×*h* unknown matrix **X**, and suppose further that they
satisfy the relation **B** = **A**^{.}**X**.

If **A** and **B** are known, we need a fast method to solve for **X**.
One may suggest the following: compute the inverse matrix **A**^{-1}
of **A**; then the solution is simply
**X** = **A**^{-1}**B**. While this is a correct way to solve
the problem, it is a little overkill. One might also observe the following fact:
column *j* of matrix **B** is the product of matrix **A** and
column *j* of matrix **X**.

With this in mind, we can solve column *j* of **X** using **A** and
column *j* of **B**. In this way, the given problem is equivalent to
solving *h* systems of linear equations. However, this is not a good
strategy either, because a typical in-place linear system solver destroys
matrix **A** while solving for column 1 of **X**; as a result, we would
need to make a fresh copy of matrix **A** before solving for each column.

So, we need a method that does not use matrix inversion and does not have to copy
matrix **A** over and over. A possible way is the use of the
LU decomposition technique.

An efficient procedure for solving
**B** = **A**^{.}**X** is LU-decomposition. While other
methods such as Gaussian elimination and the Cholesky method can do the job
well, the LU-decomposition method can help accelerate the computation.

The LU-decomposition method first "decomposes" matrix **A** into
**A** = **L**^{.}**U**, where **L** and **U** are
lower triangular and upper triangular matrices, respectively. More precisely,
if **A** is an *n*×*n* matrix, **L** and **U**
are also *n*×*n* matrices.

The lower triangular matrix **L** has zeros in all
entries *above* its diagonal and the upper triangular matrix
**U** has zeros in all entries *below* its diagonal. If the
LU-decomposition of **A** = **L**^{.}**U** is found,
the original equation becomes
**B** = (**L**^{.}**U**)^{.}**X**. This
equation can be rewritten as
**B** = **L**^{.}(**U**^{.}**X**).
Since **L** and **B** are known, we can solve
**B** = **L**^{.}**Y** for the intermediate matrix
**Y** = **U**^{.}**X**. Then, since **U** and
**Y** are known, solving **Y** = **U**^{.}**X**
for **X** yields the desired result.
In this way, the original problem of solving for **X** from
**B** = **A**^{.}**X** is decomposed into two steps:

- Solving for **Y** from **B** = **L**^{.}**Y**
- Solving for **X** from **Y** = **U**^{.}**X**
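As a quick concrete check of this two-step scheme, here is a tiny Python sketch (the 2×2 matrices and numbers are invented purely for illustration): we pick a lower triangular **L** and an upper triangular **U**, form **A** = **L**^{.}**U**, and recover one column of **X** by the two triangular solves.

```python
# Tiny 2x2 illustration of the two-step scheme (numbers are made up).
L = [[2.0, 0.0],
     [1.0, 3.0]]          # lower triangular
U = [[1.0, 4.0],
     [0.0, 5.0]]          # upper triangular
b = [2.0, 16.0]           # one column of B

# Step 1: forward substitution solves L.y = b
y1 = b[0] / L[0][0]                      # y1 = 2/2 = 1
y2 = (b[1] - L[1][0] * y1) / L[1][1]     # y2 = (16 - 1)/3 = 5

# Step 2: backward substitution solves U.x = y
x2 = y2 / U[1][1]                        # x2 = 5/5 = 1
x1 = (y1 - U[0][1] * x2) / U[0][0]       # x1 = (1 - 4)/1 = -3

# Check against A = L.U: multiplying A by x should reproduce b
A = [[sum(L[i][k] * U[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]
r = [A[i][0] * x1 + A[i][1] * x2 for i in range(2)]
print(x1, x2, r)   # -3.0 1.0 [2.0, 16.0]
```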

How easy are these two steps? It turns out to be *very easy*. Consider
the first step. Since column *j* of matrix **B** is the product of
matrix **L** and column *j* of matrix **Y**, we can solve one column
of **Y** at a time. Writing out **B** = **L**^{.}**Y** for a single
column (dropping the column index *j* for brevity) gives the following
system of equations:

*l*_{11}*y*_{1} = *b*_{1}
*l*_{21}*y*_{1} + *l*_{22}*y*_{2} = *b*_{2}
*l*_{31}*y*_{1} + *l*_{32}*y*_{2} + *l*_{33}*y*_{3} = *b*_{3}
......
*l*_{n1}*y*_{1} + *l*_{n2}*y*_{2} + ... + *l*_{nn}*y*_{n} = *b*_{n}
From the above equations, we see that
*y*_{1} = *b*_{1}/*l*_{11}.
Once we have *y*_{1} available, the second equation yields
*y*_{2} = (*b*_{2}-*l*_{21}*y*_{1})/*l*_{22}.
With *y*_{1} and *y*_{2} in hand, equation 3 gives
*y*_{3} = (*b*_{3} -
(*l*_{31}*y*_{1}
+ *l*_{32}*y*_{2}))/*l*_{33}.
Thus, we compute *y*_{1} from the first equation and substitute it
into the second to compute *y*_{2}. Once *y*_{1}
and *y*_{2} are available, they are substituted into the third
equation to solve for *y*_{3}. Repeating this process,
when we reach equation *i*, we will have *y*_{1},
*y*_{2}, ..., *y*_{i-1} available. Then,
they are substituted into equation *i* to solve for *y*_{i}
using the following formula:

*y*_{i} = (*b*_{i} - (*l*_{i,1}*y*_{1} + *l*_{i,2}*y*_{2} + ... + *l*_{i,i-1}*y*_{i-1}))/*l*_{i,i}

Because the values of the *y*_{i}'s are substituted to solve
for the next value of *y*, this process is referred to as
*forward substitution*. We can repeat the forward substitution process
for each column of **Y** and its corresponding column of **B**. The
result is the solution **Y**. The following is an algorithm:

Input: Matrix B_{n×h} and a lower triangular matrix L_{n×n}

Output: Matrix Y_{n×h} such that B = L^{.}Y holds.

Algorithm:

/* there are h columns */
for j := 1 to h do  /* do the following for each column */
begin
    /* compute y_{1} of the current column */
    y_{1,j} := b_{1,j}/l_{1,1};
    for i := 2 to n do  /* process elements on that column */
    begin
        sum := 0;  /* solving for y_{i} of the current column */
        for k := 1 to i-1 do
            sum := sum + l_{i,k} × y_{k,j};
        y_{i,j} := (b_{i,j} - sum)/l_{i,i};
    end
end
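The forward-substitution algorithm above can be sketched in Python as follows. This is a minimal version assuming 0-based lists of lists and nonzero diagonal entries in **L**; the function name `forward_substitution` and the example numbers are ours.

```python
def forward_substitution(L, B):
    """Solve B = L.Y for Y, where L is n-by-n lower triangular
    and B is n-by-h; returns the n-by-h matrix Y."""
    n = len(L)
    h = len(B[0])
    Y = [[0.0] * h for _ in range(n)]
    for j in range(h):                      # one column at a time
        Y[0][j] = B[0][j] / L[0][0]         # y_1 of the current column
        for i in range(1, n):
            s = sum(L[i][k] * Y[k][j] for k in range(i))
            Y[i][j] = (B[i][j] - s) / L[i][i]
    return Y

# Example with a single column (h = 1):
L = [[2.0, 0.0, 0.0],
     [1.0, 3.0, 0.0],
     [4.0, 1.0, 2.0]]
B = [[2.0], [16.0], [12.0]]
print(forward_substitution(L, B))   # [[1.0], [5.0], [1.5]]
```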

After **Y** becomes available, we can solve for **X** from
**Y** = **U**^{.}**X**. Expanding this equation for a particular
column of **Y** and the corresponding column of **X** (again dropping
the column index) yields the following system of equations:

*u*_{11}*x*_{1} + *u*_{12}*x*_{2} + ... + *u*_{1n}*x*_{n} = *y*_{1}
......
*u*_{n-1,n-1}*x*_{n-1} + *u*_{n-1,n}*x*_{n} = *y*_{n-1}
*u*_{nn}*x*_{n} = *y*_{n}

Now, *x*_{n} is immediately available from
equation *n*, because
*x*_{n} = *y*_{n}/*u*_{n,n}.
Once *x*_{n} is available, plugging it into equation *n*-1
and solving for *x*_{n-1} yields
*x*_{n-1} = (*y*_{n-1} -
*u*_{n-1,n}*x*_{n})/
*u*_{n-1,n-1}. Now, we have *x*_{n}
and *x*_{n-1}. Plugging them into equation *n*-2
and solving for *x*_{n-2} yields
*x*_{n-2} = [*y*_{n-2} -
(*u*_{n-2,n-1}*x*_{n-1}
+ *u*_{n-2,n}*x*_{n})]/
*u*_{n-2,n-2}.

From *x*_{n}, *x*_{n-1} and
*x*_{n-2}, we can solve for *x*_{n-3}
from equation *n*-3. In general, after *x*_{n},
*x*_{n-1}, ..., *x*_{i+1} become
available, we can solve for *x*_{i} from equation *i*
using the following relation:

*x*_{i} = (*y*_{i} - (*u*_{i,i+1}*x*_{i+1} + *u*_{i,i+2}*x*_{i+2} + ... + *u*_{i,n}*x*_{n}))/*u*_{i,i}

Repeat this process until *x*_{1} is computed. Then, all unknown
*x*'s are available and the system of linear equations is solved.
The following algorithm summarizes this process:

Input: Matrix Y_{n×h} and an upper triangular matrix U_{n×n}

Output: Matrix X_{n×h} such that Y = U^{.}X holds.

Algorithm:

/* there are h columns */
for j := 1 to h do  /* do the following for each column */
begin
    /* compute x_{n} of the current column */
    x_{n,j} := y_{n,j}/u_{n,n};
    for i := n-1 downto 1 do  /* process elements of that column */
    begin
        sum := 0;  /* solving for x_{i} on the current column */
        for k := i+1 to n do
            sum := sum + u_{i,k} × x_{k,j};
        x_{i,j} := (y_{i,j} - sum)/u_{i,i};
    end
end
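The backward-substitution algorithm can likewise be sketched in Python. As before, this is a minimal version assuming 0-based lists of lists and nonzero diagonal entries in **U**; the function name `backward_substitution` and the example numbers are ours.

```python
def backward_substitution(U, Y):
    """Solve Y = U.X for X, where U is n-by-n upper triangular
    and Y is n-by-h; returns the n-by-h matrix X."""
    n = len(U)
    h = len(Y[0])
    X = [[0.0] * h for _ in range(n)]
    for j in range(h):                          # one column at a time
        X[n-1][j] = Y[n-1][j] / U[n-1][n-1]     # x_n of the current column
        for i in range(n - 2, -1, -1):
            s = sum(U[i][k] * X[k][j] for k in range(i + 1, n))
            X[i][j] = (Y[i][j] - s) / U[i][i]
    return X

# Example with a single column (h = 1):
U = [[1.0, 4.0, 2.0],
     [0.0, 5.0, 1.0],
     [0.0, 0.0, 3.0]]
Y = [[9.0], [7.0], [6.0]]
print(backward_substitution(U, Y))   # [[1.0], [1.0], [2.0]]
```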

This time we work backward, from *x*_{n} backward to
*x*_{1}, and, hence, this process is referred to as
*backward substitution*.

LU-decomposition and forward and backward substitutions, including subroutines/functions implementing them, are covered in many numerical methods textbooks and available in many mathematical libraries. Check your numerical methods textbook for the details.