## Gauss-Jordan Elimination

1st October 2002

### Gauss-Jordan Elimination

The steps of the Gauss-Jordan elimination process are as follows:

- Input augmented matrix A
- Get the reduced echelon form of A using elementary row operations. Work column by column, starting with the first column: create a leading 1, then ensure that the column containing it has only 0s above and below that leading 1.
- The matrix in reduced echelon form will either give the solution or demonstrate that there is no solution.

Two matrices A and B are row equivalent if A can be transformed into B (written A =>* B) by a sequence of elementary row operations. Every matrix produced during Gauss-Jordan elimination is row equivalent to the original and to each other.
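The three elementary row operations can be sketched in plain Python. This is my own illustration, not code from the notes; the function names and the list-of-lists matrix representation are assumptions:

```python
# The three elementary row operations, on a matrix stored as a list of row lists.
# Names here are illustrative, not from the original notes.

def swap_rows(m, i, j):
    """Interchange rows i and j."""
    m[i], m[j] = m[j], m[i]

def scale_row(m, i, k):
    """Multiply row i by a non-zero constant k."""
    m[i] = [k * x for x in m[i]]

def add_multiple(m, i, j, k):
    """Add k times row j to row i."""
    m[i] = [a + k * b for a, b in zip(m[i], m[j])]

# Each operation is reversible, which is why it preserves the solution set:
m = [[1.0, 2.0, 1.0, 1.0],
     [2.0, 4.0, -1.0, -1.0]]
add_multiple(m, 1, 0, -2.0)   # R2 := R2 - 2*R1
# m is now [[1, 2, 1, 1], [0, 0, -3, -3]]
```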

#### Example

```
 x1 + 2x2 + x3 =  1
2x1 + 4x2 - x3 = -1
```

The augmented matrix and the row operations:

```
[ 1  2  1  1 ]                    [ 1  2  1  1 ]
[ 2  4 -1 -1 ]   R2 := R2 - 2R1   [ 0  0 -3 -3 ]

                 R2 := R2 / -3    [ 1  2  1  1 ]
                                  [ 0  0  1  1 ]
To fix column 3:
                 R1 := R1 - R2    [ 1  2  0  0 ]
                                  [ 0  0  1  1 ]
```

The last matrix is in reduced echelon form. So:

```
x1 = -2x2
x2 = anything
x3 = 1
```

The system has many solutions.
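The family of solutions found above can be checked numerically. A quick sketch, not part of the original notes:

```python
# Check that x1 = -2t, x2 = t, x3 = 1 satisfies both original equations
# for several values of the free parameter t.
for t in [-3.0, 0.0, 1.5, 7.0]:
    x1, x2, x3 = -2 * t, t, 1.0
    assert x1 + 2 * x2 + x3 == 1       # first equation
    assert 2 * x1 + 4 * x2 - x3 == -1  # second equation
```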

In general, any column without a leading 1 corresponds to a variable that can take any value.

The difficult part of Gauss-Jordan elimination is the bit in the middle: deciding which columns to manipulate and how to convert them into leading 1s. Consider a matrix A_{n x m} in the middle of the computation. Some number i_{l} of rows have leading 1s in them and some number j of columns have been processed, with j ≥ i_{l}.

```
    [ a_{11} ... a_{1m}  b_1 ]
A = [         ...            ]
    [ a_{n1} ... a_{nm}  b_n ]
```

The matrix now looks like this (the first j columns have been processed, and the leading 1s sit in the first i_{l} rows):

```
       <-------- j -------->
[ 0 0 ... 0  1  0 ... 0  ... ]
[ 0 0 ... 0  0  1 ... 0  ... ]    i_l rows with leading 1s
[ 0 0 ... 0  0  0 ... 1  ... ]
[ 0 0 ... 0  0  0 ... 0  ... ]    remaining rows: all 0 in the
[ 0 0 ... 0  0  0 ... 0  ... ]    first j columns
```

Next step: increment j and look in column j for an entry a_{ij} ≠ 0 with i > i_{l}. If one is found, interchange row i and row i_{l} + 1; if nothing is found, move on to the next column. Then create a leading 1 and reduce the rest of column j.

#### Pseudo code for Gauss-Jordan elimination

```
i_l = 0                          (number of leading 1s created)
for j = 1 ... m do:
    if there is an i > i_l such that a_ij ≠ 0 then:
        i_l = i_l + 1
        interchange row i and row i_l
            (this brings the new row just below the rows already done)
        divide row i_l by a_{i_l j}, creating a leading 1
        reduce all other entries in column j to 0
```
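The pseudo code translates fairly directly into Python. This is my own sketch, not the notes' code; the function name and the in-place list-of-lists representation are assumptions:

```python
def gauss_jordan(a):
    """Reduce an augmented matrix (a list of row lists of floats) to
    reduced echelon form in place, following the pseudo code above."""
    n = len(a)
    m = len(a[0])
    il = 0                                 # number of leading 1s created so far
    for j in range(m):
        # Look at the unfinished rows for a non-zero entry in column j.
        pivot = next((i for i in range(il, n) if a[i][j] != 0), None)
        if pivot is None:
            continue                       # nothing found: move on to next column
        a[il], a[pivot] = a[pivot], a[il]  # interchange rows
        p = a[il][j]
        a[il] = [x / p for x in a[il]]     # divide through: create a leading 1
        for i in range(n):                 # reduce other entries in column j to 0
            if i != il and a[i][j] != 0:
                k = a[i][j]
                a[i] = [x - k * y for x, y in zip(a[i], a[il])]
        il += 1
    return a

# The worked example from earlier:
a = [[1.0, 2.0, 1.0, 1.0],
     [2.0, 4.0, -1.0, -1.0]]
gauss_jordan(a)
# a is now [[1, 2, 0, 0], [0, 0, 1, 1]]
```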

#### Complexity

Suppose A is n x m. We can measure complexity by counting the +, -, *, / operations that are carried out, which we can read off from the pseudo code. The main loop runs m times, and on each pass:

- the interchange operation takes at most 2m operations
- dividing row i_{l} by a_{i_l j} takes m - j operations (the earlier entries in the row are already 0)
- reducing the other entries in the column to 0 takes about nm operations

The nm term dominates, so the complexity is O(m · nm), i.e. O(nm^{2}).
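The bound can be checked empirically by instrumenting a straightforward elimination pass with an operation counter. This is an illustrative sketch, not from the original notes:

```python
# Count the +, -, *, / operations performed by a Gauss-Jordan pass on a
# random n x m matrix and compare against the n*m^2 bound.
import random

def count_ops(n, m):
    a = [[random.random() for _ in range(m)] for _ in range(n)]
    ops = 0
    il = 0
    for j in range(m):
        pivot = next((i for i in range(il, n) if a[i][j] != 0), None)
        if pivot is None:
            continue
        a[il], a[pivot] = a[pivot], a[il]
        p = a[il][j]
        a[il] = [x / p for x in a[il]]
        ops += m                           # one division per entry in the row
        for i in range(n):
            if i != il and a[i][j] != 0:
                k = a[i][j]
                a[i] = [x - k * y for x, y in zip(a[i], a[il])]
                ops += 2 * m               # one * and one - per entry
        il += 1
    return ops

# Each pivot column costs at most m + 2m(n-1) <= 2nm operations, and there
# are at most min(n, m) pivot columns, so the total stays within O(n m^2):
for n, m in [(5, 10), (10, 20), (20, 40)]:
    assert count_ops(n, m) <= 2 * n * m * m
```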

#### Additional Resources

- Wikipedia has an excellent explanation of Gauss-Jordan Elimination
- The Gauss-Jordan elimination game is a JavaScript puzzle
- This page explains the Gauss-Jordan algorithm in a bit more depth

### Homogeneous Systems

A system of linear equations is homogeneous if all of the constant terms are 0. For example:

```
3x1 + 2x2 = 0
 x1 -  x2 = 0
```

A homogeneous system in n variables *always* has the trivial solution x_{1} = x_{2} = ... = x_{n} = 0, so there is no chance that the system has no solutions; the possible outcomes are reduced from three to two (a unique solution or many solutions).

#### Theorem

A homogeneous system with more unknowns than equations always has many solutions.

#### Proof

Suppose the system has n equations with m unknowns, and m > n.

Let A = the augmented matrix

Find R = reduced echelon form of A using Gauss-Jordan elimination.

```
|<-------- m -------->|
[ 1 . . . . . . . . ]
[ 0 1 . . . . . . . ]
[ 0 0 1 . . . . . . ]   n rows
[ 0 0 0 1 . . . . . ]
[ 0 0 0 0 1 . . . . ]
```

R has ≤ n leading 1s, so at least m-n of the variables can take *any* value. This shows that there are many solutions.

#### Example

```
x1 + 2x2 + x3 = 0
      x2 - x3 = 0
```

This is the intersection of two planes through the origin in 3D space.
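The intersection line can be written down explicitly: row reducing gives x_{2} = x_{3}, so setting x_{3} = t yields x_{1} = -3t, x_{2} = t, x_{3} = t. A quick numerical check (the parametrisation is my own, not from the notes):

```python
# The intersection of the two planes is the line x1 = -3t, x2 = t, x3 = t.
for t in [-2.0, 0.0, 1.0, 4.5]:
    x1, x2, x3 = -3 * t, t, t
    assert x1 + 2 * x2 + x3 == 0   # first plane
    assert x2 - x3 == 0            # second plane
```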

#### Matrix notation for a system of equations

The system

```
a11 x1 + ... + a1m xm = b1
          ...
an1 x1 + ... + anm xm = bn
```

can be written as AX = B, where

```
    [ a11 ... a1m ]        [ x1 ]        [ b1 ]
A = [     ...     ]    X = [ .. ]    B = [ .. ]
    [ an1 ... anm ]        [ xm ]        [ bn ]
```
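The product AX is the vector whose i-th entry is the i-th left-hand side, which is what lets AX = B stand in for the whole system. A small sketch (the `matvec` helper is my own, not from the notes):

```python
# AX gathers the left-hand sides of all n equations into one vector,
# so the whole system is the single equation AX = B.
def matvec(a, x):
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in a]

a = [[1.0, 2.0, 1.0],
     [2.0, 4.0, -1.0]]
x = [-2.0, 1.0, 1.0]     # a solution of the earlier example (free variable = 1)
b = matvec(a, x)         # -> [1.0, -1.0]
```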
