Introduction

These methods find the solution of a system of linear equations. The definition of a system of linear equations can be found in this post.

Method

The Gauss-Seidel and Jacobi methods are very similar to each other. They work as follows: they start from dummy values (generally zeros). Then each equation must be solved for one of the variables, with every variable isolated in exactly one equation, so there are n! possible arrangements. To guarantee a convergent arrangement, the matrix must have a dominant diagonal. This means that, for each row i,

|a_{ii}| > \sum_{j \neq i} |a_{ij}|

The difference between Gauss-Seidel and Jacobi is that Gauss-Seidel uses the new value of each variable as soon as it is calculated, while Jacobi computes the k-th iteration entirely from the values of the (k-1)-th iteration.
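In update form (standard notation, not taken from the post), the two iterations for row i of the arranged system can be written as:

    % Jacobi: iteration k uses only values from iteration k-1
    x_i^{(k)} = \frac{1}{a_{ii}} \Bigl( b_i - \sum_{j \neq i} a_{ij} x_j^{(k-1)} \Bigr)

    % Gauss-Seidel: components already updated in iteration k are used immediately
    x_i^{(k)} = \frac{1}{a_{ii}} \Bigl( b_i - \sum_{j < i} a_{ij} x_j^{(k)} - \sum_{j > i} a_{ij} x_j^{(k-1)} \Bigr)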

Examples

  • A = [7., 2.; -21., -5.], B = [16.; 50.], R = [-25.7; 98.]
  • A = [2., -3.; 4., 1.], B = [-2.; 24.], R = [5.; 4.]
  • A = [3., -0.1, -0.2; 0.1, 7., -0.3; 0.3, -0.2, 10.], B = [7.85; -19.3; 71.4], R = [3.; -2.5; 7.]
  • A = [0.3, 0.52, 1.; 0.5, 1., 1.9; 0.1, 0.3, 0.5], B = [-0.01; 0.67; -0.44], R = [-14.9; -29.5; 19.8]
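These reference solutions can be checked quickly (a sketch using NumPy, not part of the original post) by confirming that A·R reproduces B up to the rounding of the listed values:

    import numpy as np

    # Third example from the list above
    A = np.array([[3., -0.1, -0.2], [0.1, 7., -0.3], [0.3, -0.2, 10.]])
    B = np.array([7.85, -19.3, 71.4])
    R = np.array([3., -2.5, 7.])

    print(np.allclose(A @ R, B, atol=1e-2))  # True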

Singularities

These are iterative methods, which means they repeat the computation until an error tolerance is met. Another point about these algorithms is that the running time also depends on how long it takes to find a suitable permutation of the equations. As a prerequisite, these methods require a diagonally dominant matrix.
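The prerequisite can be tested directly. The sketch below (the helper names are my own) checks strict diagonal dominance and, if necessary, searches the n! row orderings mentioned above, which is what makes the setup potentially expensive:

    from itertools import permutations
    import numpy as np

    def is_diagonally_dominant(A):
        # Strict dominance: |a_ii| > sum of |a_ij| for j != i, on every row
        A = np.abs(np.asarray(A, dtype=float))
        return all(A[i, i] > A[i].sum() - A[i, i] for i in range(len(A)))

    def dominant_row_order(A):
        # Brute-force search over the n! row orderings (costly for large n)
        A = np.asarray(A, dtype=float)
        for order in permutations(range(len(A))):
            if is_diagonally_dominant(A[list(order)]):
                return order
        return None  # no dominant arrangement exists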

Flowchart

Code
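The original listing is not reproduced here; what follows is a minimal Python sketch of both iterations, assuming a strictly diagonally dominant A and stopping once the largest change between iterations falls below a tolerance (all names and defaults are my own):

    import numpy as np

    def solve_iterative(A, B, tol=1e-6, max_iter=500, gauss_seidel=True):
        A, B = np.asarray(A, dtype=float), np.asarray(B, dtype=float)
        n = len(B)
        x = np.zeros(n)                          # dummy starting values
        for _ in range(max_iter):
            x_old = x.copy()
            for i in range(n):
                # Gauss-Seidel reads already-updated values from x;
                # Jacobi reads only the previous iteration, x_old
                src = x if gauss_seidel else x_old
                s = np.dot(A[i], src) - A[i, i] * src[i]
                x[i] = (B[i] - s) / A[i, i]
            if np.max(np.abs(x - x_old)) < tol:  # error margin met
                return x
        return x                                 # may not have converged

    # Example: the third system from the list above
    x = solve_iterative([[3., -0.1, -0.2], [0.1, 7., -0.3], [0.3, -0.2, 10.]],
                        [7.85, -19.3, 71.4])
    print(np.round(x, 4))                        # approximately [3., -2.5, 7.]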

Conclusions

The Gauss-Seidel and Jacobi methods are good choices for systems where memory is limited. However, the iterative execution and the n! possible permutations can make them slow. Together with Cramer's rule, they are not recommended for big systems.
