
E. T. S. I. Caminos, Canales y Puertos 1

Engineering Computation

Lecture 6

E. T. S. I. Caminos, Canales y Puertos 2

Forward Elimination:

DO k = 1 to n-1
  DO i = k+1 to n
    r = A(i,k)/A(k,k)
    DO j = k+1 to n
      A(i,j) = A(i,j) - r*A(k,j)
    ENDDO
    B(i) = B(i) - r*B(k)
  ENDDO
ENDDO
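A direct transcription of this pseudocode into Python/NumPy might look like the following sketch (0-based indexing, no pivoting; the function name and in-place convention are illustrative, not part of the original slides):

```python
import numpy as np

def forward_elimination(A, B):
    """Reduce [A]{x} = {B} to upper-triangular form, mirroring the pseudocode above.
    Works in place on A and B; assumes nonzero pivots (no pivoting)."""
    n = len(B)
    for k in range(n - 1):                 # DO k = 1 to n-1
        for i in range(k + 1, n):          # DO i = k+1 to n
            r = A[i, k] / A[k, k]          # elimination factor
            for j in range(k + 1, n):      # DO j = k+1 to n
                A[i, j] -= r * A[k, j]
            B[i] -= r * B[k]
    return A, B
```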

Gauss elimination (operation counting)

E. T. S. I. Caminos, Canales y Puertos 3

Operation Counting for Gaussian Elimination

Back Substitution:

X(n) = B(n)/A(n,n)
DO i = n-1 to 1 by -1
  SUM = 0
  DO j = i+1 to n
    SUM = SUM + A(i,j)*X(j)
  ENDDO
  X(i) = [B(i) - SUM]/A(i,i)
ENDDO
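The corresponding back-substitution pass, again as a hedged Python sketch (it assumes `A` and `B` have already been processed by the forward-elimination sketch above):

```python
import numpy as np

def back_substitution(A, B):
    """Solve the upper-triangular system left by forward elimination."""
    n = len(B)
    X = np.zeros(n)
    X[n - 1] = B[n - 1] / A[n - 1, n - 1]   # X(n) = B(n)/A(n,n)
    for i in range(n - 2, -1, -1):          # DO i = n-1 to 1 by -1
        s = 0.0
        for j in range(i + 1, n):           # DO j = i+1 to n
            s += A[i, j] * X[j]
        X[i] = (B[i] - s) / A[i, i]         # X(i) = [B(i) - SUM]/A(i,i)
    return X
```

Calling forward_elimination and then back_substitution on a copy of a system reproduces the Gauss elimination solution.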

Gauss elimination (operation counting)

E. T. S. I. Caminos, Canales y Puertos 4

Operation Counting for Gaussian Elimination

Forward Elimination

Inner loop:

$$\sum_{j=k+1}^{n} 1 = n - (k+1) + 1 = n - k$$

Second loop:

$$\sum_{i=k+1}^{n} \bigl[2 + (n - k)\bigr] = (2 + n - k)(n - k) = (n^2 + 2n) - 2(n+1)k + k^2$$

Gauss elimination (operation counting)

E. T. S. I. Caminos, Canales y Puertos 5

Operation Counting for Gaussian Elimination

Forward Elimination (cont'd)

Outer loop:

$$\sum_{k=1}^{n-1}\bigl[(n^2 + 2n) - 2(n+1)k + k^2\bigr]
= (n^2 + 2n)\sum_{k=1}^{n-1} 1 \;-\; 2(n+1)\sum_{k=1}^{n-1} k \;+\; \sum_{k=1}^{n-1} k^2$$

$$= (n^2 + 2n)(n-1) \;-\; 2(n+1)\,\frac{(n-1)\,n}{2} \;+\; \frac{(n-1)\,n\,(2n-1)}{6}
= \frac{n^3}{3} + O(n^2)$$
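The closed form can be checked numerically: the sketch below tallies the per-row cost 2 + (n - k) exactly as counted on the previous slide and compares it with the expression derived above (whose leading term is n^3/3). The function names are illustrative.

```python
def forward_flops(n):
    """Tally forward-elimination flops using the per-row count 2 + (n - k)."""
    return sum(2 + (n - k) for k in range(1, n) for _ in range(k + 1, n + 1))

def closed_form(n):
    """The expression derived above, before dropping lower-order terms."""
    return (n**2 + 2*n)*(n - 1) - 2*(n + 1)*(n - 1)*n//2 + (n - 1)*n*(2*n - 1)//6

for n in (5, 10, 100):
    print(n, forward_flops(n), closed_form(n), n**3 // 3)
```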

Gauss elimination (operation counting)

E. T. S. I. Caminos, Canales y Puertos 6

Operation Counting for Gaussian Elimination

Back Substitution

Inner loop:

$$\sum_{j=i+1}^{n} 1 = n - (i+1) + 1 = n - i$$

Outer loop:

$$\sum_{i=1}^{n-1}\bigl[1 + (n - i)\bigr] = (1+n)\sum_{i=1}^{n-1} 1 \;-\; \sum_{i=1}^{n-1} i
= (1+n)(n-1) - \frac{(n-1)\,n}{2} = \frac{n^2}{2} + O(n)$$
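A quick numerical check of this count, in the same spirit (the per-row cost 1 + (n - i) is taken from the slide):

```python
def back_sub_flops(n):
    """Tally back-substitution flops using the per-row count 1 + (n - i)."""
    return sum(1 + (n - i) for i in range(1, n))

for n in (5, 10, 100):
    exact = (1 + n)*(n - 1) - (n - 1)*n//2   # closed form from the slide
    print(n, back_sub_flops(n), exact, n**2 // 2)
```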

Gauss elimination (operation counting)

E. T. S. I. Caminos, Canales y Puertos 7

Total flops = Forward Elimination + Back Substitution = n^3/3 + O(n^2) + n^2/2 + O(n) ≈ n^3/3 + O(n^2)

To convert (A, b) to (U, b') requires n^3/3 flops, plus terms of order n^2 and smaller.

To back solve requires:

1 + 2 + 3 + 4 + . . . + n = n (n+1) / 2 flops;

Grand Total: the entire effort requires n^3/3 + O(n^2) flops.
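A few sample values illustrate how quickly the elimination term dominates the back-substitution term:

```python
for n in (10, 100, 1000):
    elim = n**3 / 3          # leading term, forward elimination
    back = n**2 / 2          # leading term, back substitution
    print(f"n = {n:5d}:  n^3/3 = {elim:12.0f}   n^2/2 = {back:9.0f}   ratio = {elim/back:8.1f}")
```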

Gauss elimination (operation counting)

E. T. S. I. Caminos, Canales y Puertos 8

Diagonalization by both forward and backward elimination in each column.

Perform elimination both backwards and forwards until the system

$$\begin{bmatrix}
a_{11} & a_{12} & a_{13} & \cdots & a_{1n}\\
a_{21} & a_{22} & a_{23} & \cdots & a_{2n}\\
a_{31} & a_{32} & a_{33} & \cdots & a_{3n}\\
\vdots & \vdots & \vdots &        & \vdots\\
a_{n1} & a_{n2} & a_{n3} & \cdots & a_{nn}
\end{bmatrix}
\begin{Bmatrix} x_1\\ x_2\\ x_3\\ \vdots\\ x_n \end{Bmatrix}
=
\begin{Bmatrix} b_1\\ b_2\\ b_3\\ \vdots\\ b_n \end{Bmatrix}$$

becomes

$$\begin{bmatrix}
1 & 0 & 0 & \cdots & 0\\
0 & 1 & 0 & \cdots & 0\\
0 & 0 & 1 & \cdots & 0\\
\vdots & \vdots & \vdots & & \vdots\\
0 & 0 & 0 & \cdots & 1
\end{bmatrix}
\begin{Bmatrix} x_1\\ x_2\\ x_3\\ \vdots\\ x_n \end{Bmatrix}
=
\begin{Bmatrix} b_1^*\\ b_2^*\\ b_3^*\\ \vdots\\ b_n^* \end{Bmatrix}$$

Operation count for Gauss-Jordan: n^3/2 + O(n^2) (slower than Gauss elimination).
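A minimal Gauss-Jordan sketch of this process (no pivoting; the function name is illustrative): each pivot row is normalized and then used to eliminate its column both above and below the diagonal, so the augmented matrix [A | b] ends up as [I | x].

```python
import numpy as np

def gauss_jordan_solve(A, b):
    """Reduce [A | b] to [I | x]; assumes nonzero pivots (no pivoting)."""
    n = len(b)
    aug = np.hstack([A.astype(float), b.reshape(-1, 1).astype(float)])
    for k in range(n):
        aug[k, :] /= aug[k, k]                        # normalize pivot row
        for i in range(n):
            if i != k:
                aug[i, :] -= aug[i, k] * aug[k, :]    # eliminate above and below
    return aug[:, -1]
```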

Gauss-Jordan Elimination

E. T. S. I. Caminos, Canales y Puertos 9

Example (two-digit arithmetic):

$$\left[\begin{array}{ccc|c} 50 & 1 & 2 & 1\\ 1 & 40 & 4 & 2\\ 2 & 6 & 30 & 3 \end{array}\right]
\;\rightarrow\;
\left[\begin{array}{ccc|c} 1 & 0.02 & 0.04 & 0.02\\ 0 & 40 & 4 & 2\\ 0 & 6 & 30 & 3 \end{array}\right]
\;\rightarrow\;
\left[\begin{array}{ccc|c} 1 & 0 & 0.038 & 0.019\\ 0 & 1 & 0.1 & 0.05\\ 0 & 0 & 29 & 2.7 \end{array}\right]
\;\rightarrow\;
\left[\begin{array}{ccc|c} 1 & 0 & 0 & 0.015\\ 0 & 1 & 0 & 0.041\\ 0 & 0 & 1 & 0.093 \end{array}\right]$$

x1 = 0.015 (vs. 0.016, εt = 6.3%)
x2 = 0.041 (vs. 0.041, εt = 0%)
x3 = 0.093 (vs. 0.091, εt = 2.2%)
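The "true" values quoted above are just the full-precision solution of the same 3 x 3 system, which is easy to confirm (a sketch using NumPy):

```python
import numpy as np

A = np.array([[50.0,  1.0,  2.0],
              [ 1.0, 40.0,  4.0],
              [ 2.0,  6.0, 30.0]])
b = np.array([1.0, 2.0, 3.0])

print(np.linalg.solve(A, b))   # ~ [0.0156, 0.0405, 0.0909], i.e. 0.016, 0.041, 0.091
```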

Gauss-Jordan Elimination

E. T. S. I. Caminos, Canales y Puertos 10

The solution of: [A]{x} = {b} is: {x} = [A]-1{b}

where [A]-1 is the inverse matrix of [A]

Consider: [A] [A]-1 = [ I ]

1) Create the augmented matrix: [ A | I ]

2) Apply Gauss-Jordan elimination:

==> [ I | A-1 ]
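A sketch of this procedure (no pivoting; illustrative function name), which is the same Gauss-Jordan loop as before but applied to the n columns of the identity at once:

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by reducing [A | I] to [I | A^-1]; assumes nonzero pivots."""
    n = A.shape[0]
    aug = np.hstack([A.astype(float), np.eye(n)])
    for k in range(n):
        aug[k, :] /= aug[k, k]
        for i in range(n):
            if i != k:
                aug[i, :] -= aug[i, k] * aug[k, :]
    return aug[:, n:]
```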

Gauss-Jordan Matrix Inversion

E. T. S. I. Caminos, Canales y Puertos 11

Gauss-Jordan Matrix Inversion (with 2 digit arithmetic):

$$[\,A \mid I\,] =
\left[\begin{array}{ccc|ccc}
50 & 1 & 2 & 1 & 0 & 0\\
1 & 40 & 4 & 0 & 1 & 0\\
2 & 6 & 30 & 0 & 0 & 1
\end{array}\right]
\;\rightarrow\;
\left[\begin{array}{ccc|ccc}
1 & 0.02 & 0.04 & 0.02 & 0 & 0\\
0 & 40 & 4 & -0.02 & 1 & 0\\
0 & 6 & 30 & -0.04 & 0 & 1
\end{array}\right]$$

$$\rightarrow\;
\left[\begin{array}{ccc|ccc}
1 & 0 & 0.038 & 0.02 & -0.0005 & 0\\
0 & 1 & 0.1 & -0.0005 & 0.025 & 0\\
0 & 0 & 28 & -0.037 & -0.15 & 1
\end{array}\right]
\;\rightarrow\;
\left[\begin{array}{ccc|ccc}
1 & 0 & 0 & 0.02 & -0.00029 & -0.0014\\
0 & 1 & 0 & -0.00037 & 0.026 & -0.0036\\
0 & 0 & 1 & -0.0013 & -0.0054 & 0.036
\end{array}\right]$$

MATRIX INVERSE [A]^-1 = right-hand block of the final tableau.

Gauss-Jordan Matrix Inversion

E. T. S. I. Caminos, Canales y Puertos 12

CHECK: [A][A]^-1 ≈ [I]

$$\begin{bmatrix} 50 & 1 & 2\\ 1 & 40 & 4\\ 2 & 6 & 30 \end{bmatrix}
\begin{bmatrix} 0.020 & -0.00029 & -0.0014\\ -0.00037 & 0.026 & -0.0036\\ -0.0013 & -0.0054 & 0.036 \end{bmatrix}
=
\begin{bmatrix} 0.997 & 0.13 & 0.002\\ 0.000 & 1.016 & 0.001\\ 0.001 & 0.012 & 1.056 \end{bmatrix}$$

Solving with the inverse: [A]^-1{b} = {x}

$$\begin{bmatrix} 0.020 & -0.00029 & -0.0014\\ -0.00037 & 0.026 & -0.0036\\ -0.0013 & -0.0054 & 0.036 \end{bmatrix}
\begin{Bmatrix} 1\\ 2\\ 3 \end{Bmatrix}
=
\begin{Bmatrix} 0.015\\ 0.033\\ 0.099 \end{Bmatrix}
\qquad\text{vs.}\qquad
\{x\}_{\text{true}} = \begin{Bmatrix} 0.016\\ 0.041\\ 0.091 \end{Bmatrix},
\quad
\{x\}_{\text{Gaussian elimination}} = \begin{Bmatrix} 0.016\\ 0.040\\ 0.093 \end{Bmatrix}$$

Gauss-Jordan Matrix Inversion

E. T. S. I. Caminos, Canales y Puertos 13

LU decomposition

• LU decomposition uses the same elimination techniques to factor the matrix A into a product of triangular matrices. This is especially useful for solving systems with several different right-hand-side vectors b, because the same decomposition of A can be reused to solve every case efficiently by forward and backward substitution.

A = L U
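This reuse is exactly what library routines expose: the factorization is computed once and then applied to as many right-hand sides as needed. A sketch, assuming SciPy is available (the matrix is the 3 x 3 example used earlier):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[50.0,  1.0,  2.0],
              [ 1.0, 40.0,  4.0],
              [ 2.0,  6.0, 30.0]])

lu, piv = lu_factor(A)                    # ~n^3/3 flops, done once
for b in (np.array([1.0, 2.0, 3.0]),      # each additional solve is only ~n^2 flops
          np.array([0.0, 1.0, 0.0])):
    print(lu_solve((lu, piv), b))
```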

E. T. S. I. Caminos, Canales y Puertos 14

LU decomposition

Decomposition:        A = L U

Initial system:       A X = B

Transformed system 1: L U X = B   (define U X = D)

Transformed system 2: L D = B

Forward substitution on L D = B gives D; backward substitution on U X = D then gives X.
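The two substitution passes in this flow can be written directly. A sketch, assuming L is unit lower triangular (as produced by Gauss elimination) and U is upper triangular; the function names are illustrative:

```python
import numpy as np

def forward_substitution(L, b):
    """Solve [L]{d} = {b} with L unit lower triangular."""
    n = len(b)
    d = np.zeros(n)
    for i in range(n):
        d[i] = b[i] - L[i, :i] @ d[:i]          # diagonal of L is 1, so no division
    return d

def backward_substitution(U, d):
    """Solve [U]{x} = {d} with U upper triangular."""
    n = len(d)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (d[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x
```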

E. T. S. I. Caminos, Canales y Puertos 15

LU decomposition

• LU decomposition is closely related to the Gauss method, because the upper triangular matrix U it seeks is the same one produced by the elimination; only the lower triangular matrix L remains to be found.

• Surprisingly, the Gauss elimination procedure produces this matrix L as a by-product, even though one is usually not aware of it: the factors used to create the zeros below the main diagonal are exactly the elements of L (sketched below).
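A sketch that makes this explicit: the same elimination loop as before, except that each factor r is stored below the diagonal instead of being discarded, so L and U come out of a single pass (no pivoting; the function name is illustrative).

```python
import numpy as np

def doolittle_lu(A):
    """Gauss elimination that keeps its factors: returns unit lower L and upper U
    with A = L @ U (assumes nonzero pivots)."""
    A = A.astype(float).copy()
    n = A.shape[0]
    for k in range(n - 1):
        for i in range(k + 1, n):
            r = A[i, k] / A[k, k]
            A[i, k] = r                          # the multiplier is L[i, k]
            A[i, k + 1:] -= r * A[k, k + 1:]
    L = np.tril(A, -1) + np.eye(n)               # unit diagonal
    U = np.triu(A)
    return L, U
```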


E. T. S. I. Caminos, Canales y Puertos 16

LU decomposition

E. T. S. I. Caminos, Canales y Puertos 17

Basic Approach

Consider [A]{x} = {b}

a) Gauss-type "decomposition" of [A] into [L][U]: n^3/3 flops.
   [A]{x} = {b} becomes [L][U]{x} = {b}; define [U]{x} ≡ {d}.

b) First solve [L]{d} = {b} for {d} by forward substitution: n^2/2 flops.

c) Then solve [U]{x} = {d} for {x} by back substitution: n^2/2 flops.

LU decomposition (Complexity)

E. T. S. I. Caminos, Canales y Puertos 18

Unit lower / strictly lower, strictly upper / unit upper, and diagonal parts (shown schematically):

$$[L_1] = \begin{bmatrix} 1 & 0\\ a_{ij} & 1 \end{bmatrix}, \quad
[L_0] = \begin{bmatrix} 0 & 0\\ a_{ij} & 0 \end{bmatrix}, \quad
[U_0] = \begin{bmatrix} 0 & a_{ij}\\ 0 & 0 \end{bmatrix}, \quad
[U_1] = \begin{bmatrix} 1 & a_{ij}\\ 0 & 1 \end{bmatrix}, \quad
[D] = \operatorname{diag}(a_{11}, \ldots, a_{nn})$$

Additive splittings:

[A] = [L] + [U0]
[A] = [L0] + [U]
[A] = [L0] + [U0] + [D]

General triangular factors:

$$[L] = \begin{bmatrix} a_{11} & 0\\ a_{ij} & a_{nn} \end{bmatrix}, \quad
[U] = \begin{bmatrix} a_{11} & a_{ij}\\ 0 & a_{nn} \end{bmatrix}$$

Multiplicative decompositions:

[A] = [L1] [U]   (Doolittle)
[A] = [L] [U1]   (Crout)
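These additive splittings are easy to verify numerically with NumPy's triangle helpers (a small sketch on an arbitrary 3 x 3 matrix):

```python
import numpy as np

A  = np.arange(1.0, 10.0).reshape(3, 3)
L0 = np.tril(A, -1)            # strictly lower part
U0 = np.triu(A,  1)            # strictly upper part
D  = np.diag(np.diag(A))       # diagonal part

assert np.allclose(A, L0 + U0 + D)       # [A] = [L0] + [U0] + [D]
assert np.allclose(A, (L0 + D) + U0)     # [A] = [L]  + [U0]
assert np.allclose(A, L0 + (U0 + D))     # [A] = [L0] + [U]
```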

LU Decomposition: notation

E. T. S. I. Caminos, Canales y Puertos 19

LU Decomposition Variations

Doolittle    [L1][U]      General [A]
Crout        [L][U1]      General [A]
Cholesky     [L][L]^T     Positive definite, symmetric [A]

Cholesky works only for positive definite symmetric matrices.

Doolittle versus Crout:
• Doolittle simply stores the Gaussian elimination factors, whereas Crout uses a different sequence of calculations (see C&C 10.1.4).
• Both decompose [A] into [L] and [U] in n^3/3 FLOPS.
• They differ in the location of the diagonal of 1's (see the sketch after this list).
• Crout uses each element of [A] only once, so the same array can be used for [A] and [L\U], saving computer memory!
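The Doolittle and Crout forms differ only by a diagonal rescaling: if A = [L1][U], then pulling the diagonal D of U across gives A = ([L1]D)(D^-1[U]) = [L][U1]. A small sketch of that conversion (the 2 x 2 factors are arbitrary illustration values):

```python
import numpy as np

# Doolittle factors [L1][U] of a small example matrix (1's on the diagonal of L1)
L1 = np.array([[1.0, 0.0],
               [0.5, 1.0]])
U  = np.array([[4.0, 2.0],
               [0.0, 3.0]])
A  = L1 @ U

# Move the diagonal of U across to obtain the Crout factors [L][U1]
D  = np.diag(np.diag(U))
L  = L1 @ D                        # lower triangular, general diagonal
U1 = np.linalg.solve(D, U)         # unit upper triangular
assert np.allclose(A, L @ U1)      # same matrix, 1's on the other diagonal
```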

LU decomposition

E. T. S. I. Caminos, Canales y Puertos 20

Matrix Inversion

Definition of a matrix inverse:

[A] [A]-1 = [ I ]

==> [A] {x} = {b}

[A]-1 {b} = {x}

First Rule: Don’t do it.

(numerically unstable calculation)

LU decomposition

E. T. S. I. Caminos, Canales y Puertos 21

Matrix Inversion

If you really must --

1) Gaussian elimination: [A | I] --> [U | B'] ==> A^-1

2) Gauss-Jordan: [A | I] ==> [I | A^-1]

Inversion will take n^3 + O(n^2) flops if one is careful about where the zeros are (taking advantage of the sparseness of the unit-vector right-hand sides).

Naive applications (without optimization) take 4n^3/3 + O(n^2) flops. For example, LU decomposition requires n^3/3 + O(n^2) flops; back solving twice with each of the n unit vectors e_i costs 2 n (n^2/2) = n^3 flops.

Altogether: n^3/3 + n^3 = 4n^3/3 + O(n^2) flops.
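The "LU decomposition plus n unit vectors" bookkeeping described above can be sketched with SciPy: one factorization, then one O(n^2) solve per column of the inverse.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[50.0,  1.0,  2.0],
              [ 1.0, 40.0,  4.0],
              [ 2.0,  6.0, 30.0]])
n = A.shape[0]

lu, piv = lu_factor(A)                                   # ~n^3/3 flops, once
Ainv = np.column_stack([lu_solve((lu, piv), e)           # one column per unit vector e_i
                        for e in np.eye(n)])

assert np.allclose(Ainv, np.linalg.inv(A))
```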

LU decomposition

E. T. S. I. Caminos, Canales y Puertos 22

Summary

FLOP Counts for Linear Algebraic Equations, [A]{x} = {b}

Gaussian Elimination (1 r.h.s.)          n^3/3 + O(n^2)
Gauss-Jordan (1 r.h.s.)                  n^3/2 + O(n^2)
LU decomposition                         n^3/3 + O(n^2)
Each extra LU right-hand side            n^2
Cholesky decomposition (symmetric A)     n^3/6 + O(n^2)
Inversion (naive Gauss-Jordan)           4n^3/3 + O(n^2)
Inversion (optimal Gauss-Jordan)         n^3 + O(n^2)
Solution by Cramer's Rule                n!

FLOP Counts for Linear Algebraic Equations