[...]

[orthogonal vs. orthonormal]

Quote:> "Numerical Recipes" in the eigenvalue chapters has a lot of

> orthonormal words in it when they speak about matrices. I

> always thought that orthonormal means that each axis in the

> matrix has a 90 degree angle to its neighbouring axes. Anyway I

> was referring to that situation.

That's nonstandard terminology. For matrices, all LinAlg references

I've seen define orthogonal as meaning

M * M^T = identity

For sets of basis vectors, "orthogonal" means

v_i . v_j = 0 whenever i != j

whereas "orthonormal" also requires

v_i . v_i = 1

I.e. a set of orthonormal basis vectors, composed into a square matrix,

makes an orthogonal matrix. Confusing, isn't it?
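A quick way to see this numerically (a minimal sketch in Python with NumPy; the angle and variable names are mine, not from the thread):

```python
import numpy as np

# Two orthonormal 2D basis vectors: unit length, mutually perpendicular.
theta = 0.7  # arbitrary rotation angle
v1 = np.array([np.cos(theta), np.sin(theta)])
v2 = np.array([-np.sin(theta), np.cos(theta)])

# Orthonormality of the set: v_i . v_j = 0 for i != j, v_i . v_i = 1.
assert abs(np.dot(v1, v2)) < 1e-12
assert abs(np.dot(v1, v1) - 1) < 1e-12

# Composed into a square matrix M, they satisfy M * M^T = identity,
# i.e. M is an "orthogonal" matrix in the linear-algebra sense.
M = np.column_stack([v1, v2])
print(np.allclose(M @ M.T, np.eye(2)))  # True
```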

Quote:>> What you need are the eigenvalues of your matrices. You get them as

>> solutions to the following equation:

>> det ( M - lambda * identity_matrix ) = 0

[...]

Quote:> I'm working on 2D, couldn't I use 2x2 matrix then?

Of course. I had forgotten the subject line already when I wrote that :-(

Quote:> Are you suggesting in the last sentence of your last

> paragraph that I should just drop out all the complex

> eigenvalues? :)

No. I'm saying you should only look at the absolute values of the

numbers, no matter whether they're complex or not.

[...]

Quote:> I can only solve the eigenvalues using the Hessenberg

> stuff or by solving the polynomial (if it is 2nd degree,

> maybe it's not that complicated)? That is, I cannot use all

> the fancy and fast solvers (Jacobi, Givens/Householder)?

For a mere 2x2, solving the polynomial is going to be faster than any

of the more fancy methods. Those are for larger matrices, mainly. In

particular, for any matrix larger than 4x4 you cannot solve the

polynomial analytically anymore, so you *need* iterative numerical

solutions to find the eigenvalues.
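For the 2x2 case, "solving the polynomial" is just the quadratic formula applied to the characteristic polynomial lambda^2 - trace*lambda + det = 0. A minimal sketch (Python; the function name is mine):

```python
import cmath

def eig2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the characteristic polynomial
    lambda^2 - (a + d)*lambda + (a*d - b*c) = 0, solved with the
    quadratic formula."""
    tr = a + d                             # trace
    det = a * d - b * c                    # determinant
    disc = cmath.sqrt(tr * tr - 4 * det)   # may be complex
    return (tr + disc) / 2, (tr - disc) / 2

print(eig2x2(2, 0, 0, 3))    # diagonal matrix: eigenvalues 3 and 2
print(eig2x2(0, -1, 1, 0))   # 90-degree rotation: +-i, absolute value 1
```

Using cmath.sqrt means the complex case falls out for free, and taking abs() of the results gives exactly the absolute values discussed above.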

--

Even if all the snow were burnt, ashes would remain.