## Extracting scale from a 2D matrix

### Extracting scale from a 2D matrix

Hi,
I have a set of 2D affine transformation matrices which
are used to transform a Bézier path, and I need an average
scale factor of those matrices. The scale factor is then
used to determine the tessellation factor of the Bézier path,
so that I can use the same tessellated path with all the
matrices without too much loss in visual quality.

The problem is that I don't know how to extract the scale
factor from the matrix. The matrices can have rotation,
scale, skew and translation in any order. Since I'm only
going to use the scale to determine the tessellation level,
a good estimate of the scale is enough.

I know that as long as the matrix is orthogonal (or is it
orthonormal... anyway) the scale factor of the x-axis is the
length of the x-axis in the matrix (in my case the first
row), but I guess the possible presence of a skew
transformation ruins my chances here.

--memon

### Extracting scale from a 2D matrix

> The problem is that I don't know how to extract the scale
> factor from the matrix. The matrices can have rotation,
> scale, skew and translation in any order. [...]

> I know that as long as the matrix is orthogonal (or is it
> orthonormal... anyway) the scale factor of the x-axis is the
> length of the x-axis in the matrix (in my case the first
> row),

No. For an orthogonal matrix, the scale is always 1.0, by definition
of 'orthogonal' (note: there's a difference between the meanings of
'orthogonal' for matrices and for sets of basis vectors, and the
qualifier 'orthonormal' doesn't exist for matrices).

What you need are the eigenvalues of your matrices. You get them as
solutions to the following equation:

det ( M - lambda * identity_matrix ) = 0

For a 3x3 matrix (translation is irrelevant, here), you end up with a
3rd degree polynomial equation in lambda. The solutions of this
equation are the three eigenvalues of matrix M (two of which may be
complex, and conjugates of each other). For an orthogonal matrix,
they're all of absolute value 1, but in your case, this will differ.

For uniform scaling by a factor 'f', you'll find all eigenvalues of
absolute value 'f'. For non-uniform scaling, you get three different
eigenvalues. I'll leave it up to you what the right single number to
extract from this would then be.
--

Even if all the snow were burnt, ashes would remain.

### Extracting scale from a 2D matrix

> > The problem is that I don't know how to extract the scale
> > factor from the matrix. The matrices can have rotation,
> > scale, skew and translation in any order. [...]

> > I know that as long as the matrix is orthogonal (or is it
> > orthonormal... anyway) the scale factor of the x-axis is the
> > length of the x-axis in the matrix (in my case the first
> > row),

> No. For an orthogonal matrix, the scale is always 1.0, by definition
> of 'orthogonal' (note: there's a difference between the meanings of
> 'orthogonal' for matrices and for sets of basis vectors, and the
> qualifier 'orthonormal' doesn't exist for matrices).

"Numerical Recipes" in the eigenvalue chapters has a lot of
orthonormal words in it when they speak about matrices. I
always thought that orthonormal means that each axis in the
matrix has 90 degree angle to it's neighbourn axii. Anyway I
was referring to that situation.

> What you need are the eigenvalues of your matrices. You get them as
> solutions to the following equation:

>         det ( M - lambda * identity_matrix ) = 0

> For a 3x3 matrix (translation is irrelevant, here), you end up with a
> 3rd degree polynomial equation in lambda. The solutions of this
> equation are the three eigenvalues of matrix M (two of which may be
> complex, and conjugates of each other). For an orthogonal matrix,
> they're all of absolute value 1, but in your case, this will differ.

I'm working in 2D, so couldn't I use a 2x2 matrix then?

> For uniform scaling by a factor 'f', you'll find all eigenvalues of
> absolute value 'f'. For non-uniform scaling, you get three different
> eigenvalues. I'll leave it up to you what the right single number to
> extract from this would then be.

This eigenvalue (and eigenvector) stuff is all new to me.
Fortunately "Numerical Recipes" has a pretty good
introduction to eigenvalues, and I'm also able to follow the
eigenvalue solvers in the C-language version of the book
(it's online too: http://www.nr.com).

Are you suggesting, in the last sentence of your last
paragraph, that I should just drop all the complex
eigenvalues? :) If not, I'm not sure how to handle the
situation (yep... my complex number math is rusty).

Also, the eigenvalue texts speak a lot about symmetric
matrices. As far as I understand the definition, matrices
which have skewing and non-uniform scaling are not symmetric
(my case). Does that mean that in my case I can only solve
the eigenvalues using the Hessenberg stuff or by solving the
polynomial directly (if it is 2nd degree, maybe that's not
too complicated)? That is, I cannot use all the fancy and
fast solvers (Jacobi, Givens/Householder)?

This solution may be the most elegant one, but new math
always scares the hell out of me :)

--memon

### Extracting scale from a 2D matrix

[...]

[orthogonal vs. orthonormal]

Quote:> "Numerical Recipes" in the eigenvalue chapters has a lot of
> orthonormal words in it when they speak about matrices. I
> always thought that orthonormal means that each axis in the
> matrix has 90 degree angle to it's neighbourn axii. Anyway I
> was referring to that situation.

That's nonstandard terminology. For matrices, all LinAlg references
I've seen define orthogonal as meaning

M * M^T = identity

For sets of basis vectors, "orthogonal" means

v_i . v_j = 0  whenever i != j

whereas "orthonormal" also requires

v_i . v_i = 1

I.e. a set of orthonormal basis vectors, composed into a square
matrix, makes an orthogonal matrix. Confusing, isn't it?
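The matrix definition is easy to check numerically; a minimal C sketch for the 2x2 case (row-vector convention, name and tolerance handling are mine):

```c
#include <math.h>

/* Check M * M^T = identity for the 2x2 matrix [[a, b], [c, d]],
 * i.e. the rows are unit length and mutually perpendicular.
 * eps is a numerical tolerance. */
static int is_orthogonal(double a, double b, double c, double d, double eps)
{
    return fabs(a * a + b * b - 1.0) < eps    /* row0 . row0 = 1 */
        && fabs(c * c + d * d - 1.0) < eps    /* row1 . row1 = 1 */
        && fabs(a * c + b * d) < eps;         /* row0 . row1 = 0 */
}
```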

>> What you need are the eigenvalues of your matrices. You get them as
>> solutions to the following equation:

>>         det ( M - lambda * identity_matrix ) = 0

[...]

> I'm working in 2D, couldn't I use a 2x2 matrix then?

Of course. I had forgotten the subject line already when I wrote that :-(

> Are you suggesting in the last sentence of your last
> paragraph that I should just drop out all the complex
> eigenvalues? :)

No. I'm saying you should only look at the absolute values of the
numbers, no matter whether they're complex or not.

[...]

> I can only solve the eigenvalues using the Hessenberg
> stuff or by solving the polynomial (if it is 2nd degree,
> maybe it's not that complicated)? That is, I cannot use all
> the fancy and fast solvers (Jacobi, Givens/Householder)?

For a mere 2x2, solving the polynomial is going to be faster than any
of the more fancy methods. Those are for larger matrices, mainly.  In
particular, for any matrix larger than 4x4 you cannot solve the
polynomial analytically anymore, so you *need* iterative numerical
solutions to find the eigenvalues.
--

Even if all the snow were burnt, ashes would remain.

### Extracting scale from a 2D matrix

> For a mere 2x2, solving the polynomial is going to be faster than any
> of the more fancy methods. Those are for larger matrices, mainly.  In
> particular, for any matrix larger than 4x4 you cannot solve the
> polynomial analytically anymore, so you *need* iterative numerical
> solutions to find the eigenvalues.

Ok, I think I can figure out the rest, so thanks again for
the help.

--memon

### Extracting scale from a 2D matrix

I am in need of an algorithm to extract scaling and rotation factors
from a transformation matrix. I've implemented two such algorithms from
Graphics Gems II (page 322), but have only had success with the rotation
extraction algorithm (I can't get the shear/scaling algorithm to work).
Would anyone be willing to supply some C source or references? Thanks.

--> Rob Lansdale

--
Robert Lansdale - (416) 978-6619       Dynamic Graphics Project

UUCP:   ..!uunet!dgp.toronto.edu!lansd University of Toronto