## Computing transform of plane given transform of points defining plane

### Computing transform of plane given transform of points defining plane

I'm wondering if anyone has some ideas on approaching this problem:

Suppose you have a plane passing through three 3d points, a b & c,
which form a triangle in that plane.  So then you have the plane
normal n = (b-a) X (c-a) / ||(b-a) X (c-a)||, and the plane in 4d
homogeneous coords is [n_x, n_y, n_z, -n dot a], where n_x is the x
coordinate of the normal, and so on.  Now suppose a b & c are
transformed by matrices A B & C, and the new points are a' = A*a, b' =
B*b and c' = C*c.  (And A B & C could be limited to orthogonal
matrices, if necessary).  These three points may define a new plane.
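The setup above can be sketched numerically (a Python/NumPy sketch; the sample points and function name are illustrative):

```python
import numpy as np

def plane_from_points(a, b, c):
    """Homogeneous plane [n_x, n_y, n_z, -n dot a] through points a, b, c."""
    n = np.cross(b - a, c - a)
    n = n / np.linalg.norm(n)            # unit normal
    return np.append(n, -np.dot(n, a))   # [n_x, n_y, n_z, -n.a]

# Example: three points on the plane z = 1 -> coefficients [0, 0, 1, -1]
a = np.array([0.0, 0.0, 1.0])
b = np.array([1.0, 0.0, 1.0])
c = np.array([0.0, 1.0, 1.0])
print(plane_from_points(a, b, c))
```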

My question is, how can we use this information to determine the
matrix representing the transformation of the original plane (in 4d
homogeneous form) into the new plane?

Thanks for any help!

Chris DeCoro

### Computing transform of plane given transform of points defining plane

Quote:>My question is, how can we use this information to determine the
>matrix representing the transformation of the original plane (in 4d
>homogeneous form) into the new plane?

"Subject 5.27: How do I transform normals?"
<http://www.faqs.org/>

### Computing transform of plane given transform of points defining plane

>>My question is, how can we use this information to determine the
>>matrix representing the transformation of the original plane (in 4d
>>homogeneous form) into the new plane?
>   "Subject 5.27: How do I transform normals?"
>   <http://www.faqs.org/>

I'm quite certain that won't help him, because he's not using a single
matrix to transform all vertices by, in the first place.
--

Even if all the snow were burnt, ashes would remain.

### Computing transform of plane given transform of points defining plane

Quote:> Suppose you have a plane passing through three 3d points, a b & c,
> which form a triangle in that plane.  So then you have the plane
> normal n = (b-a) X (c-a) / ||(b-a) X (c-a)||, and the plane in 4d
> homogeneous coords is [n_x, n_y, n_z, -n dot a], where n_x is the x
> coordinate of the normal, and so on.  Now suppose a b & c are
> transformed by matrices A B & C, and the new points are a' = A*a, b' =
> B*b and c' = C*c.  (And A B & C could be limited to orthogonal
> matrices, if necessary).  These three points may define a new plane.

Allowing different matrices to be used for each point means that a'
and friends are essentially completely random points with no relation
to each other or to a, b and c.   You could still formally compute
what the new plane coefficients would be, in terms of the old ones and
the three matrices, of course, but just redoing the calculations from
scratch using a', b' and c' is bound to be faster.

--

Even if all the snow were burnt, ashes would remain.

### Computing transform of plane given transform of points defining plane

On 4 Jul 2003 09:59:39 GMT, Hans-Bernhard Broeker wrote:

>I'm quite certain that won't help him, because he's not using a single
>matrix to transform all vertices by, in the first place.

It depends on what you think the question is. Of course what I ought
to say is "WHY?!". But assume the question means find a matrix that
transforms the plane of a, b, c into the plane of a', b', c'; and all
the mess about A, B, and C is just to indicate we're not given an
explicit point-to-point transform. Certainly, as you remark, those
matrices are essentially useless. Nevertheless we can use a rigid
motion, a rotation with a translation, to handle the plane transform.

I admit the FAQ topic I quoted isn't that much help even so.

But before I spend time solving the wrong problem, I'd like OP to tell
us what this is really about. What's the bigger problem this is a part
of, because it's an unusual question.

### Computing transform of plane given transform of points defining plane

> On 4 Jul 2003 09:59:39 GMT, Hans-Bernhard Broeker wrote:

> >I'm quite certain that won't help him, because he's not using a single
> >matrix to transform all vertices by, in the first place.

Right, each vertex is transformed using a different matrix.

Quote:> But assume the question means find a matrix that
> transforms the plane of a, b, c into the plane of a', b', c'; and all
> the mess about A, B, and C is just to indicate we're not given an
> explicit point-to-point transform.

I'm sorry for not being clear.  Yes, I am looking for the matrix that
transforms the plane of a b c into the plane of a' b' c'.

Quote:> But before I spend time solving the wrong problem, I'd like OP to tell
> us what this is really about. What's the bigger problem this is a part
> of, because it's an unusual question.

Essentially, it is related to skinning of skeletally articulated
meshes.  Each vertex is affected by a set of bone transforms, each
represented by a matrix; these are weighted and summed to form a
transformation matrix for the vertex itself.  These are the A B and C
matrices from my original post.
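The per-vertex matrix described above (linear blend skinning) might be sketched like this, assuming NumPy; the function name and weights are illustrative:

```python
import numpy as np

def skin_vertex(vertex, bone_matrices, weights):
    """Blend the 4x4 bone matrices by the vertex's weights, then apply
    the blended matrix to the vertex in homogeneous coordinates."""
    M = sum(w * B for w, B in zip(weights, bone_matrices))
    return (M @ np.append(vertex, 1.0))[:3]
```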

Now of course, for just skinning a mesh, you don't really care about
the plane itself, just transform the vertices and render the
corresponding triangles.  The reason I am looking for an answer to
this more complicated question has to do with some ideas I have for
analyzing properties of deformable meshes.  One thing that I would
like to be able to do is, given a point relative to the local
coordinate system of a deformed triangle/plane, transform it back into
the coordinate system of the plane in the reference model, which can
then be easily converted into world coordinates. For this, it would be
great to just have the matrix that transforms the reference plane into
the deformed plane, which I can then invert.

Thanks to everyone for taking a look at this.

Chris DeCoro

### Computing transform of plane given transform of points defining plane

We must assume linear independence of a, b, c, and of a', b', c'. We
can certainly find a linear transform that maps the latter to the
former, but have no direct control of the direction normal to the
planes. Here's an easy way. First compute normals n and n' using the
usual cross product, namely n=(b-a)x(c-a). Writing the points in
homogeneous coordinates (w = 1) and the normals as direction vectors
(w = 0), you want to solve the 4x4 matrix equation

M [a' b' c' n'] = [a b c n]

Since we have assumed (or constructed) linear independence, it's just

M = [a b c n] ([a' b' c' n']^-1)
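Reading the columns as homogeneous coordinates (points with w = 1, normals as directions with w = 0), the construction might be implemented like this (a Python/NumPy sketch; the names are illustrative):

```python
import numpy as np

def homog(v, w):
    """Append a homogeneous coordinate: w = 1 for points, w = 0 for directions."""
    return np.append(v, w)

def plane_transform(a, b, c, a2, b2, c2):
    """4x4 M solving M [a2 b2 c2 n2] = [a b c n], i.e. mapping the
    deformed triangle (a2, b2, c2) back onto the reference one."""
    n  = np.cross(b - a, c - a)
    n2 = np.cross(b2 - a2, c2 - a2)
    P  = np.column_stack([homog(a, 1.0),  homog(b, 1.0),
                          homog(c, 1.0),  homog(n, 0.0)])
    P2 = np.column_stack([homog(a2, 1.0), homog(b2, 1.0),
                          homog(c2, 1.0), homog(n2, 0.0)])
    return P @ np.linalg.inv(P2)
```

Note that this M already takes the deformed configuration back to the reference one (the "latter to the former" direction); swap the two point sets to get the matrix in the other direction.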

Quote:>Essentially, it is related to skinning of skeletally articulated
>meshes.  Each vertex is affected by a set of bone transforms, each
>represented by a matrix; these are weighted and summed to form a
>transformation matrix for the vertex itself.  These are the A B and C
>matrices from my original post.

>Now of course, for just skinning a mesh, you don't really care about
>the plane itself, just transform the vertices and render the
>corresponding triangles.  The reason I am looking for an answer to
>this more complicated question has to do with some ideas I have for
>analyzing properties of deformable meshes.  One thing that I would
>like to be able to do is, given a point relative to the local
>coordinate system of a deformed triangle/plane, transform it back into
>the coordinate system of the plane in the reference model, which can
>then be easily converted into world coordinates. For this, it would be
>great to just have the matrix that transforms the reference plane into
>the deformed plane, which I can then invert.

### Computing transform of plane given transform of points defining plane

> I'm sorry for not being clear.  Yes, I am looking for the matrix that
> transforms the plane of a b c into the plane of a' b' c'.

This is still somewhat ambiguous.  There's an infinitude of matrices
that transform one plane onto the other.  But in the usual case, only
a small subset of those will also transform a to a', b to b' and c to
c' while at it.  Just's reply shows how to construct one
representative of that subset.

Quote:> For this, it would be great to just have the matrix that transforms
> the reference plane into the deformed plane, which I can then
> invert.

Please note that this matrix is not uniquely determined.  Your
problem statement does not describe what the matrix should do with
points outside the plane defined by a, b and c.  You need one more
reference criterion, e.g. that all vectors from a point in the plane
to a point outside the plane should maintain their length, and their
orientation to the plane.  In essence, this translates into the
additional condition Just inserted into his equations: that

M * n = n'
where
n := (b-a) x (c-a)
and     n' := (b'-a') x (c'-a')

This constraint may be natural for your case, or it may not be.  Only
you can really find out.
--

Even if all the snow were burnt, ashes would remain.

### Computing transform of plane given transform of points defining plane

Quote:> Please note that this matrix is not uniquely determined.  Your
> problem statement does not describe what the matrix should do with
> points outside the plane defined by a, b and c.  You need one more
> reference criterion, e.g. that all vectors from a point in the plane
> to a point outside the plane should maintain their length, and their
> orientation to the plane.  In essence, this translates into the
> additional condition Just inserted into his equations: that

>    M * n = n'
> where
>    n := (b-a) x (c-a)
> and        n' := (b'-a') x (c'-a')

> This constraint may be natural for your case, or it may not be.  Only
> you can really find out.

Yes, it seems that that constraint works for what I'm trying to do.  I
tried implementing this and it seems to work out.

Thank you both for your replies.

Chris DeCoro

Hello,

This is a math question, although it's directly related
to generating a focal effect in my 3D engine.

Having a plane defined by a given normal vector A, and a point
a0 (x,y,z) in that plane, is it possible to generate a circle
of points surrounding the point a0, with a constant radius,
all points lying in that plane?

I've tried using translation matrices - this works, but
is there an easier/other way to define this circle?
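One straightforward alternative (a Python/NumPy sketch; the helper-vector trick for picking an in-plane direction is a common convention, not from the post): build two orthonormal vectors u and v spanning the plane and parametrize the circle directly instead of composing matrices:

```python
import numpy as np

def circle_in_plane(normal, center, radius, count=16):
    """Points on a circle of `radius` around `center`, lying in the
    plane through `center` perpendicular to `normal`."""
    n = normal / np.linalg.norm(normal)
    # Any vector not (nearly) parallel to n yields an in-plane
    # direction via the cross product.
    helper = (np.array([0.0, 1.0, 0.0]) if abs(n[0]) > 0.9
              else np.array([1.0, 0.0, 0.0]))
    u = np.cross(n, helper)
    u /= np.linalg.norm(u)
    v = np.cross(n, u)      # n, u, v are mutually orthogonal unit vectors
    t = np.linspace(0.0, 2.0 * np.pi, count, endpoint=False)
    return center + radius * (np.outer(np.cos(t), u) + np.outer(np.sin(t), v))
```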

--
.bitto http://www.ifi.uio.no/~andreaha