Monday, September 22, 2008

Free Form Deformation

Read and study:

1. FFD:
Sederberg's paper (1986)
http://www.gamasutra.com/view/feature/3372/realtime_softobject_animation_.php
http://web.cs.wpi.edu/~matt/courses/cs563/talks/freeform/free_form.html
http://www.cs.unc.edu/~hoff/projects/comp239/finalproj/ffd/ffd.html

2. Redbook: chapter 13

3. Polynomial, binomial??? <-- What is this??? FIND OUT!
http://en.wikipedia.org/wiki/Binomial_theorem
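For the record, this is where the binomial theorem connects to FFD: Sederberg's formulation uses Bernstein polynomials as blending functions, and their coefficients are binomial coefficients C(n, i). A quick Python sketch (function name is mine):

```python
from math import comb  # comb(n, i) is the binomial coefficient C(n, i)

def bernstein(n, i, t):
    # Bernstein basis polynomial B_{i,n}(t) = C(n, i) * t^i * (1 - t)^(n - i).
    # The C(n, i) factor is exactly the binomial coefficient from the
    # binomial theorem; FFD blends lattice control points with these.
    return comb(n, i) * t ** i * (1 - t) ** (n - i)

# By the binomial theorem, (t + (1 - t))^n = 1, so the basis sums to 1:
t = 0.3
total = sum(bernstein(3, i, t) for i in range(4))
print(abs(total - 1.0) < 1e-9)  # True
```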

NOTE:

Redbook: TOC


OpenGL Matrix
-- http://www.morrowland.com/apron/tutorials/gl/gl_matrix.php

-- http://lists.apple.com/archives/mac-opengl/2000/Dec/msg00036.html

-- http://gpwiki.org/index.php/OpenGL:Tutorials:Theory

-- http://qa.techinterviews.com/q/20060803025430AAvfBIl

-- http://www.sjbaker.org/steve/omniv/matrices_can_be_your_friends.html


Those darned Euclidean spaces:

http://www.euclideanspace.com/threed/rendering/opengl/index.htm

GL transformation:

http://www.opengl.org/resources/faq/technical/transformations.htm

Linear algebra:

http://groups.csail.mit.edu/graphics/classes/6.837/F04/lectures/linear_algebra.ppt

Some random GL stuff:

http://bcook.cs.georgiasouthern.edu/eclass/cs/lesson8.htm




All these and then some:
vectors, matrices, and then dot product, cross product,
a little trig (sin(), cos(), tan()), polar coordinates and Cartesian
coordinates. I'm sure I've left something out. It's not too bad. For me,
it's been just sheer determination. Oh, and understanding how
OpenGL lays out its matrices.
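On that last point: OpenGL stores a 4x4 matrix as a flat array of 16 floats in column-major order, so element (row r, column c) lives at index c*4 + r and a translation ends up in indices 12-14. A sketch of just the indexing (helper names are mine, not GL calls):

```python
# OpenGL stores a 4x4 matrix as 16 floats in column-major order:
# element (row r, column c) lives at index c*4 + r.
def idx(r, c):
    return c * 4 + r

# Build an identity matrix, then put a translation (tx, ty, tz) in it.
# In column-major storage the translation lands at indices 12, 13, 14 --
# the last column of the matrix, i.e. the tail of the flat array.
def translation(tx, ty, tz):
    m = [0.0] * 16
    for i in range(4):
        m[idx(i, i)] = 1.0
    m[idx(0, 3)] = tx   # index 12
    m[idx(1, 3)] = ty   # index 13
    m[idx(2, 3)] = tz   # index 14
    return m

print(translation(1.0, 2.0, 3.0)[12:15])  # [1.0, 2.0, 3.0]
```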

dot product = scalar product
cross product = vector product



(1) A dot product is an example of an inner product, and is
sometimes called the "standard inner product". If U and V
are n-by-1 vectors, then the dot product is Transpose(U)*V.
If M is a positive definite matrix, then Transpose(U)*M*V is
also an inner product. In index summation notation, if the
components of U are named u_i and the components of V
are named v_j, the dot product is u_i v_i. The repeated
index means sum over it, so
u_i v_i = u_1 v_1 + u_2 v_2 + ... + u_n v_n.
If M has entries M_{ij}, then an inner product is
u_i M^{ij} v_j. The repeated i and repeated j mean sum over
both. If M is the identity matrix, you get the dot product.
For other M, you do not. This concept shows up in conjugate
gradient methods when searching for minimum values of
functions.
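Transcribing (1) directly into Python (function names are mine); with M the identity, the general inner product collapses to the plain dot product:

```python
# Dot product in index-summation form: u_i v_i, summed over the repeated i.
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# General inner product u_i M_ij v_j: sum over both repeated indices.
def inner(u, m, v):
    n = len(u)
    return sum(u[i] * m[i][j] * v[j] for i in range(n) for j in range(n))

u, v = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0]
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(dot(u, v))               # 32.0
print(inner(u, identity, v))   # 32.0 -- identity M reduces to the dot product
```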

(2) A cross product is one example of an outer product, but
the term outer product is quite general in the world of tensors.
The permutation tensor is e_{ijk}, which is 1 if (i,j,k) is an
even permutation of (1,2,3), -1 if (i,j,k) is an odd permutation of
(1,2,3), and is 0 otherwise (when an index occurs twice). For
example, e_{123} = 1, e_{132} = -1, and e_{112} = 0. The
cross product of U and V is vector W, where n = 3 and
w_i = e_{ijk} u_j v_k
The j is repeated, so you sum over it. The k is repeated, so
you sum over it. The index i is "free" to vary, so the end result
is a singly indexed quantity (a vector).
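Transcribing (2) literally, with 0-based indices so the permutation tensor runs over (0,1,2):

```python
def e3(i, j, k):
    # Permutation tensor e_{ijk} over 0-based indices (0, 1, 2):
    # +1 for even permutations, -1 for odd, 0 if any index repeats.
    if (i, j, k) in ((0, 1, 2), (1, 2, 0), (2, 0, 1)):
        return 1
    if (i, j, k) in ((0, 2, 1), (2, 1, 0), (1, 0, 2)):
        return -1
    return 0

def cross(u, v):
    # w_i = e_{ijk} u_j v_k: sum over the repeated j and k, i stays free.
    return [sum(e3(i, j, k) * u[j] * v[k]
                for j in range(3) for k in range(3))
            for i in range(3)]

print(cross([1.0, 0.0, 0.0], [0.0, 1.0, 0.0]))  # [0.0, 0.0, 1.0]
```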

In 2D, a vector perpendicular to U is vector W,
w_i = e_{ij} u_j
where e_{12} = 1, e_{21} = -1, e_{11} = e_{22} = 0. The
index j is repeated, so you sum over it. The "dot perp" of
U and V is a scalar
d = e_{ij} u_i v_j
Sums occur over i and j.
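The same game in 2D, again with 0-based indices (function names are mine):

```python
def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

def perp(u):
    # w_i = e_{ij} u_j: w = (u_1, -u_0), perpendicular to u.
    return [u[1], -u[0]]

def dot_perp(u, v):
    # d = e_{ij} u_i v_j = u_0 v_1 - u_1 v_0,
    # the scalar 2D analogue of the cross product.
    return u[0] * v[1] - u[1] * v[0]

u = [3.0, 4.0]
print(dot(perp(u), u))                    # 0.0 -- perp really is perpendicular
print(dot_perp([1.0, 0.0], [0.0, 1.0]))  # 1.0
```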

In 4D, a vector perpendicular to U, V, and R is vector W with
w_i = e_{ijkm} u_j v_k r_m
The rules for e_{ijkm} being 1, -1, or 0 are similar to other
dimensions. Sums occur over j, k, m. Index i is free to vary.
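And the 4D version, with the sign of e_{ijkm} computed by counting inversions (helper names are mine):

```python
def eps4(p):
    # e_{ijkm} over 0-based indices: 0 if any index repeats, otherwise
    # +1/-1 according to the parity of the permutation (inversion count).
    if len(set(p)) != 4:
        return 0
    sign = 1
    for a in range(4):
        for b in range(a + 1, 4):
            if p[a] > p[b]:
                sign = -sign
    return sign

def perp4(u, v, r):
    # w_i = e_{ijkm} u_j v_k r_m: sum over j, k, m; i stays free.
    return [sum(eps4((i, j, k, m)) * u[j] * v[k] * r[m]
                for j in range(4) for k in range(4) for m in range(4))
            for i in range(4)]

# Perpendicular to three of the standard basis vectors:
w = perp4([0.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 1.0])
print(w)  # [1.0, 0.0, 0.0, 0.0]
```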

Another outer product of two n-by-1 vectors U and V is
U*Transpose(V), which is an n-by-n matrix. The outer
product using indices is u_i v_j. No index is repeated, so
there are no sums (n choices for i, n choices for j, total
number of choices is n*n). This concept shows up
in projection matrices. If N is a unit-length normal vector
for a plane through the origin, the projection matrix onto the
plane is P = I - N*Transpose(N).
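Finally, the matrix outer product and the projection matrix it shows up in (a sketch; names are mine):

```python
def outer(u, v):
    # (U * Transpose(V))_{ij} = u_i v_j: no repeated index, so no sums --
    # n*n entries, one per (i, j) pair.
    return [[ui * vj for vj in v] for ui in u]

def projection(n):
    # P = I - N * Transpose(N), for a unit-length plane normal N.
    nnT = outer(n, n)
    return [[(1.0 if i == j else 0.0) - nnT[i][j] for j in range(3)]
            for i in range(3)]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

# Project onto the z = 0 plane: the component along the normal is removed.
P = projection([0.0, 0.0, 1.0])
print(mat_vec(P, [2.0, 3.0, 5.0]))  # [2.0, 3.0, 0.0]
```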
