[Insight-developers] RE: Kernel transforms. K symmetric?

Miller, James V (Research) millerjv@crd.ge.com
Thu, 12 Dec 2002 14:38:44 -0500


Actually, I was thinking of the optimization more in terms of constructing 
the K matrix, which calls ComputeG() for every pairing of landmarks.
If all the kernels are symmetric, then G for p1->p2 should be the
same as G for p2->p1 and we can save "some" computation.
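As a rough sketch of that saving, the loop below fills only the upper triangle and mirrors it, so a symmetric kernel costs n(n+1)/2 evaluations instead of n^2. The scalar kernel U(r) = r (the 3-D thin plate spline kernel) is assumed purely for illustration; ITK's ComputeG() actually returns a d x d matrix.

```python
import math

def compute_g(p, q):
    # illustrative scalar TPS kernel U(r) = r in 3-D;
    # symmetric because |p - q| == |q - p|
    return math.dist(p, q)

def build_k(points):
    n = len(points)
    K = [[0.0] * n for _ in range(n)]
    evals = 0
    for i in range(n):
        for j in range(i, n):            # evaluate upper triangle only
            K[i][j] = compute_g(points[i], points[j])
            K[j][i] = K[i][j]            # mirror into the lower triangle
            evals += 1
    return K, evals

pts = [(0, 0, 0), (1, 0, 0), (0, 2, 0), (3, 3, 3)]
K, evals = build_k(pts)                  # 10 evaluations for 4 landmarks, not 16
```

Note the diagonal entries come out zero here, which is the property discussed below for the thin plate and volume splines.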

This is a side issue for what I am going to do. The code currently
calculates K as 

G(p1, p1)  G(p1, p2)  G(p1, p3)  ...
G(p2, p1)  G(p2, p2)  G(p2, p3)  ...

For the thin plate spline, and it looks like for the volume spline
as well, the diagonal of the K matrix is zero.

If we replace the diagonal with some other constant, lambda, then
the splines are no longer forced to interpolate the landmarks but
are instead allowed to approximate them.  The lambda term is a
standard Lagrange multiplier that trades off fidelity to the data
against a smoothness penalty.
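A toy 1-D example of that trade-off: solve (K with lambda on the diagonal) w = y for the weights, then evaluate the spline back at the landmarks with the original zero-diagonal kernel. With lambda = 0 the data are reproduced exactly; with lambda > 0 they are only approximated. The kernel g(x, y) = |x - y| is assumed for illustration, and the affine part of a real thin plate spline is omitted.

```python
def solve2(a, b, c, d, e, f):
    # solve [[a, b], [c, d]] @ [w0, w1] = [e, f] by Cramer's rule
    det = a * d - b * c
    return ((e * d - b * f) / det, (a * f - e * c) / det)

x = [0.0, 1.0]
y = [2.0, 4.0]
g01 = abs(x[0] - x[1])               # off-diagonal kernel value

def fit(lam):
    # weights from the regularized system: diagonal = lambda
    w0, w1 = solve2(lam, g01, g01, lam, y[0], y[1])
    # spline values at the landmarks use the zero-diagonal kernel
    return [g01 * w1, g01 * w0]

exact = fit(0.0)                     # interpolates: reproduces y
smooth = fit(0.5)                    # approximates: deviates from y
```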

This can be generalized further so that, instead of the same constant
along the diagonal, each dxd block on the diagonal can be set to
reflect a confidence weight (or covariance) for the particular landmark.
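Schematically, for the (Nxd)x(Nxd) form of K that looks like the following; the helper name and the interpretation of the covariance values are illustrative, not ITK API.

```python
d = 2                                    # dimension of each landmark

def set_diagonal_blocks(K, covs):
    # covs[i] is the d x d confidence/covariance block for landmark i
    for i, cov in enumerate(covs):
        for r in range(d):
            for c in range(d):
                K[i * d + r][i * d + c] = cov[r][c]
    return K

N = 2
K = [[0.0] * (N * d) for _ in range(N * d)]
covs = [[[1.0, 0.0], [0.0, 1.0]],        # tighter covariance: trusted landmark
        [[5.0, 1.0], [1.0, 5.0]]]        # larger covariance: less certain landmark
K = set_diagonal_blocks(K, covs)
```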

The result is a weighted approximating thin plate (or other) spline.  I'd need
to look at the EBS more closely to determine whether the same can be done for it.

So I am thinking of adding a method that computes G(pi, pi), which by default
just calls the generic G(pi, pj) and can be overridden in a subclass.
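In outline, the hook might look like this; the class and method names are hypothetical, chosen only to mirror the proposal, and the scalar kernel again stands in for ITK's matrix-valued ComputeG().

```python
import math

class KernelTransform:
    def compute_g(self, p, q):
        raise NotImplementedError

    def compute_g_diagonal(self, p):
        # default: identical to the off-diagonal kernel evaluated at (p, p)
        return self.compute_g(p, p)

class ThinPlateSpline(KernelTransform):
    def compute_g(self, p, q):
        return math.dist(p, q)           # U(r) = r in 3-D; zero at p == q

class ApproximatingThinPlateSpline(ThinPlateSpline):
    def __init__(self, lam):
        self.lam = lam

    def compute_g_diagonal(self, p):
        return self.lam                  # override: relax exact interpolation

tps = ThinPlateSpline()
approx = ApproximatingThinPlateSpline(0.5)
```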

In ITK's implementation, the K matrix is (Nxd)x(Nxd).  I have some other 
thin plate spline code where the K matrix is only NxN. While this latter code
allows me to implement approximating splines, I can only weight each landmark
by a single scalar; in other words, I cannot put a landmark's covariance matrix
along the diagonal.  So while ITK's implementation uses more memory, it is more
general.  I was wondering whether it would be worth it for the subclasses of 
KernelTransform to specify the number of "degrees of freedom" in the kernel.  The
elastic body spline would have d degrees of freedom in the kernel, and the 
thin plate and volume splines would have only 1 degree of freedom.  So the former
would need the (Nxd)x(Nxd) space for the K matrix while the latter two splines
would (usually) only need NxN space for the K matrix.
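To make the storage difference concrete: a 1-degree-of-freedom kernel's (Nxd)x(Nxd) matrix is just each scalar entry times a d x d identity block, so only the NxN scalar part needs storing, while a matrix-valued kernel needs the full block matrix. The function below (an illustrative sketch, not ITK code) counts stored entries for each case.

```python
def k_entries(n, d, kernel_dof):
    # kernel_dof == 1: scalar kernel (TPS, volume spline), N x N suffices
    # kernel_dof == d: matrix-valued kernel (EBS), (N*d) x (N*d) required
    if kernel_dof == 1:
        return n * n
    return (n * d) ** 2

n, d = 100, 3
tps_entries = k_entries(n, d, 1)         # scalar kernel: 100 x 100
ebs_entries = k_entries(n, d, d)         # matrix kernel: 300 x 300
```

For 100 landmarks in 3-D, the scalar-kernel form stores d*d = 9 times fewer entries.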

Jim



> -----Original Message-----
> From: Luis Ibanez [mailto:luis.ibanez@kitware.com]
> Sent: Thursday, December 12, 2002 2:18 PM
> To: Miller, James V (Research)
> Cc: Insight-developers (E-mail)
> Subject: Re: Kernel transforms. K symmetric?
> 
> 
> Hi Jim,
> 
> The Kernel matrix is denoted "G" in the code
> 
> Here is what I can see:
> 
> - ElasticBodySpline            is symmetric
> - ElasticBodyReciprocalSpline  is symmetric
> - VolumeSpline                 is diagonal
> - ThinPlateSpline              is diagonal
> - ThinPlateR2LogSpline         is diagonal
> 
> 
> The ComputeG( vector ) method is virtual
> in KernelTransform. It is overloaded in
> every particular KernelSplineTransform.
> 
> The method is only invoked in
> 
>     ComputeDeformationContribution()
> 
> Here seems to be the place where we could
> take advantage of the matrix being symmetric
> or diagonal in order to reduce the number
> of operations.
> 
> ComputeDeformationContribution() is, however,
> already overloaded in:
> 
> - ThinPlateR2LogSpline
> - VolumeSpline
> - ThinPlateSpline
> 
> in order to take advantage of the diagonal property.
> 
> It may then be a matter of overloading
> this method in ElasticBodySpline and
> ElasticBodyReciprocalSpline in order to
> take advantage of the symmetry.
> 
> 
> Thanks for improving this code.
> 
> 
>     Luis
> 
> 
> ----------------------------
> 
> Miller, James V (Research) wrote:
> > Luis,
> > 
> >  
> > 
> > Is the K matrix of the various kernel transforms (thin plate,
> > elastic body, volume spline) always symmetric?
> > 
> >  
> > 
> > I think it is for the TPS.  If it is always symmetric, I am
> > going to rewrite it so that it only evaluates the upper
> > triangular part of K and copies the values into the lower
> > triangle.
> > 
> >  
> > 
> > Jim   
> > 
> >  
> >
> 
> 
>