[Rtk-users] Model-based image reconstruction based on the RTK modules

Cyril Mory cyril.mory at creatis.insa-lyon.fr
Fri Nov 4 02:31:18 EDT 2016


Hi Vahid,

It's becoming a bit unclear to me, but I don't think you want to perform a 
dot product between a matrix and a vector. My advice: do the math, turn the 
equations into a pipeline (that's the tricky part), and try to reuse pieces 
of the pipelines of existing RTK filters. I don't recommend the 
rtkConjugateGradientFilter for this.
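
To spell out the simplest case: for the classical least-squares cost 
C(f) = 0.5 * || R f - p ||^2, the gradient is R^T (R f - p), i.e. one forward 
projection, one subtraction of the measured projections, and one 
backprojection, all of which exist as RTK filters.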

Then run your pipeline with known data and, at every step where you have 
a known reference (e.g. from your Python code), compare the intermediate 
image you get with that reference.
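A minimal sketch of such a comparison, assuming the intermediate image has 
been written to disk as an .mha file and the reference is a NumPy array saved 
from your Python code (both file names are hypothetical):

import numpy as np
import SimpleITK as sitk  # one convenient way to read .mha files from Python

# Intermediate image written out by the RTK pipeline, and the reference
# computed independently with your own Python code.
pipeline_img = sitk.GetArrayFromImage(sitk.ReadImage("intermediate_step.mha"))
reference = np.load("reference_step.npy")

print("max abs difference:", np.abs(pipeline_img - reference).max())
print("within tolerance:", np.allclose(pipeline_img, reference, rtol=1e-4, atol=1e-6))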

As for the Python wrapping: I personally have no experience with writing 
Python wrappings for RTK. It is supposed to be straightforward, but only for 
full RTK filters; you cannot wrap part of a filter. So you would have to 
create your own filter, and only then wrap it in Python. If I'm 
mistaken, please, RTK users, do correct me :)

Regards,

Cyril

On 11/03/2016 09:10 PM, vahid ettehadi wrote:
> OK, good. Thank you, Cyril, for clarifying this for me. To implement 
> Newton's methods, instead of the multiplication of R with R^T we need 
> the product of the Jacobian/sensitivity matrix (R here) with the vector 
> variable (here x), and the gradient vector (here the product of R^T 
> with -r). I think it is possible to implement the Newton optimization 
> methods with the current modules, since we could extract the product of 
> R with f and the product of R^T with r. Am I right?
> About the displayed algorithm, I think the gradient vector is the 
> product of the matrix A (the Jacobian) with the negative residual 
> vector (-r).
>
> One more question:
> I have already developed my code in Python. Do you think it would be 
> easy to wrap your modules in Python? I have no experience in this subject.
>
> Regards,
> Vahid
>
>
> On Thursday, November 3, 2016 2:23 AM, Cyril Mory 
> <cyril.mory at creatis.insa-lyon.fr> wrote:
>
>
> Hello Vahid,
> Thank you for this insight on Newton's methods. Yes, the output of the 
> backprojection filter, in SART, is the gradient of the cost function. 
> You may have noticed that the pipeline performs a division of the 
> forward projection by something coming from 
> "RayBoxIntersectionFilter". This is to normalize the forward 
> projection, so that R^T R f ~= blurry f. If you don't do it, you'll 
> have R^T R f ~= alpha * blurry f, with an alpha that can reach 200 or so, 
> and your algorithm will quickly diverge.
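> Schematically, one such normalized correction step looks like this (a rough 
> Python sketch only, not the actual RTK code; forward_project, back_project 
> and ray_lengths are hypothetical placeholders for R, R^T and the output of 
> the ray-box intersection filter, and p, f, lam are the projections, the 
> volume and the relaxation factor):
>
> import numpy as np
>
> residual = p - forward_project(f)              # p - R f
> # divide-or-zero-out by the intersection length of each ray with the volume
> normalized = np.divide(residual, ray_lengths,
>                        out=np.zeros_like(residual), where=ray_lengths > 0)
> f = f + lam * back_project(normalized)         # relaxed, normalized update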
> You could try to extract the gradient from the conjugate gradient 
> filter as well, but it is significantly trickier: first, the CG 
> filter was implemented from the following algorithm, taken from 
> Wikipedia (which embeds the normalization I was mentioning earlier).
> In this algorithm, it is not clear to me what exactly the gradient of 
> the cost function is. I would say it is something like "- r_k", but 
> I'm not sure. Second, as you see, the CG filter needs an operator A, 
> which may differ from one problem to another, so this operator is 
> implemented in a separate filter, which in your case would be 
> rtkReconstructionConjugateGradientOperator, with the Laplacian 
> regularization parameter gamma set to 0.
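> For reference, here is a minimal matrix-free sketch of the textbook CG 
> iteration for A x = b (not the actual RTK filter code; apply_A is a 
> placeholder for the operator, which with gamma = 0 and ignoring the 
> weighting roughly amounts to f -> R^T R f, with b = R^T p):
>
> import numpy as np
>
> def conjugate_gradient(apply_A, b, x0, n_iter=10):
>     # Solve A x = b for a symmetric positive (semi-)definite operator A,
>     # given only a function that applies A to a vector.
>     x = x0.copy()
>     r = b - apply_A(x)   # note: -r is the gradient of 0.5 x^T A x - b^T x
>     d = r.copy()
>     for _ in range(n_iter):
>         Ad = apply_A(d)
>         alpha = np.dot(r, r) / np.dot(d, Ad)
>         x = x + alpha * d
>         r_new = r - alpha * Ad
>         beta = np.dot(r_new, r_new) / np.dot(r, r)
>         d = r_new + beta * d
>         r = r_new
>     return x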
> Note that we never actually store the system matrix R. Instead, the 
> interpolation coefficients it contains are re-computed on the fly 
> every time we forward project. And the same holds for backprojection, 
> i.e. the matrix R^T.
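> If you want to mirror that matrix-free behaviour on the Python side, one 
> option is scipy's LinearOperator (forward_project and back_project are 
> again hypothetical placeholders, and n_pix / n_vox are the numbers of 
> projection pixels and voxels):
>
> from scipy.sparse.linalg import LinearOperator
>
> # R is applied on demand and never stored as an explicit matrix.
> R = LinearOperator((n_pix, n_vox),
>                    matvec=lambda f: forward_project(f),   # R f
>                    rmatvec=lambda q: back_project(q))     # R^T q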
> Best,
> Cyril
>
> On 11/03/2016 03:38 AM, vahid ettehadi wrote:
>> Hello Simon and Cyril,
>> Thanks for the reply.
>> You are right, Simon, I had not noticed it in the literature either. The 
>> main problem, as you said, is the storage. Actually, I developed 
>> conjugate gradient (CG), quasi-Newton and Newton optimization methods 
>> for optical tomography, and I intended to apply them to CT 
>> reconstruction as well. I implemented the Newton methods 
>> (Gauss-Newton and Levenberg-Marquardt) in a 
>> Jacobian-free Newton-Krylov approach to avoid the matrix 
>> multiplication of the Jacobians (sensitivities). It means we only need 
>> to store the Jacobian matrix itself for these methods (the matrix R 
>> that Cyril mentioned), which is still a big matrix for practical 
>> problems in CT reconstruction. For the quasi-Newton method I adapted an 
>> L-BFGS algorithm that only needs the last 3 to 8 gradient vectors to 
>> approximate the Hessian matrix. In my case, the L-BFGS and Newton 
>> methods were much faster than CG because, as you know, they use 
>> second-order derivative information (the Hessian matrix). I saw in 
>> your last paper that you implemented the conjugate gradient method, so 
>> I thought it might be easy to extract the gradient vector from the CG 
>> modules and minimize the cost function with the quasi-Newton/Newton 
>> methods. I will look at the code to see what I can do.
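>> Roughly, the matrix-free idea looks like this in Python (a sketch only, 
>> with hypothetical forward_project / back_project placeholders for R and 
>> R^T, vol_shape and the measured projections p assumed given, and scipy's 
>> off-the-shelf L-BFGS-B used as a stand-in quasi-Newton solver):
>>
>> import numpy as np
>> from scipy.optimize import minimize
>>
>> def cost_and_grad(f_flat):
>>     # Least-squares cost 0.5*||R f - p||^2 and its gradient R^T (R f - p),
>>     # both evaluated matrix-free: R is never stored explicitly.
>>     f = f_flat.reshape(vol_shape)
>>     residual = forward_project(f) - p
>>     cost = 0.5 * np.sum(residual ** 2)
>>     grad = back_project(residual)
>>     return cost, grad.ravel()
>>
>> result = minimize(cost_and_grad, np.zeros(np.prod(vol_shape)), jac=True,
>>                   method="L-BFGS-B",
>>                   options={"maxcor": 8})  # keep only the last 8 gradient pairs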
>> Thanks again for the reply.
>>
>> @Cyril:
>> Please correct me if I am wrong: do you mean that the output of the 
>> backprojection filter is the gradient of the defined cost function?
>>
>> Regards,
>> Vahid
>>
>>
>> On Wednesday, November 2, 2016 2:53 AM, Cyril Mory 
>> <cyril.mory at creatis.insa-lyon.fr> wrote:
>>
>>
>> Hi Vahid,
>> Welcome to RTK :)
>> Indeed, there are several iterative methods already implemented in 
>> RTK, but none of the filters allows you to easily extract the 
>> gradient of the least-squares function they are minimizing.
>> If you need to minimize the classical non-regularized tomographic 
>> cost function, i.e. || R f - p ||², with R the forward projection 
>> operator, f the volume you are looking for, and p the measured 
>> projections, my best advice would be to copy part of the 
>> pipeline of rtkSARTConeBeamReconstructionFilter to get the job done, 
>> i.e. the following part (copy-paste this into webgraphviz.com):
>>
>> digraph SARTConeBeamReconstructionFilter {
>>
>> Input0 [ label="Input 0 (Volume)"];
>> Input0 [shape=Mdiamond];
>> Input1 [label="Input 1 (Projections)"];
>> Input1 [shape=Mdiamond];
>>
>> node [shape=box];
>> ForwardProject [ label="rtk::ForwardProjectionImageFilter" URL="\ref 
>> rtk::ForwardProjectionImageFilter"];
>> Extract [ label="itk::ExtractImageFilter" URL="\ref 
>> itk::ExtractImageFilter"];
>> MultiplyByZero [ label="itk::MultiplyImageFilter (by zero)" URL="\ref 
>> itk::MultiplyImageFilter"];
>> AfterExtract [label="", fixedsize="false", width=0, height=0, 
>> shape=none];
>> Subtract [ label="itk::SubtractImageFilter" URL="\ref 
>> itk::SubtractImageFilter"];
>> MultiplyByLambda [ label="itk::MultiplyImageFilter (by lambda)" 
>> URL="\ref itk::MultiplyImageFilter"];
>> Divide [ label="itk::DivideOrZeroOutImageFilter" URL="\ref 
>> itk::DivideOrZeroOutImageFilter"];
>> GatingWeight [ label="itk::MultiplyImageFilter (by gating weight)" 
>> URL="\ref itk::MultiplyImageFilter", style=dashed];
>> Displaced [ label="rtk::DisplacedDetectorImageFilter" URL="\ref 
>> rtk::DisplacedDetectorImageFilter"];
>> ConstantProjectionStack [ label="rtk::ConstantImageSource" URL="\ref 
>> rtk::ConstantImageSource"];
>> ExtractConstantProjection [ label="itk::ExtractImageFilter" URL="\ref 
>> itk::ExtractImageFilter"];
>> RayBox [ label="rtk::RayBoxIntersectionImageFilter" URL="\ref 
>> rtk::RayBoxIntersectionImageFilter"];
>> ConstantVolume [ label="rtk::ConstantImageSource" URL="\ref 
>> rtk::ConstantImageSource"];
>> BackProjection [ label="rtk::BackProjectionImageFilter" URL="\ref 
>> rtk::BackProjectionImageFilter"];
>> OutofInput0 [label="", fixedsize="false", width=0, height=0, shape=none];
>> OutofBP [label="", fixedsize="false", width=0, height=0, shape=none];
>> BeforeBP [label="", fixedsize="false", width=0, height=0, shape=none];
>> BeforeAdd [label="", fixedsize="false", width=0, height=0, shape=none];
>> Input0 -> OutofInput0 [arrowhead=none];
>> OutofInput0 -> ForwardProject;
>> ConstantVolume -> BeforeBP [arrowhead=none];
>> BeforeBP -> BackProjection;
>> Extract -> AfterExtract[arrowhead=none];
>> AfterExtract -> MultiplyByZero;
>> AfterExtract -> Subtract;
>> MultiplyByZero -> ForwardProject;
>> Input1 -> Extract;
>> ForwardProject -> Subtract;
>> Subtract -> MultiplyByLambda;
>> MultiplyByLambda -> Divide;
>> Divide -> GatingWeight;
>> GatingWeight -> Displaced;
>> ConstantProjectionStack -> ExtractConstantProjection;
>> ExtractConstantProjection -> RayBox;
>> RayBox -> Divide;
>> Displaced -> BackProjection;
>> BackProjection -> OutofBP [arrowhead=none];
>> }
>>
>> As you can see, it is a very large part of the SART reconstruction 
>> filter, so you might be better off just copying the whole 
>> SARTConeBeamReconstructionFilter and modifying it.
>>
>> Of course, you could also look into ITK's cost function classes, see 
>> whether one of the classes that inherit from them suits your needs, 
>> implement your cost function that way, and use ITK's off-the-shelf 
>> solvers to minimize it. See the inheritance diagram at 
>> https://itk.org/Doxygen/html/classitk_1_1CostFunctionTemplate.html if 
>> you want to try this approach.
>>
>> Best regards,
>> Cyril
>>
>> On 11/01/2016 05:50 PM, vahid ettehadi via Rtk-users wrote:
>>> Hello RTK users and developers,
>>>
>>> I have already installed RTK and reconstructed some images with the 
>>> FDK algorithm implemented in RTK. It works well; thanks to the RTK 
>>> developers.
>>> Now I am trying to develop a model-based image reconstruction for 
>>> our cone-beam micro-CT. I see that some iterative algorithms, like 
>>> ART and its modifications and the conjugate gradient (CG) method, 
>>> are already implemented in RTK. I want to develop a model-based 
>>> reconstruction using Newton/quasi-Newton optimization methods. I was 
>>> wondering: is it possible to extract the gradient of the 
>>> least-squares function from the implemented algorithms, like the CG 
>>> module? Any recommendation will be appreciated.
>>>
>>> Best Regards,
>>> Vahid
>>>
>>>
>>>
