<div>Thanks to Jordan, Einstein, Karthik and Zachary,</div>
<div> </div>
<div>For the time being I am trying the one-plus-one evolutionary optimizer and reading up on it as well.</div>
<div> </div>
<div>I really appreciate the help!</div>
<div>with regards,</div>
<div>Ashish.<br><br> </div>
<div><span class="gmail_quote">On 6/22/05, <b class="gmail_sendername">Zachary Pincus</b> <<a href="mailto:zpincus@stanford.edu">zpincus@stanford.edu</a>> wrote:</span>
<blockquote class="gmail_quote" style="PADDING-LEFT: 1ex; MARGIN: 0px 0px 0px 0.8ex; BORDER-LEFT: #ccc 1px solid">Also, the OnePlusOneEvolutionaryOptimizer (<br><a href="http://itk.org/Insight/Doxygen/html/classitk_1_1OnePlusOneEvolutionaryOptimizer.html">http://itk.org/Insight/Doxygen/html/classitk_1_1OnePlusOneEvolutionaryOptimizer.html</a> ) is an optimizer with<br>a simulated annealing flavor. It takes random steps within a certain<br>radius in parameter space, and adjusts that radius (anisotropically) to
<br>selectively favor directions which have produced good results in the<br>past. With each "good" step, the radius grows, and with each "bad" step<br>it shrinks (akin to the "cooling" that you do toward the end of a
<br>simulated annealing run).<br><br>Zach<br><br>On Jun 22, 2005, at 2:08 PM, Karthik Krishnan wrote:<br><br>><br>><br>> Einstein, Daniel R wrote:<br>><br>>> Anish,<br>>> As far as I can tell, all of the optimization algorithms from Netlib
<br>>> are local. Global optimization is considerably harder, and requires<br>>> much more crunching. Examples are simulated annealing, multi-starts,<br>>> i.e. stochastically sampling the solution space, particle swarm
<br>>> methods, and sequential response surfaces. I am new enough to ITK<br>>> that I cannot say which if any of these might usefully be implemented<br>>> in ITK.<br>><br>> <a href="http://itk.org/Insight/Doxygen/html/classitk_1_1SPSAOptimizer.html">
http://itk.org/Insight/Doxygen/html/classitk_1_1SPSAOptimizer.html</a> was<br>> recently added<br>><br>>> Particle swarm methods are an interesting option because they are so<br>>> easy to program. Be advised, however, there are no global methods
<br>>> that guarantee convergence.<br>>> Dan<br>>><br>>> Daniel R Einstein, PhD<br>>> Biological Monitoring and Modeling<br>>> Pacific Northwest National Laboratory<br>>> P.O. Box 999; MSIN P7-59
<br>>> Richland, WA 99352<br>>> Tel: 509/ 376-2924<br>>> Fax: 509/376-9064<br>>> _daniel.einstein@pnl.gov_ <mailto:<a href="mailto:daniel.einstein@pnl.gov">daniel.einstein@pnl.gov</a>><br>>>
<br>>><br>>> ----------------------------------------------------------------------<br>>> --<br>>> *From:* insight-users-bounces+daniel.einstein=<a href="mailto:pnl.gov@itk.org">pnl.gov@itk.org</a>
<br>>> [mailto:<a href="mailto:insight-users-bounces+daniel.einstein=pnl.gov@itk.org">insight-users-bounces+daniel.einstein=pnl.gov@itk.org</a>] *On<br>>> Behalf Of *Ashish Poddar<br>>> *Sent:* Wednesday, June 22, 2005 10:58 AM
<br>>> *To:* Luis Ibanez<br>>> *Cc:* insight-users @ itk. org<br>>> *Subject:* [Insight-users] Re: [Insight-developers] a small bug<br>>> initkConjugateGradientOptimizer<br>>><br>>> Hi,
<br>>> Most of the optimizers which I have come across help in finding a<br>>> local minimum around the given initial approximation. But in that case<br>>> I mostly end up in the wrong place. Is there any algorithm which helps<br>>> to scan the global space and determine the global<br>>> minimum?<br>>> The other problem that I am facing with the conjugate gradient method is<br>>> that the scales do not work for the conjugate gradient optimizer. The<br>>> scales are only taken into account in the very first initialization<br>>> step and are never considered again in any of the later iterations.<br>>> I want to fix some of the parameters by setting the scale to some<br>>> extreme value (I used to set 100 or something for the regular step<br>>> gradient descent optimizer and it served the purpose very<br>>> conveniently).<br>>> Any help will be highly appreciated,
<br>>> with regards,<br>>> Ashish.<br>>><br>>> On 5/18/05, *Luis Ibanez* <<a href="mailto:luis.ibanez@kitware.com">luis.ibanez@kitware.com</a><br>>> <mailto:<a href="mailto:luis.ibanez@kitware.com">
luis.ibanez@kitware.com</a>>> wrote:<br>>><br>>><br>>> Hi Ashish,<br>>><br>>> The Conjugate Gradient method is only convenient when the cost<br>>> function has smooth second derivatives. If your cost function
<br>>> is noisy, it is unlikely that this optimizer will behave nicely.<br>>><br>>> Note that it is common to find that Image Metrics are rather<br>>> noisy functions.<br>>><br>>>
<br>>><br>>> Regards,<br>>><br>>><br>>> Luis<br>>><br>>><br>>><br>>> --------------------<br>>><br>>> Ashish Poddar wrote:<br>>>
<br>>> > Hi,<br>>> ><br>>> > I am also struggling right now with the initialization options<br>>> for the<br>>> > Conjugate Gradient, for which I could not find any examples.
<br>>> While<br>>> > searching I came across an example for the Levenberg Marquardt<br>>> Optimizer,<br>>> > which seems to have a similar interface to that of the conjugate<br>>> > gradient optimizer. However, the same initialization did not<br>>> work<br>>> > for the Conjugate Gradient. If someone can point out any reference<br>>> for<br>>> > the Conjugate Gradient, it would be great.<br>>> ><br>>> > Earlier I was using the regular step gradient descent optimizer with
<br>>> these<br>>> > parameters:<br>>> > Transform - Centered Affine<br>>> > Scale for first 9 parameters - 1.0<br>>> > Scale for next 6 parameters - 0.0001<br>
>> > Number of Iterations - 400<br>>> > Minimum Step Length - 0.0001<br>>> > Maximum Step Length - 0.005<br>>> ><br>>> > Any help will be highly appreciated,
<br>>> > with regards,<br>>> > Ashish.<br>>> ><br>>> ><br>>> ><br>>> > On 5/17/05, Ashish Poddar <<a href="mailto:ahpoddar@gmail.com">ahpoddar@gmail.com
</a><br>>> <mailto:<a href="mailto:ahpoddar@gmail.com">ahpoddar@gmail.com</a>>> wrote:<br>>> ><br>>> >>Hi Luis,<br>>> >><br>>> >>Thank you for the quick action. Probably a similar change is
<br>>> required<br>>> >>for Levenberg Marquardt Optimizer too.<br>>> >><br>>> >>with regards,<br>>> >>Ashish.<br>>> >><br>>> >>On 5/16/05, Luis Ibanez <
<a href="mailto:luis.ibanez@kitware.com">luis.ibanez@kitware.com</a><br>>> <mailto:<a href="mailto:luis.ibanez@kitware.com">luis.ibanez@kitware.com</a>>> wrote:<br>>> >><br>>> >>>Hi Ashish,
<br>>> >>><br>>> >>>Thanks for pointing this out.<br>>> >>><br>>> >>>You are right, the GetValue() method should be const.<br>>> >>>
<br>>> >>>A fix has now been committed to the CVS repository.<br>>> >>><br>>> >>>Please let us know if you encounter any other problem.<br>>> >>>
<br>>> >>> Thanks<br>>> >>><br>>> >>> Luis<br>>> >>><br>>> >>>----------------------<br>>> >>>Ashish Poddar wrote:
<br>>> >>><br>>> >>>>hi,<br>>> >>>><br>>> >>>>I am not sure whether it qualifies as a bug or not, but surely<br>>> affects<br>
>> >>>>the re-usability and pluggability model of ITK Library.<br>>> >>>><br>>> >>>>the GetValue() function in ConjugateGradientOptimizer class<br>>> currently is
<br>>> >>>><br>>> >>>>MeasureType GetValue();<br>>> >>>><br>>> >>>>but in case of RegularStepGradientDescentOptimizer class its<br>>> defined by macro as
<br>>> >>>><br>>> >>>>MeasureType GetValue() const;<br>>> >>>><br>>> >>>>which is an interface mismatch... This I encountered when I
<br>>> replaced<br>>> >>>>regular step gradient descent optimizer by conjugate gradient<br>>> >>>>optimizer. In the observer I was using a const reference of<br>>> the
<br>>> >>>>optimizer and displaying the value (just the example which is<br>>> >>>>available for the same nothing new =D)...<br>>> >>>><br>>> >>>>with regards,
<br>>> >>>>Ashish.<br>>> >>>><br>>> >>><br>>> >>><br>>> >>--<br>>> >>Ashish Poddar<br>>> >>Have an acceptable reason for accepting anything.
<br>>> >>Y:ashish_poddar | <a href="mailto:MSN:ashish_poddar@yahoo.com">MSN:ashish_poddar@yahoo.com</a><br>>> <mailto:<a href="mailto:MSN">MSN</a>:<a href="mailto:ashish_poddar@yahoo.com">ashish_poddar@yahoo.com
</a>><br>>> >><br>>> ><br>>> ><br>>> ><br>>><br>>><br>>><br>>><br>>><br>>><br>>> --<br>>> Ashish Poddar<br>>> Have an acceptable reason for accepting anything.
<br>>> Y:ashish_poddar | <a href="mailto:MSN:ashish_poddar@yahoo.com">MSN:ashish_poddar@yahoo.com</a><br>>> <mailto:<a href="mailto:MSN">MSN</a>:<a href="mailto:ashish_poddar@yahoo.com">ashish_poddar@yahoo.com
</a>><br>>><br>>> ----------------------------------------------------------------------<br>>> --<br>>><br>>> _______________________________________________<br>>> Insight-users mailing list
<br>>> <a href="mailto:Insight-users@itk.org">Insight-users@itk.org</a><br>>> <a href="http://www.itk.org/mailman/listinfo/insight-users">http://www.itk.org/mailman/listinfo/insight-users</a><br>>><br>> _______________________________________________
<br>> Insight-users mailing list<br>> <a href="mailto:Insight-users@itk.org">Insight-users@itk.org</a><br>> <a href="http://www.itk.org/mailman/listinfo/insight-users">http://www.itk.org/mailman/listinfo/insight-users
</a><br>><br><br>_______________________________________________<br>Insight-users mailing list<br><a href="mailto:Insight-users@itk.org">Insight-users@itk.org</a><br><a href="http://www.itk.org/mailman/listinfo/insight-users">
http://www.itk.org/mailman/listinfo/insight-users</a><br></blockquote></div><br><br><br>-- <br>Ashish Poddar<br>Have an acceptable reason for accepting anything.<br>Y:ashish_poddar | <a href="mailto:MSN:ashish_poddar@yahoo.com">
MSN:ashish_poddar@yahoo.com</a>