Resource search results
3x2_2
- An optimization program implementing the parabolic (quadratic interpolation) method.
200711912175433168
- TSP optimization with an ant colony algorithm, written in MATLAB.
someoptimiummethod
- MATLAB implementations of some simple optimization methods and nonlinear programming algorithms.
3573865020060409222636964
- A MATLAB program for the 0.618 (golden section) method of optimization; compiles successfully.
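For reference, the 0.618 method can be sketched in a few lines. This is a generic pure-Python illustration of the technique, not the listed MATLAB program; the function names are my own:

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Minimize a unimodal function f on [a, b] by the 0.618 (golden section) method."""
    rho = (math.sqrt(5) - 1) / 2  # ~0.618, the golden ratio conjugate
    x1 = b - rho * (b - a)
    x2 = a + rho * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:
            # Minimum lies in [a, x2]; reuse x1 as the new interior point
            b, x2, f2 = x2, x1, f1
            x1 = b - rho * (b - a)
            f1 = f(x1)
        else:
            # Minimum lies in [x1, b]; reuse x2 as the new interior point
            a, x1, f1 = x1, x2, f2
            x2 = a + rho * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# Example: minimize (x - 2)^2 on [0, 5]; the minimizer is x = 2
x_star = golden_section(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Each iteration shrinks the bracketing interval by a factor of about 0.618 while reusing one function evaluation, so only one new evaluation is needed per step.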
fuhexingfa
- A subroutine for the complex method; written as a major assignment for an optimization methods course.
tidufa
- The gradient method, an optimization technique for finding the optimum of a multivariable function.
Conjugate-Gradient-Method
- The conjugate gradient method lies between the steepest descent method and Newton's method. It needs only first-derivative information, yet it overcomes the slow convergence of steepest descent while avoiding Newton's method's need to store, compute, and invert the Hessian matrix. It is one of the most useful methods for solving large linear systems and one of the most effective algorithms for large-scale nonlinear optimization.
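The description above can be made concrete with a small sketch: a minimal pure-Python conjugate gradient solver for a symmetric positive-definite linear system (equivalently, the minimizer of the quadratic 0.5 x'Ax - b'x). This is an illustration of the technique, not the listed MATLAB code:

```python
def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=100):
    """Solve A x = b for symmetric positive-definite A using only
    matrix-vector products and first-order (residual) information."""
    n = len(b)
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    x = list(x0)
    r = [bi - Axi for bi, Axi in zip(b, matvec(A, x))]  # residual b - Ax
    p = list(r)  # first search direction = steepest descent direction
    rs_old = dot(r, r)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rs_old / dot(p, Ap)             # exact line search along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * Api for ri, Api in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new < tol:
            break
        beta = rs_new / rs_old                  # makes p conjugate to previous directions
        p = [ri + beta * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b, [0.0, 0.0])  # exact solution: [1/11, 7/11]
```

In exact arithmetic the method terminates in at most n iterations (here, two), which is why it scales to large sparse systems where forming or inverting a Hessian is impractical.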
GA
- Optimization by genetic algorithm, including all the subroutines the algorithm requires.
RPM_concave
- MATLAB code for a concave optimization approach to robust point matching: "Robust Point Matching Revisited: A Concave Optimization Approach".
PHR_multiplier_method
- Source code for the multiplier method from the book Optimization Theory and Methods. Many of the book's examples have been tested with this method. This code is a small example: replace the objective and constraint functions with your own to carry out the desired computation.
Trust-RegionInterior_Point
- Code for the trust-region interior-point algorithm from the book Optimization Theory and Methods. The objective function, constraints, and initial values can be changed as needed to obtain the desired computation.
conjugate-gradient-method
- Code for the conjugate gradient method from the book Optimization Theory and Methods, with two worked exercises from the book as examples. The objective function, constraints, and initial values can be changed as needed.
steepest-descent-method
- Code for the steepest descent method with an Armijo line-search improvement, from the book Optimization Theory and Methods, with three worked exercises from the book as examples. The objective function, constraints, and initial values can be changed as needed.
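The Armijo rule mentioned above backtracks a trial step until a sufficient-decrease condition holds. A minimal pure-Python sketch of steepest descent with Armijo backtracking (the parameter values and test function are illustrative, not taken from the listed code):

```python
def steepest_descent_armijo(f, grad, x0, beta=0.5, sigma=1e-4, tol=1e-6, max_iter=5000):
    """Steepest descent with Armijo backtracking: shrink the step t by beta
    until f(x + t*d) <= f(x) + sigma * t * grad'd (sufficient decrease)."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break                                   # gradient small enough: stop
        d = [-gi for gi in g]                       # steepest descent direction
        gd = sum(gi * di for gi, di in zip(g, d))   # directional derivative (< 0)
        t = 1.0
        while f([xi + t * di for xi, di in zip(x, d)]) > f(x) + sigma * t * gd:
            t *= beta                               # backtrack until Armijo holds
        x = [xi + t * di for xi, di in zip(x, d)]
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 2*(y + 3)^2, minimizer (1, -3)
f = lambda v: (v[0] - 1) ** 2 + 2 * (v[1] + 3) ** 2
grad = lambda v: [2 * (v[0] - 1), 4 * (v[1] + 3)]
x = steepest_descent_armijo(f, grad, [0.0, 0.0])
```

Because sigma < 1 and the directional derivative is negative, the inner loop always terminates for a differentiable f, which is what makes the Armijo rule a robust improvement over a fixed step size.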
BFGS
- Code for Newton's method and a quasi-Newton (BFGS) method from the book Optimization Theory and Methods, with two worked exercises from the book as examples. The objective function, constraints, and initial values can be changed as needed.
GROUP-SPARSE-OPTIMIZATION
- A paper and partial MATLAB code for group-sparse (block-sparse) optimization by the alternating direction method (ADM).
Optimization-Algorithms
- Source code for the book Optimization Methods and Their MATLAB Program Design, containing many classic optimization algorithms.
wolfe
- A wolfe program module implementing the Wolfe line-search algorithm. Test data comes from the Armijo-rule search examples in Optimization Methods and Their MATLAB Program Design (ed. Ma Changfeng).
最优化实验乘子法
- The multiplier method. The basic Lagrange multiplier method finds the extrema of a function f(x1, x2, ...) subject to the constraint g(x1, x2, ...) = 0. Its main idea is to combine the constraint function with the original function into a joint system and solve for the values of the variables at which the original function attains its extremum.
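The joint system described above can be solved numerically by applying Newton's method to the stationarity conditions of the Lagrangian L(x, λ) = f(x) + λ·g(x). A small pure-Python sketch; the example function and all names are illustrative, not from the listed program:

```python
def gauss_solve(A, b):
    """Solve the linear system A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]                    # pivot for stability
        for i in range(k + 1, n):
            m = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= m * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):                 # back-substitution
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def newton_lagrange(F, J, z0, tol=1e-10, max_iter=50):
    """Newton's method on the first-order conditions F(z) = 0 of the
    Lagrangian, where z = (x1, ..., xn, lambda)."""
    z = list(z0)
    for _ in range(max_iter):
        Fz = F(z)
        if max(abs(v) for v in Fz) < tol:
            break
        dz = gauss_solve(J(z), [-v for v in Fz])   # Newton step: J dz = -F
        z = [zi + di for zi, di in zip(z, dz)]
    return z

# Extremum of f(x, y) = x*y subject to g(x, y) = x + y - 1 = 0.
# Stationarity of L = x*y + lam*(x + y - 1):
#   dL/dx = y + lam = 0,  dL/dy = x + lam = 0,  dL/dlam = x + y - 1 = 0
F = lambda z: [z[1] + z[2], z[0] + z[2], z[0] + z[1] - 1]
J = lambda z: [[0.0, 1.0, 1.0], [1.0, 0.0, 1.0], [1.0, 1.0, 0.0]]
z = newton_lagrange(F, J, [0.0, 0.0, 0.0])  # converges to x = y = 0.5, lam = -0.5
```

Because this example's stationarity system happens to be linear, Newton's method lands on the solution in a single step; for nonlinear f or g the same iteration is repeated until the residual is small.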