信息论基础与编码 (6).pdf

Uploaded by: 刘静 | Document ID: 63511445 | Date: 2022-11-25 | Format: PDF | Pages: 29 | Size: 285.89 KB

EE 8950, Tom Luo. Lecture 7: Smooth unconstrained minimization

Outline:
- terminology
- general descent method
- gradient & steepest descent methods
- Newton's method
- quasi-Newton methods
- conjugate gradients
- self-concordance & Newton's method

Terminology

Unconstrained minimization problem:

    minimize f(x),    f: R^n → R, convex, differentiable

(hence dom f is open).

A minimizing sequence is a sequence x^(k) with f(x^(k)) → f*.

Optimality condition: ∇f(x*) = 0, a set of nonlinear equations; usually there is no analytical solution.

More generally, if ∇²f(x) ⪰ mI, then

    f(x) − f* ≤ (1/(2m)) ‖∇f(x)‖²,

which yields a stopping criterion (if you know m).

Examples

Unconstrained quadratic minimization:

    minimize x^T P x + 2 q^T x + r    (P = P^T ⪰ 0)

Unconstrained geometric programming:

    minimize log Σᵢ₌₁ᵐ exp(aᵢ^T x + bᵢ)

Analytic center of linear inequalities:

    minimize −Σᵢ log(bᵢ − aᵢ^T x)    (dom f = {x | aᵢ^T x < bᵢ})

General descent method

given a starting point x ∈ dom f.
repeat
  1. Compute a descent direction v.
  2. Line search. Choose a step size t > 0.
  3. Update. x := x + tv.
until stopping criterion is satisfied.

Descent method: f(x^(k+1)) < f(x^(k)). Since f is convex, v must be a descent direction: ∇f(x^(k))^T v^(k) < 0.

Backtracking line search (parameters 0 < α < 1/2, 0 < β < 1): start with t := 1; while f(x + tv) > f(x) + α t ∇f(x)^T v, set t := βt.

[figure: f(x + sv) as a function of the step length s, with the backtracking bound]
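As a concrete sketch, the general descent method with backtracking line search might look as follows in Python. The quadratic test function, tolerance, and parameter values α = 0.25, β = 0.5 are illustrative choices, not from the slides.

```python
import math

def backtracking(f, grad, x, v, alpha=0.25, beta=0.5):
    """Shrink t until f(x + t v) <= f(x) + alpha * t * grad(x)^T v holds."""
    t = 1.0
    fx = f(x)
    slope = sum(g * vi for g, vi in zip(grad(x), v))  # grad(x)^T v, must be < 0
    while f([xi + t * vi for xi, vi in zip(x, v)]) > fx + alpha * t * slope:
        t *= beta
    return t

def descent_method(f, grad, x, tol=1e-7, max_iter=5000):
    """General descent method with v = -grad(x) as the descent direction."""
    for _ in range(max_iter):
        g = grad(x)
        if math.sqrt(sum(gi * gi for gi in g)) < tol:    # stopping criterion
            break
        v = [-gi for gi in g]                            # descent direction
        t = backtracking(f, grad, x, v)                  # line search
        x = [xi + t * vi for xi, vi in zip(x, v)]        # update
    return x

# minimize f(x) = x1^2 + 10*x2^2 (minimizer at the origin)
f = lambda x: x[0]**2 + 10.0 * x[1]**2
grad = lambda x: [2.0 * x[0], 20.0 * x[1]]
x_star = descent_method(f, grad, [3.0, 1.0])
```

The inner `backtracking` function implements exactly the sufficient-decrease test stated above; any other descent direction (steepest descent, Newton) can be plugged into the same loop.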

Example: gradient method for a quadratic

    minimize f(x) = (1/2)(x₁² + M x₂²),    M > 0;

the optimal point is x* = 0. Use exact line search; start at x^(0) = (M, 1) (to simplify the formulas).

[figure: contour lines of f with the zig-zagging iterates]

The iterates are then

    x^(k) = ( M ((M−1)/(M+1))^k ,  (−(M−1)/(M+1))^k ).

Convergence is
- fast if M is close to 1,
- slow and zig-zagging if M ≫ 1 or M ≪ 1.
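The closed-form iterates above can be checked numerically: for a quadratic, the exact line-search step along v = −g is t = g^T g / g^T H g. A short sketch (M = 10 is an arbitrary choice):

```python
M = 10.0                      # illustrative condition-number parameter
r = (M - 1.0) / (M + 1.0)

def exact_step(x):
    """One gradient step with exact line search for f(x) = (1/2)(x1^2 + M x2^2)."""
    g = [x[0], M * x[1]]                                # gradient at x
    t = (g[0]**2 + g[1]**2) / (g[0]**2 + M * g[1]**2)   # exact minimizer of f(x - t g)
    return [x[0] - t * g[0], x[1] - t * g[1]]

x = [M, 1.0]                  # x(0) = (M, 1)
for k in range(1, 6):
    x = exact_step(x)
    closed = [M * r**k, (-r)**k]    # closed-form iterate from the slide
    assert all(abs(a - b) < 1e-9 for a, b in zip(x, closed))
```

The sign of the second coordinate flips each iteration, which is the zig-zagging visible in the contour plot.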

Numerical example: gradient method

    minimize c^T x − Σᵢ₌₁ᵐ log(aᵢ^T x + bᵢ),    m = 100, n = 50,

gradient method with exact line search.

[figures: f(x^(k)) − f* versus k on linear and logarithmic scales, and the step size versus k]

Slow convergence; zig-zagging.

Steepest descent direction

First-order approximation of f at x: f(x + z) ≈ f(x) + ∇f(x)^T z; the term ∇f(x)^T z gives the approximate decrease in f for a (small) step z.

Steepest descent direction for a general norm ‖·‖:

    v_sd = argmin { ∇f(x)^T v | ‖v‖ = 1 }

gives the greatest (approximate) decrease in f per length of step (measured by ‖·‖).

- Euclidean norm: v_sd = −∇f(x)/‖∇f(x)‖.
- Quadratic norm ‖z‖_P = (z^T P z)^(1/2), P = P^T ≻ 0:

      v_sd = −(∇f(x)^T P⁻¹ ∇f(x))^(−1/2) P⁻¹ ∇f(x).

We can also express v_sd as

    v_sd = argmin { ∇f(x)^T v | ‖v‖ ≤ 1 }.

Geometric interpretation: go as far as possible in the direction −∇f(x) while staying in the unit ball.

[figures: unit balls of a quadratic norm and of the 1-norm, with −∇f(x) and the resulting v_sd]

Steepest descent method

given a starting point x ∈ dom f.
repeat
  1. Compute the steepest descent direction v_sd = argmin { ∇f(x)^T v | ‖v‖ = 1 }.
  2. Line search. Choose a step size t.
  3. Update. x := x + t v_sd.
until stopping criterion is satisfied.

Notes:
- converges with exact or backtracking line search
- sometimes v_sd is scaled (between 1 and 2)
- can be very slow
- used in special cases where v_sd is cheap to compute
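The quadratic-norm formula for v_sd can be sanity-checked numerically: v_sd should have unit P-norm and should achieve ∇f(x)^T v_sd = −(∇f(x)^T P⁻¹ ∇f(x))^(1/2). A sketch with an arbitrary 2×2 example (the values of P and the gradient are made up for illustration):

```python
import math

# steepest descent direction in the quadratic norm ||z||_P = (z^T P z)^(1/2):
#   v_sd = -(g^T P^{-1} g)^(-1/2) * P^{-1} g
P = [[2.0, 0.5], [0.5, 1.0]]        # P = P^T > 0 (det = 1.75)
g = [1.0, -2.0]                     # gradient at the current point

det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
Pinv = [[ P[1][1] / det, -P[0][1] / det],
        [-P[1][0] / det,  P[0][0] / det]]

Pg = [Pinv[0][0] * g[0] + Pinv[0][1] * g[1],
      Pinv[1][0] * g[0] + Pinv[1][1] * g[1]]      # P^{-1} g
scale = math.sqrt(g[0] * Pg[0] + g[1] * Pg[1])    # (g^T P^{-1} g)^(1/2)
v_sd = [-Pg[0] / scale, -Pg[1] / scale]

# v_sd has unit P-norm ...
norm_P = math.sqrt(sum(v_sd[i] * P[i][j] * v_sd[j]
                       for i in range(2) for j in range(2)))
# ... and attains the most negative g^T v over the P-unit ball: g^T v_sd = -scale
slope = g[0] * v_sd[0] + g[1] * v_sd[1]
```

With P = I this reduces to the normalized negative gradient, recovering the Euclidean case above.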

The Newton step

The Newton step (at x) is v = −∇²f(x)⁻¹ ∇f(x); the Newton iteration (at x) is

    x⁺ = x + v = x − ∇²f(x)⁻¹ ∇f(x).

Interpretations: y = x⁺
- minimizes the 2nd-order expansion of f (at x),
  f(x) + ∇f(x)^T (y − x) + (1/2)(y − x)^T ∇²f(x)(y − x);
- solves the linearized optimality condition: 0 = ∇f(x) + ∇²f(x)(y − x).

[figures: f and its 2nd-order approximation at x; f′ and its 1st-order approximation at x, crossing zero at x⁺]

Local convergence of the Newton iteration

Assumptions: ∇²f(x) ⪰ mI, and the Hessian satisfies a Lipschitz condition:

    ‖∇²f(x) − ∇²f(y)‖ ≤ L ‖x − y‖

(L small means f is nearly quadratic).

Result:

    (L/(2m²)) ‖∇f(x⁺)‖ ≤ ( (L/(2m²)) ‖∇f(x)‖ )².

Region of quadratic convergence: ‖∇f(x)‖ ≤ m²/L; there, ‖∇f(x)‖ (hence f(x) − f*) decreases very rapidly. If ‖∇f(x^(0))‖ ≤ m²/L, the bound on the number of iterations for accuracy f(x) − f* ≤ ε is

    log₂ log₂ (ε₀/ε),    ε₀ = m³/L².

Practical rule of thumb: 5–6 iterations.

Global behavior of the Newton iteration

The Newton iteration can diverge. Example: f(x) = log(eˣ + e⁻ˣ), starting at x^(0) = 1.1.

[figures: f(x) and f′(x), with the iterates marked]

    k    x^(k)            f(x^(k)) − f*
    1    −1.129×10⁰       5.120×10⁻¹
    2     1.234×10⁰       5.349×10⁻¹
    3    −1.695×10⁰       6.223×10⁻¹
    4     5.715×10⁰       1.035×10⁰
    5    −2.302×10⁴       2.302×10⁴

Newton's method

given a starting point x ∈ dom f.
repeat
  1. Compute the Newton direction v = −∇²f(x)⁻¹ ∇f(x).
  2. Line search. Choose a step size t.
  3. Update. x := x + tv.
until stopping criterion is satisfied.

(Also called the damped or guarded Newton method.)

- global convergence with backtracking or exact line search
- quadratic local convergence (hence the stopping criterion is not an issue)
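Both behaviors (divergence of the pure Newton iteration from x^(0) = 1.1, convergence of the damped variant) are easy to reproduce for this example: f′(x) = tanh x and f″(x) = 1/cosh² x, so the Newton step is v = −sinh(x)·cosh(x). The backtracking parameters 0.25 and 0.5 are illustrative choices.

```python
import math

f = lambda x: math.log(math.exp(x) + math.exp(-x))
df = lambda x: math.tanh(x)                 # f'(x)
d2f = lambda x: 1.0 / math.cosh(x)**2       # f''(x)
newton_step = lambda x: -df(x) / d2f(x)     # = -sinh(x)*cosh(x)

# pure Newton iteration from x(0) = 1.1: |x(k)| grows (1.129, 1.234, 1.695, ...)
x = 1.1
pure = []
for _ in range(5):
    x = x + newton_step(x)
    pure.append(x)

# damped Newton with backtracking (alpha = 0.25, beta = 0.5) converges to x* = 0
x = 1.1
for _ in range(20):
    v = newton_step(x)
    t = 1.0
    while f(x + t * v) > f(x) + 0.25 * t * df(x) * v:
        t *= 0.5
    x = x + t * v
```

The first damped step is forced down to t = 1/2, which lands very close to the minimizer; after that, full steps t = 1 are accepted and convergence is extremely fast.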

Affine invariance of Newton's method

Use new coordinates x = T x̄, det T ≠ 0, and apply Newton's method to g(x̄) = f(T x̄); then x^(k) = T x̄^(k). E.g., Newton's method is not affected by variable scaling (cf. the gradient and steepest descent methods).

Convergence analysis

Assumptions: mI ⪯ ∇²f(x) ⪯ MI, and the Lipschitz condition ‖∇²f(x) − ∇²f(y)‖ ≤ L ‖x − y‖.

Results: two phases.
1. Damped Newton phase (‖∇f(x)‖ > η₁): f(x⁺) ≤ f(x) − γ₂, hence #iterations ≤ (f(x^(0)) − f*)/γ₂.
2. Quadratically convergent phase (‖∇f(x)‖ ≤ η₁): quadratic convergence, as in the local analysis.

Conjugate gradient method

…
  2. Line search. Choose a step size t > 0 minimizing f(x + tv).
  3. Update x and β. y := ∇f(x + tv) − ∇f(x), β := y^T ∇f(x + tv)/(t y^T v), x := x + tv.
until stopping criterion is satisfied.

Remarks:
- the same as BFGS with H reset to I at each iteration
- v is a linear combination of the current and previous gradients
- converges exactly in n steps for (convex) quadratic f (at least in theory)
- fast, but not quadratic, local convergence

Advantage: no need to form or store H (hence CG can be used for very large problems).

Disadvantages:
- can work very poorly in practice (numerical issues)
- not affinely invariant

A change of coordinates (preconditioning) can improve practical performance.

Application: solving Ax = b with very large, sparse A ≻ 0. Use CG to minimize (1/2) x^T A x − b^T x; this requires n evaluations of ∇f(x) = Ax − b (in theory), much faster than a direct method (if lucky, or with good preconditioning).
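For the quadratic (1/2) x^T A x − b^T x, exact line search and the β update reduce to the familiar linear CG recurrences. A minimal sketch (the dense 2×2 matrix is only for illustration; in practice one would supply a sparse matrix-vector product):

```python
import math

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Linear CG for A x = b with A = A^T > 0, i.e. minimizing
    (1/2) x^T A x - b^T x. Only matrix-vector products with A are needed."""
    n = len(b)
    max_iter = max_iter or n
    matvec = lambda x: [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))

    x = [0.0] * n
    r = b[:]                      # residual b - A x = -grad f(x)
    v = r[:]                      # first search direction
    rs = dot(r, r)
    for _ in range(max_iter):
        if math.sqrt(rs) < tol:
            break
        Av = matvec(v)
        t = rs / dot(v, Av)                                    # exact line search
        x = [xi + t * vi for xi, vi in zip(x, v)]
        r = [ri - t * avi for ri, avi in zip(r, Av)]
        rs_new = dot(r, r)
        v = [ri + (rs_new / rs) * vi for ri, vi in zip(r, v)]  # next direction
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)      # exact after at most n = 2 steps
```

Note that only `matvec` touches A, which is what makes CG attractive for very large sparse systems.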

Numerical example: CG method

    minimize c^T x − Σᵢ₌₁ᵐ log(aᵢ^T x + bᵢ),    m = 100, n = 50,

CG (with exact line search).

[figures: f(x^(k)) − f* versus k on linear and logarithmic scales]

Remember: each CG iteration is very cheap (finding ∇f, a few inner products, a line search).

Self-concordance: motivation

Drawbacks of the classical analysis of Newton's method:
- Newton's method is affinely invariant, but the convergence analysis is not
- we never know m, M, L in practice
- m, M, L can depend on the starting point

Nesterov & Nemirovsky's analysis of Newton's method:
- is affinely invariant
- involves no unknown constants
- is valid for many (but not all) functions f

Self-concordance: definition (Nesterov & Nemirovsky)

A function f: R → R is self-concordant (SC) if

    |f‴(t)| ≤ 2 f″(t)^(3/2);

f: R^n → R is self-concordant if its restriction to an arbitrary line is self-concordant.

The SC condition
- limits the third derivative in terms of the second
- is affinely invariant: f is SC ⇔ f(Tx) is SC

Examples of self-concordant functions:
- linear and (convex) quadratic functions
- −log x on {x | x > 0}
- −log det X on {X | X = X^T ≻ 0}
- −log(t² − x^T x) on {(x, t) | ‖x‖ < t}
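The defining inequality is easy to verify for −log x, where it in fact holds with equality; by contrast, eˣ violates it for x sufficiently negative, so eˣ is not self-concordant on R (one reason the analysis applies to many but not all convex functions). A quick check:

```python
import math

# Self-concordance check for f(x) = -log(x) on x > 0:
#   f''(x) = 1/x^2,  f'''(x) = -2/x^3,  so |f'''(x)| = 2 * f''(x)^(3/2):
# the defining inequality holds with equality at every x > 0.
for x in (0.1, 1.0, 3.7, 100.0):
    d2, d3 = 1.0 / x**2, -2.0 / x**3
    assert abs(d3) <= 2.0 * d2**1.5 + 1e-12

# By contrast, for f(x) = exp(x) we have f'' = f''' = exp(x), and
# exp(x) <= 2 exp(x)^(3/2) fails once x is sufficiently negative.
x = -10.0
assert math.exp(x) > 2.0 * math.exp(x)**1.5   # inequality violated at x = -10
```

The same pointwise check, applied along every line through the domain, is what the n-dimensional definition requires.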

Simple properties:
- affine transformation of the domain: f SC ⇒ g(z) = f(Az + b) SC
- sums: f, f̃ SC ⇒ f + f̃ SC
- scaling: f SC, α ≥ 1 ⇒ αf SC

Hence, e.g., −Σᵢ log(bᵢ − aᵢ^T x) is SC, and −log det(F₀ + x₁F₁ + ⋯ + xₙFₙ) is SC.

Convergence analysis via SC

For Newton's method with backtracking or exact line search:

    #iterations ≤ (f(x^(0)) − f*)/γ₂ + log₂ log₂ (2/ε),

where γ₂ depends only on the backtracking parameters α, β:

    γ₂ = αβ(1/2 − α)²/(5 − 2α);

e.g., for α = 0.2, β = 0.7 we have 1/γ₂ ≈ 365. (A more refined analysis yields a smaller bound.)

Example: f(x) = −log det(F₀ + x₁F₁ + ⋯ + xₙFₙ).

[figure: number of Newton steps versus f(x^(0)) − f*, together with the upper bound]

Conclusion:
- f(x^(0)) − f* gives an upper bound on the number of iterations
- f(x^(0)) − f* is also a good measure in practice
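Plugging in numbers, with the constant γ₂ = αβ(1/2 − α)²/(5 − 2α) as reconstructed from the slide (it reproduces 1/γ₂ ≈ 365 at α = 0.2, β = 0.7, which supports the reconstruction):

```python
import math

alpha, beta = 0.2, 0.7
# reconstructed constant from the slide; depends only on alpha, beta
gamma2 = alpha * beta * (0.5 - alpha)**2 / (5.0 - 2.0 * alpha)
print(1.0 / gamma2)           # approximately 365 for these parameters

def newton_bound(gap0, eps):
    """Bound on #Newton iterations: (f(x0) - f*)/gamma2 + log2 log2(2/eps)."""
    return gap0 / gamma2 + math.log2(math.log2(2.0 / eps))
```

Because the double logarithm contributes only a handful of iterations even for tiny ε, the bound is dominated by the initial gap f(x^(0)) − f*, which is exactly the conclusion drawn above.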
