
A Variant Nonlinear Conjugate Gradient Method and its Global Convergence


dc.contributor.author MOHAMMED, NSHMEEN HAMED ABDO
dc.date.accessioned 2018-07-10T08:00:32Z
dc.date.available 2018-07-10T08:00:32Z
dc.date.issued 2017-10
dc.identifier.uri http://repo.uofg.edu.sd/handle/123456789/616
dc.description A Dissertation Submitted to the University of Gezira in Partial Fulfilment of the Requirements for the Degree of Master of Science in Mathematics, Department of Mathematics, Faculty of Mathematical and Computer Sciences, October 2017 en_US
dc.description.abstract Optimization is an iterative process that starts from an initial guess, improves the solution in subsequent steps, and terminates when a stopping criterion is met, such as a tolerance or a bound on the number of steps. The Conjugate Gradient method is one of the most useful optimization techniques for solving large linear systems of equations, a problem equivalent to minimizing a convex quadratic function. In addition, the Conjugate Gradient method can be extended to nonlinear problems through the Fletcher-Reeves (FR) and Polak-Ribière-Polyak (PRP) methods. However, when these methods are applied to important functions that are nonlinear and unconstrained, the directions generated by the FR and PRP methods may not be descent directions. Many research works have shown that the PRP method performs much better than the FR method on many optimization problems because it can automatically recover once a small step is generated (the β_k^+ variant). Nevertheless, the global convergence of the PRP method has only been proved for strictly convex functions. This research proposes Modified Polak-Ribière-Polyak (MPRP) and Modified Fletcher-Reeves (MFR) conjugate gradient methods for solving large-scale unconstrained nonlinear optimization problems, whose most important property is that the generated direction is always a sufficient descent direction for the objective function. The proposed methods are globally convergent under an Armijo-type line search. Two algorithms have been developed based on the PRP and FR methods, and Matlab has been used as the development environment for the proposed algorithms. The proposed methods have been evaluated using a set of test problems such as the Sine, Dixon and Quadratic functions. The evaluation results show that the MPRP and MFR methods outperform the PRP and FR methods, respectively, in terms of CPU time, number of iterations and error. This reduces the storage cost and run time required by the algorithm, which means that the MFR and MPRP methods are more robust, efficient and accurate than the FR and PRP methods. en_US
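
For readers skimming the record, the following is a minimal Matlab sketch of the kind of method the abstract describes: a nonlinear conjugate gradient iteration that combines a PRP beta truncated at zero (often written β_k^+) with an Armijo backtracking line search, applied here to a simple quadratic test function. It is an illustration under assumed names and parameter values, not the thesis code, and it does not reproduce the proposed MPRP/MFR formulas.

% Minimal illustrative sketch (not the thesis code): nonlinear CG with a
% PRP+ beta and an Armijo backtracking line search on a quadratic test
% function. All names and parameter values are assumptions.
function x = cg_prp_plus_demo()
    f     = @(x) 0.5*(x(1)^2 + 10*x(2)^2);   % simple quadratic test function
    gradf = @(x) [x(1); 10*x(2)];             % its gradient
    x = [5; 1];                               % initial guess
    g = gradf(x);
    d = -g;                                   % first direction: steepest descent
    for k = 1:200
        if norm(g) < 1e-6, break; end         % stopping criterion (tolerance)
        alpha = 1; rho = 0.5; c = 1e-4;       % Armijo backtracking parameters
        while f(x + alpha*d) > f(x) + c*alpha*(g'*d)
            alpha = rho*alpha;                % shrink step until sufficient decrease
        end
        x_new = x + alpha*d;
        g_new = gradf(x_new);
        beta  = max((g_new'*(g_new - g))/(g'*g), 0);  % PRP beta, truncated at zero (PRP+)
        d = -g_new + beta*d;                  % next conjugate direction
        x = x_new; g = g_new;
    end
end

When the accepted step is small, g_new is close to g, so the PRP beta is close to zero and the next direction falls back towards steepest descent; this is the automatic recovery property the abstract attributes to the PRP family, and the max(., 0) truncation additionally keeps beta nonnegative.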
dc.description.sponsorship Muhsin Hassan Abdallah (Main Supervisor); Murtada Khalfallah Elbashir (Co-supervisor) en_US
dc.language.iso en en_US
dc.publisher University of Gezira en_US
dc.subject Optimization en_US
dc.subject Algorithms en_US
dc.subject Functions en_US
dc.title A Variant Nonlinear Conjugate Gradient Method and its Global Convergence en_US
dc.type Thesis en_US

