001450273 000__ 05124cam\a2200529\i\4500 001450273 001__ 1450273 001450273 003__ OCoLC 001450273 005__ 20230310004517.0 001450273 006__ m\\\\\o\\d\\\\\\\\ 001450273 007__ cr\cn\nnnunnun 001450273 008__ 221013s2022\\\\sz\\\\\\ob\\\\000\0\eng\d 001450273 019__ $$a1346531324$$a1347029718 001450273 020__ $$a9783031126444$$q(electronic bk.) 001450273 020__ $$a3031126440$$q(electronic bk.) 001450273 020__ $$z3031126432 001450273 020__ $$z9783031126437 001450273 0247_ $$a10.1007/978-3-031-12644-4$$2doi 001450273 035__ $$aSP(OCoLC)1347369791 001450273 040__ $$aGW5XE$$beng$$erda$$epn$$cGW5XE$$dYDX$$dEBLCP$$dSFB$$dOCLCF$$dUKAHL$$dOCLCQ 001450273 049__ $$aISEA 001450273 050_4 $$aQA402.5 001450273 08204 $$a519.6$$223/eng/20221013 001450273 1001_ $$aZaslavski, Alexander J. 001450273 24510 $$aOptimization in Banach spaces /$$cAlexander J. Zaslavski. 001450273 264_1 $$aCham, Switzerland :$$bSpringer,$$c2022. 001450273 300__ $$a1 online resource. 001450273 336__ $$atext$$btxt$$2rdacontent 001450273 337__ $$acomputer$$bc$$2rdamedia 001450273 338__ $$aonline resource$$bcr$$2rdacarrier 001450273 4901_ $$aSpringerBriefs in optimization 001450273 504__ $$aIncludes bibliographical references. 001450273 5050_ $$aPreface -- Introduction -- Convex optimization -- Nonconvex optimization -- Continuous algorithms -- References. 001450273 506__ $$aAccess limited to authorized users. 001450273 520__ $$aThe book is devoted to the study of constrained minimization problems on closed and convex sets in Banach spaces with a Frechet differentiable objective function. Such problems are well studied in a finite-dimensional space and in an infinite-dimensional Hilbert space. When the space is Hilbert there are many algorithms for solving optimization problems including the gradient projection algorithm which is one of the most important tools in the optimization theory, nonlinear analysis and their applications. An optimization problem is described by an objective function and a set of feasible points. For the gradient projection algorithm each iteration consists of two steps. The first step is a calculation of a gradient of the objective function while in the second one we calculate a projection on the feasible set. In each of these two steps there is a computational error. In our recent research we show that the gradient projection algorithm generates a good approximate solution, if all the computational errors are bounded from above by a small positive constant. It should be mentioned that the properties of a Hilbert space play an important role. When we consider an optimization problem in a general Banach space the situation becomes more difficult and less understood. On the other hand such problems arise in the approximation theory. The book is of interest for mathematicians working in optimization. It also can be useful in preparation courses for graduate students. The main feature of the book which appeals specifically to this audience is the study of algorithms for convex and nonconvex minimization problems in a general Banach space. The book is of interest for experts in applications of optimization to the approximation theory. In this book the goal is to obtain a good approximate solution of the constrained optimization problem in a general Banach space under the presence of computational errors. It is shown that the algorithm generates a good approximate solution, if the sequence of computational errors is bounded from above by a small constant. The book consists of four chapters. 
In the first we discuss several algorithms which are studied in the book and prove a convergence result for an unconstrained problem which is a prototype of our results for the constrained problem. In Chapter 2 we analyze convex optimization problems. Nonconvex optimization problems are studied in Chapter 3. In Chapter 4 we study continuous algorithms for minimization problems under the presence of computational errors. 001450273 588__ $$aDescription based on print version record. 001450273 650_0 $$aMathematical optimization. 001450273 650_0 $$aBanach spaces. 001450273 655_0 $$aElectronic books. 001450273 655_7 $$aLlibres electrònics.$$2thub 001450273 77608 $$iPrint version:$$aZASLAVSKI, ALEXANDER J.$$tOPTIMIZATION IN BANACH SPACES.$$d[Place of publication not identified] : SPRINGER INTERNATIONAL PU, 2022$$z3031126432$$w(OCoLC)1333268955 001450273 830_0 $$aSpringerBriefs in optimization. 001450273 852__ $$bebk 001450273 85640 $$3Springer Nature$$uhttps://univsouthin.idm.oclc.org/login?url=https://link.springer.com/10.1007/978-3-031-12644-4$$zOnline Access$$91397441.1 001450273 909CO $$ooai:library.usi.edu:1450273$$pGLOBAL_SET 001450273 980__ $$aBIB 001450273 980__ $$aEBOOK 001450273 982__ $$aEbook 001450273 983__ $$aOnline 001450273 994__ $$a92$$bISE
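Illustrative note (not part of the catalog record): the 520 abstract describes the gradient projection algorithm as a two-step iteration (gradient evaluation, then projection onto the feasible set), each step subject to a bounded computational error. The following is a minimal sketch of that scheme under stated assumptions; the quadratic objective, the box feasible set, the step size, and the error bound delta are hypothetical choices for illustration and are not taken from the book.

import numpy as np

def small_error(shape, delta, rng):
    # A perturbation whose norm is bounded from above by delta (illustrative).
    e = rng.standard_normal(shape)
    return delta * e / (np.linalg.norm(e) + 1e-12)

def gradient_projection(grad, project, x0, step=0.1, delta=1e-3, iters=200, rng=None):
    # Each iteration: (1) compute the gradient of the objective (inexactly),
    # (2) project the updated point onto the feasible set (inexactly).
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x) + small_error(x.shape, delta, rng)                  # inexact gradient step
        x = project(x - step * g) + small_error(x.shape, delta, rng)    # inexact projection step
    return x

# Hypothetical example: minimize ||x - b||^2 over the box [0, 1]^3.
b = np.array([0.3, 1.5, -0.2])
grad = lambda x: 2.0 * (x - b)
project = lambda x: np.clip(x, 0.0, 1.0)   # exact projection onto the box
print(gradient_projection(grad, project, x0=np.zeros(3)))

Under these assumptions the iterates stay within roughly delta of the feasible box and approach the point of the box closest to b, which mirrors the abstract's claim that small bounded computational errors still yield a good approximate solution.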