{"id":15726,"date":"2020-02-20T10:43:40","date_gmt":"2020-02-20T10:43:40","guid":{"rendered":"http:\/\/cio.edu.umh.es\/?page_id=15726"},"modified":"2020-02-20T10:43:40","modified_gmt":"2020-02-20T10:43:40","slug":"ova10","status":"publish","type":"page","link":"https:\/\/cio.umh.es\/en\/ova10\/","title":{"rendered":"OVA10: 10TH INTERNATIONAL SEMINAR ON OPTIMIZATION AND VARIATIONAL ANALYSIS"},"content":{"rendered":"<p>[:es][et_pb_section admin_label=\"section\"][et_pb_row admin_label=\"row\"][et_pb_column type=\"4_4\"][et_pb_image admin_label=\"Imagen\" src=\"https:\/\/cio.umh.es\/files\/2020\/02\/OVAN-10_-10-TH-International-Seminar-on-Optimization-and-Variational-Analysis-5.png\" show_in_lightbox=\"off\" url_new_window=\"off\" use_overlay=\"off\" animation=\"left\" sticky=\"off\" align=\"left\" force_fullwidth=\"off\" always_center_on_mobile=\"on\" use_border_color=\"off\" border_color=\"#ffffff\" border_style=\"solid\"]<br \/>\n&nbsp;<br \/>\n[\/et_pb_image][et_pb_text admin_label=\"Texto\" background_layout=\"light\" text_orientation=\"center\" use_border_color=\"off\" border_color=\"#ffffff\" border_style=\"solid\" text_font_size=\"9\"]<\/p>\n<p style=\"text-align: center\">Organizing Committee: Mar\u00eda J. 
C\u00e1novas and Juan Parra, UMH<\/p>\n<p>[\/et_pb_text][et_pb_text admin_label=\"Texto\" background_layout=\"light\" text_orientation=\"left\" use_border_color=\"off\" border_color=\"#ffffff\" border_style=\"solid\" text_text_color=\"#000000\"]<\/p>\n<p style=\"text-align: left\">The CIO, one of the fourteen university research institutes in mathematics in Spain, will host the <strong>10th International Seminar on Optimization and Variational Analysis<\/strong> on 26 February. The event will run throughout the morning in room 0.1 of the Torretamarit Building at the UMH and is aimed at both researchers and students interested in the topic.<\/p>\n<p>The meeting will open at 11:00 with a <em>welcoming coffee<\/em>, followed by the first talk, <strong>Lipschitz Modulus of Linear and Convex Inequality Systems with the Hausdorff metric<\/strong>, given by <a href=\"https:\/\/cvnet.cpd.ua.es\/curriculum-breve\/es\/lopez-cerda-marco-antonio\/1624\">Marco A. L\u00f3pez<\/a> of the Universidad de Alicante. This talk analyzes the Lipschitz behavior of the feasible set mapping associated with linear and convex inequality systems in R<sup>n<\/sup>. To start with, we deal with the parameter space of linear (finite\/semi-infinite) systems identified with the corresponding sets of coefficient vectors, which are assumed to be closed subsets of R<sup>n+1<\/sup>. In this framework the size of perturbations is measured by means of the (extended) Hausdorff distance. A direct antecedent, extensively studied in the literature, comes from considering the parameter space of all linear systems with a fixed index set, T, where the Chebyshev (extended) distance is used to measure perturbations. 
In the talk we propose an appropriate indexation strategy which allows us to establish the equality of the Lipschitz moduli of the feasible set mappings in both parametric contexts, as well as to benefit from existing results in the Chebyshev setting by transferring them to the Hausdorff one. In a second stage, the possibility of directly perturbing the set of coefficient vectors of a linear system leads to new contributions on the Lipschitz behavior of convex systems via linearization techniques.<br \/>\n<span lang=\"EN-US\">Mat\u00edas Raja, of the Universidad de Murcia, will take over at 11:15 with the talk <strong>&#8216;Super weakly compact sets and their applications&#8217;<\/strong>. The aim of this talk is to introduce an interesting class of sets in Banach spaces lying between the norm compact and the weakly compact sets. The motivation for this class arises from the desire to extend the nice properties of uniformly convex spaces to a localised setting. Although most of the theory developed around super weak compactness is linear, together with Gilles Lancien we have addressed the nonlinear theory too.<\/span><br \/>\nAfter a short break, the event will resume at 12:40 with the talk &#8216;<strong>Optimality Conditions for Nonlinear Conic Programs via Squared Slack Variables<\/strong>&#8217;, by Masao Fukushima of Nanzan University, Japan. Nonlinear symmetric cone programs (NSCPs) constitute a general and important class of optimization problems that contains as special cases nonlinear semidefinite programs (NSDPs), nonlinear second-order cone programs (NSOCPs) and traditional nonlinear programs (NLPs). We consider reformulating an NSCP as an ordinary NLP by means of squared slack variables. It is clear that the reformulated NLP is equivalent to the original NSCP in terms of not only global but also local optimality. 
This, however, is not the case with regard to optimality conditions. We discuss the first-order, i.e., Karush-Kuhn-Tucker (KKT), conditions and, in particular, the second-order necessary as well as sufficient conditions for the NSCP and the reformulated NLP. Working with the reformulated NLP enables us to obtain the second-order optimality conditions for NSCPs in an easy manner, thereby bypassing a number of difficulties associated with the usual variational-analytic approach. We also mention the possibility of importing convergence results from nonlinear programming, which we illustrate by means of a simple augmented Lagrangian method for NSCPs.<br \/>\n<span style=\"color: #000000\">The event will be closed by <span lang=\"EN-US\">Francisco J. Arag\u00f3n<\/span> of the Universidad de Alicante, with the talk <strong>&#8216;The Boosted DC Algorithm for linearly constrained DC programming&#8217;<\/strong>. The Boosted Difference of Convex functions Algorithm (BDCA) has recently been introduced to accelerate the performance of the classical Difference of Convex functions Algorithm (DCA). This acceleration is achieved thanks to an extrapolation step, via a line search procedure, from the point computed by DCA. In addition, empirical results have shown that BDCA has better chances than the classical DCA of escaping from bad local optima toward solutions with a better objective value. In this talk we will show how to extend BDCA to solve a class of DC programs with linear constraints. We propose a new variant of BDCA and establish its global convergence to a critical point. Finally, we present some numerical experiments demonstrating that this new variant of BDCA outperforms DCA in both running time and objective value of the solutions obtained. This is a joint work with R. Campoy and P. T. 
Vuong<\/span><br \/>\n[\/et_pb_text][\/et_pb_column][\/et_pb_row][\/et_pb_section][:]<\/p>","protected":false},"excerpt":{"rendered":"<p>[:es][et_pb_section admin_label=\"section\"][et_pb_row admin_label=\"row\"][et_pb_column type=\"4_4\"][et_pb_image admin_label=\"Imagen\" src=\"https:\/\/cio.umh.es\/files\/2020\/02\/OVAN-10_-10-TH-International-Seminar-on-Optimization-and-Variational-Analysis-5.png\" show_in_lightbox=\"off\" url_new_window=\"off\" use_overlay=\"off\" animation=\"left\" sticky=\"off\" align=\"left\" force_fullwidth=\"off\" always_center_on_mobile=\"on\" use_border_color=\"off\" border_color=\"#ffffff\" border_style=\"solid\"]<br \/>\n&nbsp;<br \/>\n[\/et_pb_image][et_pb_text admin_label=\"Texto\" background_layout=\"light\" text_orientation=\"center\" use_border_color=\"off\" border_color=\"#ffffff\" border_style=\"solid\" text_font_size=\"9\"]<br \/>\nOrganizing Committee: Mar\u00eda J. 
C\u00e1novas and Juan Parra, UMH<br \/>\n[\/et_pb_text][et_pb_text admin_label=\"Texto\" background_layout=\"light\" text_orientation=\"left\" use_border_color=\"off\" border_color=\"#ffffff\" border_style=\"solid\" text_text_color=\"#000000\"]<br \/>\nThe CIO, one of the fourteen university research institutes in mathematics in Spain, will host next [&#8230;]<\/p>","protected":false},"author":6202,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"open","ping_status":"open","template":"","meta":{"_links_to":"","_links_to_target":""},"categories":[],"tags":[],"_links":{"self":[{"href":"https:\/\/cio.umh.es\/en\/wp-json\/wp\/v2\/pages\/15726"}],"collection":[{"href":"https:\/\/cio.umh.es\/en\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/cio.umh.es\/en\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/cio.umh.es\/en\/wp-json\/wp\/v2\/users\/6202"}],"replies":[{"embeddable":true,"href":"https:\/\/cio.umh.es\/en\/wp-json\/wp\/v2\/comments?post=15726"}],"version-history":[{"count":0,"href":"https:\/\/cio.umh.es\/en\/wp-json\/wp\/v2\/pages\/15726\/revisions"}],"wp:attachment":[{"href":"https:\/\/cio.umh.es\/en\/wp-json\/wp\/v2\/media?parent=15726"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/cio.umh.es\/en\/wp-json\/wp\/v2\/categories?post=15726"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/cio.umh.es\/en\/wp-json\/wp\/v2\/tags?post=15726"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}