Variational Analysis: Theory and Applications
Org:
Heinz Bauschke (UBC),
Walaa Moursi (Waterloo),
Shawn Wang (UBC) and
Henry Wolkowicz (Waterloo)
[PDF]
 SALIHAH ALWADANI, UBC, Okanagan
Resolvents and Yosida approximations of displacement mappings of isometries [PDF]

Maximally monotone operators are fundamental objects in modern optimization. We study in detail a nice class of monotone operators: displacement mappings of isometries of finite order. We derive explicit formulae for resolvents, Yosida approximations, and (set-valued and Moore-Penrose) inverses. We illustrate our results by considering rational rotators and circular shift operators.
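For a first concrete instance of this class, one can take the displacement mapping $A = \mathrm{Id} - R$ of a planar rotation $R$ of finite order and compute its resolvent and Yosida approximation directly. The sketch below is an illustrative numerical check under that choice, not the talk's closed-form formulae.

```python
import numpy as np

# Displacement mapping of an isometry R: the linear, monotone operator A = Id - R.
# Illustrative choice: R is a rotation by 2*pi/n, an isometry of finite order n.
n = 4
theta = 2 * np.pi / n
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
I = np.eye(2)
A = I - R

# Resolvent J_A = (Id + A)^{-1} and Yosida approximation A_lam = (Id - J_{lam A}) / lam.
J = np.linalg.inv(I + A)
lam = 0.5
J_lam = np.linalg.inv(I + lam * A)
A_lam = (I - J_lam) / lam
```

The resolvent of a maximally monotone operator is firmly nonexpansive (so its operator norm is at most 1), and the Yosida approximation is again monotone; both facts can be verified numerically on this example.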
 SEDI BARTZ, University of Massachusetts Lowell
Open questions in multi-marginal monotonicity and convex analysis [PDF]

In the two-marginal case, aspects of monotonicity and convex analysis underlie optimal transport theory. Similarly to the two-marginal case, multi-marginal monotonicity and convex analysis underlie multi-marginal optimal transport theory, yet can be studied as an independent topic. We discuss basic extensions of the classical theory and point out several open questions regarding the construction of multi-marginal convex antiderivatives, a multi-marginal Minty-type characterization of maximal monotonicity, and a multi-marginal extension of the maximal monotonicity of the convex subdifferential. We illustrate our discussion with several examples.
 MINH BUI, North Carolina State University
Multivariate Monotone Inclusions in Saddle Form II: Applications [PDF]

We discuss applications of the general model and algorithm presented in Part I of this talk. In particular, we present asynchronous block-iterative algorithms to solve highly structured problems in multivariate convex optimization, variational inequalities, traffic equilibrium, and evolution inclusions.
Joint work with P. L. Combettes.
 PATRICK COMBETTES, North Carolina State University
Multivariate Monotone Inclusions in Saddle Form I: Theory and Algorithms [PDF]

We propose a novel approach to monotone operator splitting based on the notion of a saddle operator. Under investigation is a highly structured multivariate monotone inclusion problem involving a mix of set-valued, cocoercive, and Lipschitzian monotone operators, as well as various monotonicity-preserving operations among them. This model encompasses most formulations found in the literature. The analysis leads to an algorithm of unprecedented flexibility, which achieves full splitting, exploits the specific attributes of each operator, is asynchronous, and requires activating only blocks of operators at each iteration, as opposed to activating all of them.
Joint work with M. N. Bui.
 KIMON FOUNTOULAKIS, Waterloo
 TIM HOHEISEL, McGill University
From perspective maps to epigraphical projections [PDF]

The projection onto the epigraph or a level set of a closed, proper, convex function can be computed by finding a root of a scalar function that involves the proximal operator as a function of the proximal parameter. To study the variational-analytic properties of this function, we consider general optimization problems that are (partial) infimal projections of the function in question and the perspective map of a kernel. When the latter is the squared Euclidean norm, we recover the proximal map as the solution map, and extract properties such as local Lipschitz continuity, directional differentiability, and semismoothness under suitable assumptions. Based on this, we establish an SC$^1$ optimization framework for computing epigraphical and level-set projections, which is competitive with methods tailored to specific instances such as the projection onto the $\ell_1$ unit ball.
This is joint work with Michael P. Friedlander (UBC) and Ariel Goodwin (McGill).
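As a concrete instance of the root-finding scheme described above, projecting onto a level set of $\|\cdot\|_1$ (the $\ell_1$ ball) amounts to locating the proximal parameter at which the norm of the soft-thresholded point meets the radius. The sketch below uses plain bisection on that scalar equation rather than the SC$^1$ (semismooth Newton) machinery of the talk; all names are illustrative.

```python
import numpy as np

def soft(x, lam):
    # Proximal mapping of lam*||.||_1: componentwise soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def project_l1_ball(x, tau, tol=1e-10):
    # Projection onto {y : ||y||_1 <= tau}, by finding a root of the
    # scalar function phi(lam) = ||soft(x, lam)||_1 - tau, which is
    # nonincreasing in the proximal parameter lam.
    if np.abs(x).sum() <= tau:
        return x.copy()              # already feasible
    lo, hi = 0.0, np.abs(x).max()    # phi(lo) > 0 >= phi(hi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if np.abs(soft(x, mid)).sum() > tau:
            lo = mid
        else:
            hi = mid
    return soft(x, 0.5 * (lo + hi))
```

For instance, projecting $x = (3, -2, 1)$ onto the ball of radius $2$ yields $(1.5, -0.5, 0)$.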
 HAO HU, Waterloo
 BORIS MORDUKHOVICH, Wayne State University
A Generalized Newton Method for Subgradient Systems [PDF]

This talk presents a new Newton-type algorithm to solve subdifferential inclusions defined by subgradients of extended-real-valued prox-regular functions. The proposed algorithm is formulated in terms of the second-order subdifferential of such functions, which enjoys extensive calculus rules and can be efficiently computed for broad classes of extended-real-valued functions. Based on this and on metric regularity and subregularity properties of subgradient mappings, we establish verifiable conditions ensuring well-posedness of the proposed algorithm and its local superlinear convergence. The obtained results are also new for the class of equations defined by continuously differentiable functions with Lipschitzian derivatives, which is the underlying case of our consideration. The developed algorithm for prox-regular functions is formulated in terms of the associated proximal mappings and Moreau envelopes. Besides numerous illustrative examples and comparisons with known algorithms, we present applications of the proposed algorithm to the practically important class of Lasso problems arising in statistics and machine learning.
Based on joint work with P. D. Khanh (University of Chile) and V. T. Phat (Wayne State University).
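The talk's algorithm is stated via second-order subdifferentials; as a simpler generic illustration of a Newton-type method on a subgradient system, the sketch below applies a semismooth Newton iteration to the proximal residual of the Lasso problem $\min_x \tfrac12\|Ax-b\|^2 + \lambda\|x\|_1$. This is a standard construction, not the authors' method, and all names are illustrative.

```python
import numpy as np

def soft(z, t):
    # Proximal mapping of t*||.||_1 (componentwise soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ssn(A, b, lam, max_iter=50, tol=1e-12):
    # Semismooth Newton on the proximal residual
    #   F(x) = x - soft(x - t*A^T(Ax - b), t*lam),
    # whose zeros are exactly the Lasso solutions (for any step t > 0).
    Q = A.T @ A
    t = 1.0 / np.linalg.norm(Q, 2)
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(max_iter):
        z = x - t * (Q @ x - A.T @ b)
        F = x - soft(z, t * lam)
        if np.linalg.norm(F) < tol:
            break
        # An element of the generalized Jacobian of F at x:
        # d_i = 1 where the threshold is inactive (|z_i| > t*lam), else 0.
        d = (np.abs(z) > t * lam).astype(float)
        Jac = np.eye(n) - d[:, None] * (np.eye(n) - t * Q)
        x = x + np.linalg.solve(Jac, -F)
    return x
```

Since $F$ is piecewise affine here, the iteration typically identifies the correct active set and terminates in very few steps, which mirrors the local superlinear behavior discussed in the talk.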
 WALAA MOURSI, Waterloo
 HRISTO SENDOV, The University of Western Ontario
A unified approach to operator monotone functions [PDF]

The notion of operator monotonicity dates back to a work by L\"owner in 1934.
A map $F :S^n \to S^m$ is called {\it operator monotone}, if
$A \succeq B \mbox{ implies } F(A) \succeq F(B)$.
(Here, $S^n$ is the space of symmetric matrices with the semidefinite partial order $\succeq$.)
Often, the function $F$ is defined in terms of an underlying simpler function $f$. Of main interest
is to find the properties of $f$ that characterize operator monotonicity of $F$. In that case, it is said that $f$ is also operator monotone.
Classical examples are the L\"owner operators and the spectral (scalar-valued isotropic) functions. Operator monotonicity for these two classes of functions is characterized in seemingly very different ways.
This talk extends the notion of operator monotonicity to symmetric functions $f$ of $k$ arguments. The latter are used to define {\it (generated) $k$-isotropic maps} $F : S^n \to S^{n \choose k}$ for any $n \ge k$. Necessary and sufficient conditions are given
for $f$ to generate an operator monotone $k$-isotropic map $F$. When $k=1$, the $k$-isotropic map becomes a L\"owner operator, and when $k=n$ it becomes a spectral function. This allows us to reconcile and explain the differences between the conditions for monotonicity of the L\"owner operators and the spectral functions.
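The subtlety of operator monotonicity can be seen numerically: $f(t)=\sqrt{t}$ is operator monotone, while $f(t)=t^2$ is not, even though both are monotone as scalar functions. The sketch below checks this on a standard textbook pair $A \succeq B$; the code and helper names are illustrative.

```python
import numpy as np

def msqrt(M):
    # Matrix square root of a symmetric PSD matrix via eigendecomposition.
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.sqrt(np.maximum(w, 0.0))) @ V.T

def is_psd(M, tol=1e-10):
    # Check positive semidefiniteness of (the symmetric part of) M.
    return np.linalg.eigvalsh((M + M.T) / 2).min() >= -tol

# A standard pair with A >= B in the semidefinite order:
A = np.array([[2.0, 1.0], [1.0, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 0.0]])
assert is_psd(A - B)

# f(t) = sqrt(t) is operator monotone (Loewner), so sqrt(A) >= sqrt(B):
print(is_psd(msqrt(A) - msqrt(B)))   # True
# ...whereas f(t) = t^2 is not: A^2 - B^2 fails to be PSD for this pair.
print(is_psd(A @ A - B @ B))         # False
```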
 MOHAMED TAWHID, Thompson Rivers University
Improved Salp Swarm Optimization Algorithm for Data Clustering [PDF]

In this work, an improved Salp Swarm Optimization algorithm is proposed for data clustering. Our proposed algorithm utilizes the crossover operator to obtain an improved version of the existing Salp Swarm Optimization algorithm. The performance of the suggested algorithm is tested by comparing it with the standard Salp Swarm Optimization algorithm and other existing algorithms in the literature. Our algorithm outperforms the other algorithms on the data clustering problem in terms of computational time and accuracy.
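The abstract does not specify the crossover variant used; purely for orientation, the sketch below shows a generic salp swarm loop augmented with an assumed uniform crossover against the food source (greedily accepted), minimizing the sphere function rather than a clustering objective. All details beyond the standard leader/follower updates are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    # Simple test objective to minimize (optimum 0 at the origin).
    return float(np.sum(x ** 2))

def ssa_crossover(f, dim=5, pop=20, iters=200, lb=-5.0, ub=5.0, pc=0.5):
    # Standard salp swarm update: the leader explores around the food
    # source, followers chain behind by averaging; plus an assumed
    # uniform-crossover step with the food source, accepted if better.
    X = rng.uniform(lb, ub, (pop, dim))
    food = X[np.argmin([f(x) for x in X])].copy()
    for t in range(iters):
        c1 = 2.0 * np.exp(-(4.0 * (t + 1) / iters) ** 2)  # shrinking radius
        for i in range(pop):
            if i == 0:  # leader
                step = c1 * ((ub - lb) * rng.random(dim) + lb)
                X[i] = np.where(rng.random(dim) < 0.5, food + step, food - step)
            else:       # follower: average with predecessor
                X[i] = (X[i] + X[i - 1]) / 2.0
            X[i] = np.clip(X[i], lb, ub)
            if rng.random() < pc:   # assumed crossover with the food source
                trial = np.where(rng.random(dim) < 0.5, X[i], food)
                if f(trial) < f(X[i]):
                    X[i] = trial
            if f(X[i]) < f(food):   # food source only ever improves
                food = X[i].copy()
    return food, f(food)

best, best_val = ssa_crossover(sphere)
```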
 LEVENT TUNCEL, Waterloo
 STEVE VAVASIS, Waterloo
 JANE YE, University of Victoria
Second-order optimality conditions for nonconvex set-constrained optimization problems [PDF]

In this paper we study second-order optimality conditions for nonconvex set-constrained optimization problems. For a convex set-constrained optimization problem, it is well-known that second-order optimality conditions involve the support function of the second-order tangent set.
In this paper we propose two approaches for establishing second-order optimality conditions for the nonconvex case. In the first approach we extend the concept of the support function so that it is applicable to general nonconvex set-constrained problems, whereas in the second approach we introduce the notion of the directional regular tangent cone and apply classical results of convex duality theory. Besides the second-order optimality conditions themselves, the novelty of our approach lies in the systematic introduction and use of directional versions of well-known concepts from variational analysis.
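For context, the convex-case condition referred to above is classically stated as follows (a standard formulation from perturbation analysis of optimization problems, not a result of this paper): for $\min_x f(x)$ subject to $G(x) \in C$ with $C$ closed and convex, the second-order necessary condition at a local minimizer $\bar x$ with Lagrange multiplier $\lambda$ reads
\[
\nabla^2_{xx} L(\bar x, \lambda)(d, d) \;-\; \sigma\big(\lambda \mid T^2_C(G(\bar x), \nabla G(\bar x) d)\big) \;\ge\; 0
\quad \text{for every critical direction } d,
\]
where $\sigma(\cdot \mid S)$ denotes the support function of $S$ and $T^2_C$ the second-order tangent set. It is this support-function term that the two approaches above adapt to nonconvex $C$.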
© Canadian Mathematical Society