CMS Winter Meeting 2019

Toronto, December 6 - 9, 2019


Variational Analysis: theory and applications
Org: Heinz Bauschke (UBC), Hristo Sendov (Western) and Shawn Wang (UBC)
[PDF]

HEINZ BAUSCHKE, UBC Kelowna
On the minimal displacement vector and a recent question by Simons on the Gossez operator  [PDF]

This talk consists of two parts.

After reviewing relevant monotone operator theory, I will first report on recent progress on the minimal displacement vector for compositions and convex combinations of nonexpansive mappings.
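For context, the minimal displacement vector of a nonexpansive mapping $T$ on a Hilbert space is usually defined as the least-norm element of the closed convex set $\overline{\operatorname{ran}}(\operatorname{Id}-T)$; the notation below is a standard formulation recalled here, not taken from the talk:

$$v_T := P_{\overline{\operatorname{ran}}(\operatorname{Id}-T)}(0),$$

so $v_T = 0$ exactly when $T$ has approximate fixed points, and for a composition or convex combination of nonexpansive mappings one asks how its minimal displacement vector relates to those of the individual mappings.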

The second part provides a resolution (and more!) to Stephen Simons's problem on the nonconvexity of the range of the sum of the Gossez operator and multiples of the duality mapping.

Based on recent joint works with Walaa Moursi (University of Waterloo) and Shawn Wang (UBC Kelowna).

RUBEN CAMPOY GARCIA, University of Massachusetts Lowell
Computing the resolvent of the sum of maximally monotone operators with the averaged alternating modified reflections algorithm  [PDF]

The averaged alternating modified reflections algorithm is a projection method originally designed for finding the closest point in the intersection of closed and convex sets in a Hilbert space. In this talk, we show how the scheme can be generalized so that it can deal with monotone operators. This gives rise to a new strongly convergent splitting algorithm for computing the resolvent of a sum of maximally monotone operators.
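To give a feel for the underlying fixed-point scheme, the sketch below runs an averaged alternating modified reflections iteration of the form $x_{k+1} = (1-\lambda)x_k + \lambda\,(2\beta P_B - \operatorname{Id})(2\beta P_A - \operatorname{Id})\,x_k$, with $\beta\in(0,1)$ and $\lambda\in(0,1]$, for two lines in $\mathbb{R}^2$; the sets, parameters and reference point are illustrative choices made here, and this is not the authors' implementation or their generalization to monotone operators.

```python
import numpy as np

# Averaged alternating modified reflections (AAMR) sketch for two lines in R^2.
# A = {(t, t)} and B = {(t, 1)}; their intersection is the single point (1, 1).
# beta and lam are illustrative parameter choices.

def P_A(x):                       # projection onto the line y = x
    m = (x[0] + x[1]) / 2.0
    return np.array([m, m])

def P_B(x):                       # projection onto the line y = 1
    return np.array([x[0], 1.0])

def aamr_step(x, beta, lam):
    rA = 2.0 * beta * P_A(x) - x          # modified reflection in A
    rB = 2.0 * beta * P_B(rA) - rA        # modified reflection in B
    return (1.0 - lam) * x + lam * rB     # averaging step

beta, lam = 0.9, 0.75
x = np.zeros(2)                   # reference point taken to be the origin
for _ in range(500):
    x = aamr_step(x, beta, lam)

# Under a suitable constraint qualification, projecting the fixed point onto A
# recovers the point of A ∩ B closest to the reference point, here (1, 1).
print(P_A(x))
```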

Joint work with Francisco J. Aragón Artacho (University of Alicante, Spain).

HAO HU, University of Waterloo
Facial reduction for symmetry reduced semidefinite programs  [PDF]

We consider both facial and symmetry reduction techniques for semidefinite programming, SDP. We show that the two together fit surprisingly well with an alternating direction method of multipliers, ADMM. The combination of facial and symmetry reduction leads to a significant improvement in both numerical stability and running time for both the ADMM and interior point approaches. We test our method on various relaxations of hard combinatorial problems. This is joint work with Renata Sotirov and Henry Wolkowicz.
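For readers who have not seen ADMM written out, the generic scaled-form updates for $\min_{x,z} f(x) + g(z)$ subject to $x = z$ are recalled below; the particular splitting used for the facially and symmetry reduced SDP is specific to the paper and not reproduced here:

$$x^{k+1} = \operatorname*{argmin}_x\; f(x) + \tfrac{\rho}{2}\|x - z^k + u^k\|^2, \qquad z^{k+1} = \operatorname*{argmin}_z\; g(z) + \tfrac{\rho}{2}\|x^{k+1} - z + u^k\|^2, \qquad u^{k+1} = u^k + x^{k+1} - z^{k+1}.$$

In many SDP splittings one block carries the affine constraints and the other the (facially reduced) semidefinite cone constraint, so each subproblem reduces to an explicit projection.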

HUI OUYANG, University of British Columbia, Okanagan
On the convergence of circumcentered isometry methods  [PDF]

This talk is mainly based on the recent work on circumcentered isometry methods by Bauschke, Wang and Ouyang. The circumcentered isometry method is a generalization of the circumcentered Douglas--Rachford method recently introduced by Behling, Bello Cruz and Santos to accelerate the Douglas--Rachford method.

In this talk, we first present the properness of the circumcenter mapping induced by isometries, which makes the circumcentered isometry method well-defined. Then, by the demiclosedness principle for the circumcenter mapping, we show weak convergence results for circumcentered isometry methods, which include the Douglas--Rachford method and circumcentered reflection methods as special instances. We also provide sufficient conditions for the linear convergence of circumcentered isometry methods. Finally, we present numerical illustrations of the performance of four circumcentered reflection methods, the shadow Douglas--Rachford method and the method of alternating projections for finding the best approximation of a point onto the intersection of linear subspaces.
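To make the basic circumcenter step concrete, the sketch below computes the circumcenter of a finite set of points, that is, the point of their affine hull equidistant from all of them, and applies the circumcentered reflection step $x \mapsto \operatorname{circ}\{x,\, R_A x,\, R_B R_A x\}$ for two lines through the origin in $\mathbb{R}^3$; the example, parameters and helper names are illustrative choices made here rather than code from the talk.

```python
import numpy as np

def circumcenter(points):
    """Point in the affine hull of `points` that is equidistant from all of them."""
    p0 = points[0]
    V = np.stack([p - p0 for p in points[1:]], axis=1)    # spanning directions
    rhs = 0.5 * np.sum(V * V, axis=0)                     # 0.5 * ||p_i - p0||^2
    alpha = np.linalg.lstsq(V.T @ V, rhs, rcond=None)[0]  # Gram system, rank-safe
    return p0 + V @ alpha

def reflect(x, B):
    """Reflection through the linear subspace spanned by the orthonormal columns of B."""
    return 2.0 * B @ (B.T @ x) - x

# Two lines (one-dimensional subspaces) in R^3, given by orthonormal bases.
A = np.array([[1.0], [0.0], [0.0]])
B = np.array([[1.0], [1.0], [0.0]]) / np.sqrt(2.0)

x = np.array([0.3, -0.7, 2.0])
for _ in range(30):
    # circumcentered reflection step: circumcenter of {x, R_A x, R_B R_A x}
    x = circumcenter([x, reflect(x, A), reflect(reflect(x, A), B)])

print(x)   # approaches A ∩ B, which here is the origin
```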

JAVIER PENA, Carnegie Mellon University
The condition number of a function relative to a set  [PDF]

The condition number of a differentiable convex function, namely the ratio of its smoothness to strong convexity constants, is closely tied to fundamental properties of the function. In particular, the condition number of a quadratic convex function is the square of the aspect ratio of a canonical ellipsoid associated to the function. Furthermore, the condition number of a function bounds the linear rate of convergence of the gradient descent algorithm for unconstrained convex minimization.
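In the standard notation (recalled here for context), if $f$ is $\mu$-strongly convex with an $L$-Lipschitz gradient, then

$$\kappa(f) := \frac{L}{\mu}, \qquad \kappa\!\left(\tfrac{1}{2}x^\top Q x\right) = \frac{\lambda_{\max}(Q)}{\lambda_{\min}(Q)} \ \ \text{for } Q \succ 0,$$

and the level sets of the quadratic are ellipsoids whose aspect ratio is $\sqrt{\lambda_{\max}(Q)/\lambda_{\min}(Q)}$; gradient descent with a suitable constant step size contracts the distance to the minimizer by roughly a factor of $1 - 1/\kappa$ per iteration.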

We propose a condition number of a differentiable convex function relative to a reference set and distance function pair. This relative condition number is defined as the ratio of a relative smoothness constant to a relative strong convexity constant. We show that the relative condition number extends the main properties of the traditional condition number, both in terms of its geometric insight and in terms of its role in characterizing the linear convergence of first-order methods for constrained convex minimization.

HRISTO SENDOV, The University of Western Ontario
A unified approach to spectral and isotropic functions  [PDF]

We propose and investigate a family of maps from the space of $n \times n$ symmetric matrices, $S^n$, into the space $S^{{n \choose k}}$ for any $k=1,\ldots, n$ which are invariant under the conjugate action of the orthogonal group $O^n$. This family, called $k$-isotropic functions, not only generalizes all known types of maps with a similar invariance property, such as the spectral, isotropic, primary matrix functions, multiplicative compound, and additive compound matrices on $S^n$, but also contains many previously unknown and potentially interesting subclasses of functions.
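As an editorial point of reference for the invariance in question: for $k = 1$ it reduces to the familiar isotropy property $F(U^\top A U) = U^\top F(A)\, U$ for all $U \in O^n$, while the $k$-th multiplicative compound $A \mapsto C_k(A)$ illustrates the higher-order pattern, since $C_k(AB) = C_k(A)\, C_k(B)$ gives $C_k(U^\top A U) = C_k(U)^\top C_k(A)\, C_k(U)$ with $C_k(U)$ orthogonal and $C_k(A) \in S^{\binom{n}{k}}$; the precise invariance defining $k$-isotropic functions is as formulated by the authors.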

We give necessary and sufficient conditions for these maps to be $r$-times differentiable and exhibit a recursive formula for the $r$-th derivative. Moreover, we provide a full description of the linear $k$-isotropic functions, which yields a generalization of Hooke's law in classical linear elasticity. Finally, we exhibit a duality, motivated by the Hodge star operator, between different classes of $k$-isotropic functions.

This is a joint work with Mehdi S. Mousavi.

LEVENT TUNCEL, University of Waterloo
Primal-dual algorithms for hyperbolic programming problems  [PDF]

We will present new primal-dual algorithms for hyperbolic programming problems. Our algorithms have many desirable properties, including polynomial-time iteration-complexity for computing an approximate solution (for well-posed instances). In the special case of dense SDPs (as well as for symmetric cone programming problems with self-scaled barriers), our algorithms specialize to the Nesterov--Todd algorithm. However, for certain sparse SDPs, our algorithms are new and have additional desirable properties.
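As a standard point of reference (not specific to the talk): a hyperbolic programming problem minimizes a linear function over an affine slice of the hyperbolicity cone of a hyperbolic polynomial, and taking $p(X) = \det(X)$ on $S^n$, which is hyperbolic in the direction of the identity since $t \mapsto \det(X - tI)$ has only real roots, gives the positive semidefinite cone as the closed hyperbolicity cone; this is how SDP arises as the special case mentioned above.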

The talk is based on joint work with Lieven Vandenberghe.

HENRY WOLKOWICZ, University of Waterloo
Hard Combinatorial Problems, Doubly Nonnegative Relaxations, Facial Reduction, and Alternating Direction Method of Multipliers  [PDF]

Semidefinite programming, SDP, relaxations have proven to be extremely successful, both in theory and practice, for many hard combinatorial problems. We show that the Slater constraint qualification, strict feasibility, fails for many of these problems. Rather than letting this result in theoretical and numerical difficulties, we show how to use facial reduction, FR, to regularize while reducing the dimension of the problem. We see that FR fits perfectly with an alternating direction method of multipliers, ADMM. We show empirically that solving these relaxations solves many of the original hard combinatorial problems to optimality.
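A minimal illustration of the failure of strict feasibility and of the facial reduction substitution (an example constructed here, not taken from the talk): if the constraints of an SDP relaxation force every feasible $X \succeq 0$ to satisfy $X_{11} = 0$, then the entire first row and column of $X$ vanish, so no feasible $X$ is positive definite and Slater's condition fails; facial reduction then replaces the variable by

$$X = V R V^\top, \qquad R \succeq 0,$$

where the columns of $V$ span the smallest face of the semidefinite cone containing the feasible set (here, the matrices with zero first row and column), and the problem is solved over the smaller matrix $R$.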

YAOLIANG YU, University of Waterloo
Least Squares Estimation of Weakly Convex Functions  [PDF]

Function estimation under shape restrictions, such as convexity, has many practical applications and has drawn a lot of recent interest. In this work we argue that convexity, as a global property, is too strict and sensitive to outliers. Instead, we propose to use weakly convex functions as a simple alternative to quantify “approximate convexity”, a notion that is perhaps more relevant in practice. We prove that, unlike convex functions, weakly convex functions can exactly interpolate any finite dataset, and they are universal approximators. Through regularizing the modulus of convexity, we show that weakly convex functions can be efficiently estimated, both statistically and algorithmically, requiring minimal modifications to existing algorithms and theory for estimating convex functions. Our numerical experiments confirm the class of weakly convex functions as another competitive alternative for nonparametric estimation.
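For concreteness, one standard way to formalize the notion (the paper's exact parameterization may differ): a function $f$ is $\rho$-weakly convex if

$$x \mapsto f(x) + \tfrac{\rho}{2}\|x\|^2 \ \text{ is convex},$$

so $\rho = 0$ recovers ordinary convexity and larger $\rho$ permits larger, but controlled, violations of the convexity inequality; regularizing this modulus is what trades off flexibility against the shape restriction.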


© Société mathématique du Canada : http://www.smc.math.ca/