CMS Winter Meeting 2024

Vancouver/Richmond, November 29 - December 2, 2024

Variational Analysis: Theory and Applications
Org: Heinz Bauschke (UBC Okanagan) and Xianfu Wang (UBC Okanagan)

AHMET ALACAOGLU, UBC Vancouver
Revisiting Inexact Fixed-Point Iterations for Min-Max Problems: Stochasticity and Structured Nonconvexity

In this talk, we revisit the analysis of inexact Halpern and Krasnosel'skii-Mann (KM) iterations for solving constrained and stochastic min-max problems. We relax the inexactness requirement on the computation of the resolvent in the stochastic Halpern iteration and modify the stochastic KM iteration to work with biased samples of the resolvent. We present the consequences of these results for solving constrained and stochastic convex-concave min-max problems, such as improved last-iterate convergence guarantees. Then, we apply our developments to solve constrained nonconvex-nonconcave min-max problems satisfying a cohypomonotonicity assumption. Within this class of problems, we show how to expand the limit of nonmonotonicity that can be handled by first-order methods. (Joint work with Donghwan Kim and Stephen J. Wright.)
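
For orientation, the two baseline schemes have a compact form: Halpern's iteration anchors every step at the initial point, while the KM iteration averages the current point with its image under the resolvent. Below is a minimal Python sketch of the exact (deterministic, noiseless) versions applied to the toy saddle problem $\min_x \max_y xy$; the function names, step size, and toy problem are illustrative assumptions, not the inexact stochastic schemes analyzed in the talk.

    import numpy as np

    def km_iteration(T, x0, steps=200, lam=0.5):
        # Krasnosel'skii-Mann: x_{k+1} = (1 - lam) x_k + lam T(x_k).
        x = x0
        for _ in range(steps):
            x = (1 - lam) * x + lam * T(x)
        return x

    def halpern_iteration(T, x0, steps=200):
        # Halpern: x_{k+1} = beta_k x_0 + (1 - beta_k) T(x_k), beta_k = 1/(k+2).
        x = x0
        for k in range(steps):
            beta = 1.0 / (k + 2)
            x = beta * x0 + (1 - beta) * T(x)
        return x

    def resolvent(z, step=1.0):
        # Resolvent (I + step * F)^{-1} of the saddle operator F(x, y) = (y, -x)
        # of min_x max_y xy; it is firmly nonexpansive with unique zero at 0.
        x, y = z
        d = 1.0 + step ** 2
        return np.array([(x - step * y) / d, (y + step * x) / d])

    z0 = np.array([1.0, -2.0])
    print(km_iteration(resolvent, z0))       # -> approx (0, 0), the saddle point
    print(halpern_iteration(resolvent, z0))  # -> approx (0, 0)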

YUAN GAO, University of British Columbia Okanagan
On a result by Baillon, Bruck, and Reich

It is well known that the iterates of an averaged nonexpansive mapping may converge only weakly to a fixed point. A celebrated result by Baillon, Bruck, and Reich from 1978 yields strong convergence in the presence of linearity. In this talk, we extend this result to allow for flexible relaxation parameters. Examples are also provided to illustrate the results.
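
In finite dimensions weak and strong convergence coincide, so the phenomenon itself is infinite-dimensional; still, the iteration in question is easy to write down. Here is a small Python sketch of the relaxed iteration with varying relaxation parameters, for a linear averaged map built from a rotation (all choices are illustrative assumptions, not the speaker's examples):

    import numpy as np

    # A linear averaged nonexpansive map T = (1 - a) Id + a R, where R is a
    # rotation by pi/3 (linear, nonexpansive, Fix R = {0}, hence Fix T = {0}).
    t, a = np.pi / 3, 0.5
    R = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
    T = (1 - a) * np.eye(2) + a * R

    # Relaxed iteration with flexible parameters:
    #   x_{k+1} = (1 - lam_k) x_k + lam_k T x_k.
    x = np.array([3.0, -1.0])
    for k in range(200):
        lam = 0.5 + 0.4 * np.sin(k)  # varying relaxation parameters in (0, 1)
        x = (1 - lam) * x + lam * (T @ x)

    print(x)  # -> approx (0, 0) = P_{Fix T} x_0, as in the linear strong-convergence result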

PHILIP LOEWEN, UBC Vancouver
Sensitivity Analysis for the Linear Quadratic Regulator

I will discuss some analytic and computational aspects of the classic discrete-time linear quadratic regulator, paying special attention to the sensitivity of the minimizers to the various ingredients in the nominal problem, and suggesting applications for the methods provided.
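
As background, the finite-horizon discrete-time LQR is solved by a backward Riccati recursion, and a naive way to probe sensitivity is to finite-difference the optimal feedback gains with respect to one problem ingredient. A minimal sketch under these assumptions (a generic illustration, not the speaker's analysis):

    import numpy as np

    def lqr_gains(A, B, Q, R, N):
        # Finite-horizon discrete-time LQR via the backward Riccati recursion:
        # minimize sum_{k<N} (x_k' Q x_k + u_k' R u_k) + x_N' Q x_N
        # subject to x_{k+1} = A x_k + B u_k; optimal feedback is u_k = -K_k x_k.
        P = Q.copy()
        gains = []
        for _ in range(N):
            K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
            P = Q + A.T @ P @ (A - B @ K)
            gains.append(K)
        return gains[::-1]  # gains in time order K_0, ..., K_{N-1}

    A = np.array([[1.0, 0.1], [0.0, 1.0]])
    B = np.array([[0.0], [0.1]])
    Q, R, eps = np.eye(2), np.array([[1.0]]), 1e-6

    # Naive sensitivity probe: finite-difference the initial gain K_0 with
    # respect to the (0, 1) entry of the dynamics matrix A.
    K0 = lqr_gains(A, B, Q, R, 50)[0]
    A2 = A.copy(); A2[0, 1] += eps
    dK0 = (lqr_gains(A2, B, Q, R, 50)[0] - K0) / eps
    print(K0, dK0)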

SHAMBHAVI SINGH, University of Waterloo
Analysis of Chambolle-Pock through the lens of duality

We extensively analyze the operator associated with the primal-dual hybrid gradient algorithm (also known as the Chambolle-Pock algorithm), which is used to solve the composite monotone inclusion problem for maximally monotone operators. Through the lens of a dual structure on the underlying space of the operator, we obtain several properties of the underlying solution sets. We also recover known results for the Douglas-Rachford algorithm. When the operators have additional structure, such as being paramonotone or being subdifferentials of proper lower semicontinuous convex functions, the solution sets simplify further.
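
For reference, the Chambolle-Pock (PDHG) updates for $\min_x f(x) + g(Kx)$ alternate a proximal step on $f$ with a proximal step on $g^*$ applied after an extrapolated primal step. A minimal Python sketch on a toy total-variation-style problem (the problem, step sizes, and names are illustrative assumptions):

    import numpy as np

    def chambolle_pock(K, b, tau=0.25, sigma=0.25, steps=500):
        # PDHG for min_x 0.5*||x - b||^2 + ||K x||_1, i.e. f = 0.5*||. - b||^2,
        # g = ||.||_1.  Here prox_{tau f}(z) = (z + tau*b)/(1 + tau), and g* is
        # the indicator of the unit inf-ball, so prox_{sigma g*} is a clip.
        m, n = K.shape
        x, y = np.zeros(n), np.zeros(m)
        for _ in range(steps):
            x_new = (x - tau * (K.T @ y) + tau * b) / (1 + tau)        # primal prox
            y = np.clip(y + sigma * (K @ (2 * x_new - x)), -1.0, 1.0)  # dual prox
            x = x_new
        return x

    K = np.array([[1.0, -1.0, 0.0], [0.0, 1.0, -1.0]])  # finite differences
    b = np.array([1.0, 0.0, -1.0])
    print(chambolle_pock(K, b))  # step sizes satisfy tau*sigma*||K||^2 <= 1 here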

XIANFU WANG, University of British Columbia Okanagan
On Bauschke-Bendit-Moursi modulus of averagedness and classifications of averaged nonexpansive operators

Averaged operators are important in convex analysis and optimization algorithms. We propose classifications of averaged operators, firmly nonexpansive operators, and proximity operators by using the Bauschke-Bendit-Moursi modulus of averagedness. We show that if a resolvent has modulus of averagedness less than $1/2$, then it is a bi-Lipschitz homeomorphism. Remarkably, the proximity operator of a convex function has its modulus of averagedness less than $1/2$ if and only if the function is Lipschitz smooth. (Joint work with Shuang Song.)
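
To make the quantity concrete: an operator $T$ is $\alpha$-averaged when $T = (1-\alpha)\,\mathrm{Id} + \alpha N$ with $N$ nonexpansive, and this sketch assumes the modulus of averagedness is the smallest such $\alpha$. The following illustrative Python estimate recovers a modulus below $1/2$ for the proximity operator of a Lipschitz smooth quadratic, and $1/2$ for the nonsmooth $|x|$:

    import numpy as np

    def modulus_numeric(T, xs, tol=1e-9):
        # Smallest a in (0,1] with T = (1-a) Id + a N, N nonexpansive; equivalently
        # |T x - T y - (1-a)(x - y)| <= a |x - y| on sample pairs.  Bisection on a
        # is valid because a-averaged implies a'-averaged for a' >= a.
        pairs = [(x, y) for x in xs for y in xs if x != y]
        lo, hi = 1e-6, 1.0
        for _ in range(60):
            a = 0.5 * (lo + hi)
            ok = all(abs(T(x) - T(y) - (1 - a) * (x - y)) <= a * abs(x - y) + tol
                     for x, y in pairs)
            lo, hi = (lo, a) if ok else (a, hi)
        return hi

    xs = np.linspace(-3.0, 3.0, 41)
    L = 4.0
    prox_quad = lambda x: x / (1 + L)                     # prox of the L-smooth (L/2) x^2
    prox_abs = lambda x: np.sign(x) * max(abs(x) - 1, 0)  # prox of |x|, not smooth

    print(modulus_numeric(prox_quad, xs))  # approx L / (2 (1 + L)) = 0.4 < 1/2
    print(modulus_numeric(prox_abs, xs))   # approx 1/2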

ZIYUAN WANG, UBC Okanagan
Level proximal subdifferential, variational convexity, and beyond

Discovered by Rockafellar in 2021, the level proximal subdifferential has the pleasant feature that every proximal operator is the resolvent of a level proximal subdifferential operator. In this talk, we present a systematic study of the level proximal subdifferential, revealing its remarkable connections to the classic Fenchel subdifferential in convex analysis and to proximal hulls of proper, lsc, and prox-bounded functions. An interpretation of our results in terms of the $\Phi$-subdifferential in optimal transport will be discussed. Furthermore, we establish a full equivalence between variationally convex functions, locally (firmly) nonexpansive proximal operators, and relatively (maximally) monotone level proximal subdifferential operators, which unifies and extends recent advances by Rockafellar in 2021 and by Khanh, Mordukhovich, and Phat in 2023. A pointwise version of Lipschitz smoothness will be investigated through the lens of the level proximal subdifferential. The talk is based on joint work with Honglin Luo, Xianfu Wang, and Xinmin Yang, and with Andreas Themelis and Xianfu Wang.
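
The resolvent feature mentioned above generalizes the classical convex identity $\mathrm{prox}_f = (\mathrm{Id} + \partial f)^{-1}$. As a sanity check of that classical case only (the level proximal subdifferential itself is not reproduced here), a tiny Python verification for $f = |\cdot|$, whose prox is soft thresholding:

    import numpy as np

    # For proper lsc convex f, prox_f = (Id + \partial f)^{-1}; numerically this
    # means p = prox_f(x) satisfies x - p in \partial f(p).  Check it for f = |.|.
    prox = lambda x: np.sign(x) * max(abs(x) - 1.0, 0.0)  # soft thresholding

    def in_subdiff_abs(p, g, tol=1e-12):
        # g in \partial|.|(p): g = sign(p) if p != 0, else g in [-1, 1].
        return abs(g - np.sign(p)) <= tol if p != 0 else abs(g) <= 1 + tol

    for x in np.linspace(-3.0, 3.0, 13):
        p = prox(x)
        assert in_subdiff_abs(p, x - p)  # x - p in \partial f(p)
    print("prox_f = (Id + subdifferential of f)^{-1} verified on samples")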

HENRY WOLKOWICZ, University of Waterloo
Regularized Nonsmooth Newton Algorithms for Best Approximation with Applications

We consider the problem of finding the best approximation point from a polyhedral set, and its applications, in particular to solving large-scale linear programs. The classical best approximation problem admits many solution techniques as well as applications. We study a regularized nonsmooth Newton-type solution method in which the generalized Jacobian may be singular, and we compare its computational performance to that of the classical projection method of Halpern-Lions-Wittmann-Bauschke (HLWB).

We observe empirically that the regularized nonsmooth method significantly outperforms the HLWB method. However, the HLWB method has a convergence guarantee, while the nonsmooth method is not monotonic and does not guarantee convergence, due in part to the singularity of the generalized Jacobian.

Our application to solving large-scale linear programs uses a parametrized best approximation problem. This leads to a finitely converging stepping-stone external path-following algorithm. Other applications include finding triangles from branch and bound methods, and generalized constrained linear least squares. We include scaling methods and sensitivity analysis to improve efficiency. (Joint work with Y. Censor, W. Moursi, and T. Weames.)
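
For comparison purposes, the HLWB scheme itself is short to state: cyclic projections onto the constraints, each step pulled toward the anchor point being approximated. A minimal Python sketch for a toy polyhedron (the steering sequence, problem data, and names are illustrative assumptions; the regularized nonsmooth Newton solver is not reproduced here):

    import numpy as np

    def project_halfspace(x, a_i, b_i):
        # Projection onto the halfspace {x : <a_i, x> <= b_i}.
        viol = a_i @ x - b_i
        return x if viol <= 0 else x - (viol / (a_i @ a_i)) * a_i

    def hlwb(a, A, b, sweeps=2000):
        # HLWB-type iteration for the best approximation of the point `a` from
        # the polyhedron {x : A x <= b}: cyclic halfspace projections, each step
        # pulled toward the anchor `a` by lam_k -> 0 with sum lam_k = infinity.
        x, m, k = a.copy(), A.shape[0], 0
        for _ in range(sweeps):
            for i in range(m):
                lam = 1.0 / (k + 2)
                x = lam * a + (1 - lam) * project_halfspace(x, A[i], b[i])
                k += 1
        return x

    A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])  # unit box
    b = np.ones(4)
    print(hlwb(np.array([2.0, 0.3]), A, b))  # -> approx (1, 0.3), the projection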


© Canadian Mathematical Society : http://www.smc.math.ca/