CMS Summer Meeting 2025

Quebec City, June 6 - 9, 2025


Derivative-free optimization and simulation-based optimization
Org: Kwassi Joseph Dzahini (Argonne National Laboratory) and Gabriel Jarry-Bolduc (Mount Royal University)

PIERRE-YVES BOUCHET, Polytechnique Montréal
Convergence towards a local minimum by algorithms with a covering step

This talk introduces a new algorithmic step that can be fitted into virtually any optimization algorithm to strengthen its convergence analysis. By design, this so-called covering step ensures that all accumulation points generated by the algorithm are local solutions to the optimization problem. This result holds even for a discontinuous objective function, under a mild assumption that is discussed in detail. A practical construction scheme for the covering step, which works at low additional cost per iteration, is also provided.

The content of this talk was recently published in Optimization Letters 19, 211--231 (2025), https://doi.org/10.1007/s11590-024-02165-2.
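To make the idea concrete, here is a toy sketch of a direct-search loop augmented with a covering-style step. This is an illustrative, hypothetical construction, not the scheme from the talk or the paper: the covering step simply samples trial points in a ball around the incumbent so that, as the radius shrinks, a full neighborhood of each accumulation point gets explored.

```python
import random

def covering_step(f, x, fx, radius, num_samples=16):
    """Illustrative covering step (hypothetical sketch, not the paper's
    construction): sample trial points in a ball around the incumbent so
    that a neighborhood of any accumulation point gets explored."""
    for _ in range(num_samples):
        y = [xi + random.uniform(-radius, radius) for xi in x]
        fy = f(y)
        if fy < fx:
            return y, fy
    return x, fx

def direct_search_with_covering(f, x0, radius=1.0, tol=1e-6, max_iter=10000):
    """Toy coordinate direct search with an extra covering step."""
    x, fx = list(x0), f(x0)
    for _ in range(max_iter):
        if radius <= tol:
            break
        # Poll step: try +/- the current radius along each coordinate.
        improved = False
        for i in range(len(x)):
            for s in (radius, -radius):
                y = list(x)
                y[i] += s
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        # Covering step: explore a ball around the incumbent.
        x, fx = covering_step(f, x, fx, radius)
        if not improved:
            radius *= 0.5  # refine only when the poll fails
    return x, fx
```

On a smooth problem the poll alone already converges; the point of the covering step in the talk is precisely the discontinuous case, where neighborhood exploration is what rules out non-locally-optimal accumulation points.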

YIWEN CHEN, University of British Columbia
Random subspace trust-region algorithm for high-dimensional convex-constrained derivative-free optimization

Model-based derivative-free optimization (DFO) methods are one major class of DFO methods widely used in practice. However, these methods are known to struggle in high dimensions. Recent research tackles this issue by searching for decreases in low-dimensional subspaces sampled randomly in each iteration. This talk proposes a random subspace trust-region algorithm for general convex-constrained DFO problems. We provide a new subspace sampling technique that guarantees that, with a fixed probability, the subspace preserves a fixed fraction of the first-order criticality measure. We define a new class of models that can be constructed using only feasible points in the subspace. Based on these new theoretical results, an almost-sure global convergence result and a worst-case complexity analysis of our algorithm are presented. Numerical experiments demonstrate the strong performance of our algorithm in high dimensions.
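The subspace idea can be sketched in a few lines. The snippet below is a simplified illustration under assumptions of my own (unconstrained, a Cauchy-like steepest-descent step instead of a full trust-region model), not the algorithm from the talk: each iteration draws a random p-dimensional subspace of R^n, estimates the gradient of the reduced function by finite differences, and steps inside the subspace only if the objective decreases.

```python
import numpy as np

def subspace_step(f, x, p=2, radius=0.5, h=1e-6, rng=None):
    """One illustrative random-subspace iteration (hypothetical sketch):
    sample a random p-dimensional subspace, estimate the gradient of the
    reduced function s -> f(x + P s) at s = 0 by forward differences,
    and take a steepest-descent step of length `radius` in the subspace."""
    if rng is None:
        rng = np.random.default_rng()
    n = len(x)
    # Orthonormal basis of a random p-dimensional subspace of R^n.
    P, _ = np.linalg.qr(rng.standard_normal((n, p)))
    f0 = f(x)
    g = np.array([(f(x + h * P[:, j]) - f0) / h for j in range(p)])
    if np.linalg.norm(g) == 0:
        return x
    s = -radius * g / np.linalg.norm(g)  # Cauchy-like step, clipped to radius
    y = x + P @ s
    return y if f(y) < f0 else x  # accept only on decrease
```

Each iteration needs only p + 1 (plus one acceptance) function evaluations rather than the O(n) evaluations of a full-space model, which is the motivation for subspace methods in high dimensions.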

EDWARD HALLÉ-HANNAN, Polytechnique Montréal
CatMADS: categorical variables with the MADS algorithm

Solving optimization problems where functions are blackboxes and variables involve different types poses significant theoretical and algorithmic challenges. Nevertheless, such settings frequently occur in simulation-based engineering design and machine learning. This paper extends the Mesh Adaptive Direct Search (MADS) algorithm to address mixed-variable problems with categorical, integer and continuous variables. MADS is a robust derivative-free optimization framework with a well-established convergence analysis for constrained quantitative problems. CatMADS generalizes MADS by incorporating categorical variables, handled via distance-induced neighborhoods. An exhaustive convergence analysis of CatMADS is provided, with flexible choices balancing computational cost and local optimality strength. Four types of mixed-variable local minima are introduced, corresponding to progressively stronger notions of local optimality. CatMADS integrates the progressive barrier strategy for handling constraints with convergence guarantees. An instance of CatMADS employs cross-validation to construct problem-specific categorical distances. This instance is benchmarked against state-of-the-art solvers on 32 mixed-variable problems, half of which are constrained. Data profiles show that CatMADS achieves the best results, demonstrating that the framework is empirically efficient in addition to having strong theoretical foundations.
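A distance-induced neighborhood can be illustrated very simply. The sketch below is a hypothetical example (the category names and distance values are invented, and CatMADS's actual neighborhood construction may differ): given a user- or data-supplied distance between categories, the neighbors of a category are its k nearest other categories, and a categorical poll would evaluate the incumbent point with the category swapped for each neighbor.

```python
def categorical_neighbors(cat, distance, k=2):
    """Distance-induced neighborhood (illustrative sketch): the neighbors
    of `cat` are its k nearest other categories under `distance`, a dict
    of dicts mapping category pairs to nonnegative distances."""
    others = [c for c in distance[cat] if c != cat]
    return sorted(others, key=lambda c: distance[cat][c])[:k]
```

For instance, with a distance table over materials {steel, alu, carbon, wood}, the two nearest neighbors of "steel" might be "alu" and "carbon"; a cross-validated, problem-specific distance (as in the benchmarked CatMADS instance) would replace such a hand-built table.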

GABRIEL JARRY-BOLDUC, Mount Royal University
The cosine measure of a function

The cosine measure of a set of vectors is a valuable tool in derivative-free optimization (DFO) to judge the quality of the set. It provides information on how uniformly the set of vectors covers the space $\mathbb{R}^n$. It is used in the convergence theory of certain DFO algorithms. A set of vectors is a positive spanning set of $\mathbb{R}^n$ if and only if its cosine measure is greater than zero. An important property of positive spanning sets is that when the gradient of a function at a point is well-defined and not equal to the zero vector, the set contains at least one descent direction (ascent direction) of the function at that point. This is not necessarily true if the gradient is equal to the zero vector or if the gradient does not exist. To characterize these two cases, the novel concept of the cosine measure of a function is introduced in this paper. It provides an infimum on the value of the cosine measure of a set of vectors guaranteed to contain a descent direction of the function at the point of interest. Pseudo-codes are developed that make it possible to compute the cosine measure of infinite sets of vectors, and it is shown how to compute the cosine measure of a function for two popular classes of nonsmooth functions: indicator functions and max functions.
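For a finite set $D$, the cosine measure is $\mathrm{cm}(D) = \min_{\|u\|=1} \max_{d \in D} \cos(u, d)$. As a rough illustration (not the pseudo-codes from the talk, which handle infinite sets exactly), one can estimate it by Monte Carlo sampling of unit vectors; since the minimum is taken over a finite sample of directions, this yields an over-estimate, i.e. an upper bound on the true value.

```python
import numpy as np

def cosine_measure_estimate(D, num_samples=20000, rng=None):
    """Monte Carlo over-estimate of the cosine measure of a finite set of
    vectors (illustrative sketch): cm(D) = min over unit u of
    max over d in D of cos(u, d). Sampling u gives an upper bound."""
    if rng is None:
        rng = np.random.default_rng()
    D = np.array([d / np.linalg.norm(d) for d in D])  # unit vectors, rows
    U = rng.standard_normal((num_samples, D.shape[1]))
    U /= np.linalg.norm(U, axis=1, keepdims=True)     # uniform on the sphere
    best = (U @ D.T).max(axis=1)  # best cosine the set achieves for each u
    return best.min()
```

For the coordinate set $\{\pm e_1, \pm e_2\}$ in $\mathbb{R}^2$ the true value is $\cos(45°) = \sqrt{2}/2 \approx 0.707$, a positive spanning set; for $\{e_1, e_2\}$ alone the value is negative, consistent with the characterization above.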

TANMAYA KARMARKAR, University of British Columbia
Computing the convex envelope of bivariate piecewise linear-quadratic (PLQ) functions

We introduce a linear-time algorithm for computing the convex envelope of bivariate piecewise linear-quadratic (PLQ) functions and establish that the biconjugate is piecewise rational defined over a polyhedral subdivision. Our approach consists of the following steps: (1) compute the convex envelope of each quadratic piece, obtaining piecewise rational functions (a quadratic divided by a linear function) defined over a polyhedral subdivision; (2) compute the conjugate of each resulting piece to obtain piecewise quadratic functions defined over a parabolic subdivision; (3) compute the maximum of all those functions to obtain the conjugate of the original PLQ function as a piecewise quadratic function defined on a parabolic subdivision; (4) compute the conjugate of each resulting piece; and finally (5) compute the maximum over all those functions to obtain the biconjugate as rational functions (a quadratic divided by a linear function) defined over a polyhedral subdivision.
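The repeated operation in steps (2) and (4) is the Legendre-Fenchel conjugate, $f^*(s) = \sup_x (\langle s, x\rangle - f(x))$, and applying it twice yields the biconjugate, i.e. the convex envelope. As a one-dimensional, grid-sampled illustration (not the exact piecewise symbolic computation of the talk):

```python
import numpy as np

def discrete_conjugate(xs, fs, ss):
    """Discrete Legendre-Fenchel conjugate (illustrative sketch of the
    conjugacy operation applied piecewise by the algorithm):
    f*(s) = max over sampled x of (s * x - f(x))."""
    xs, fs, ss = map(np.asarray, (xs, fs, ss))
    return np.array([np.max(s * xs - fs) for s in ss])
```

Applying `discrete_conjugate` twice to a sampled nonconvex function, such as $f(x) = \min((x+1)^2, (x-1)^2)$, approximates its convex envelope: the biconjugate fills in the chord between the two wells, so it vanishes at $x = 0$ even though $f(0) = 1$.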


© Société mathématique du Canada : http://www.smc.math.ca/