2024 CMS Summer Meeting

Saskatchewan, May 30 - June 3, 2024

Abstracts

AARMS-CMS Student Poster Session

SHAWN MCADAM, Saskatchewan
Approximating Multivariate Functions with Fast Piecewise Polynomials with Application to Mantle Convection Code

Many programs in scientific computing spend a substantial proportion of their total runtime evaluating mathematical functions. Under suitable conditions, such functions may be efficiently approximated with piecewise polynomials, also known as lookup tables (LUTs). One example is the mantle convection code developed by S.J. Trim et al. in their 2023 paper. It contains a mathematical function that performs complex arithmetic, evaluates Jacobi elliptic functions outside their usual domains, and involves deeply nested compositions of functions. This presentation illustrates how one can build multivariate LUTs from univariate ones, and applies this construction to the function in S.J. Trim et al. to obtain a roughly $500\times$ speedup.
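
To make the construction concrete, here is a minimal sketch of a univariate LUT and a tensor-product bivariate LUT built on the same grid machinery. This is an illustration only, not the poster's implementation: the helpers `build_lut_1d` and `build_lut_2d` are hypothetical names, and linear pieces are used for brevity where the poster presumably uses higher-degree polynomials with controlled error.

```python
import numpy as np

def build_lut_1d(f, a, b, n):
    """Piecewise-linear lookup table for f on [a, b] with n uniform pieces."""
    x = np.linspace(a, b, n + 1)
    y = f(x)                              # sample the expensive function once
    h = (b - a) / n
    def evaluate(t):
        i = int(np.clip((t - a) // h, 0, n - 1))
        s = (t - x[i]) / h                # local coordinate in [0, 1]
        return (1 - s) * y[i] + s * y[i + 1]
    return evaluate

def build_lut_2d(f, ax, bx, ay, by, n):
    """Tensor-product construction: apply the univariate node layout in each
    dimension, store f on the resulting grid, and combine values bilinearly."""
    xg, yg = np.linspace(ax, bx, n + 1), np.linspace(ay, by, n + 1)
    hx, hy = (bx - ax) / n, (by - ay) / n
    table = np.array([f(xg, yj) for yj in yg])   # shape (n+1, n+1), rows = y
    def evaluate(tx, ty):
        i = int(np.clip((tx - ax) // hx, 0, n - 1))
        j = int(np.clip((ty - ay) // hy, 0, n - 1))
        sx, sy = (tx - xg[i]) / hx, (ty - yg[j]) / hy
        return ((1 - sx) * (1 - sy) * table[j, i] + sx * (1 - sy) * table[j, i + 1]
                + (1 - sx) * sy * table[j + 1, i] + sx * sy * table[j + 1, i + 1])
    return evaluate

# Replace a transcendental call with a table lookup; the error here is O(h^2).
fast_exp = build_lut_1d(np.exp, 0.0, 1.0, 1024)
print(abs(fast_exp(0.3137) - np.exp(0.3137)))     # small interpolation error
```

The point of the tensor-product design is that all the precomputation and error control live in the univariate construction; the multivariate table only combines stored univariate samples, so evaluation stays a handful of multiplies regardless of how expensive the original function was.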

RAHUL PADMANABHAN, Concordia University
Can Transformers Approximate Functions of Matrices?

Transformers are at the forefront of cutting-edge artificial intelligence today. They are used in natural language processing, notably in large language models (LLMs) such as ChatGPT and Llama. Prior work has shown that transformers can perform certain linear algebra computations by learning from randomly generated data, but few studies indicate the extent to which they can be used for advanced numerical computations. We explore the use of transformers in approximating matrix functions. Matrix functions extend ordinary functions to matrices: a matrix is taken as input and the output is a matrix or a scalar. As transformer blocks are mathematically represented as a parameterized function $f_\theta: \mathbb{R}^{n \times d} \rightarrow \mathbb{R}^{n \times d}$, we represent the real entries of the matrix using encoding schemes to obtain results from the transformer. Our objective is to determine whether transformers can approximate matrix functions by learning from randomly generated, encoded data. Specifically, we focus on certain problems in the domain of functions of matrices, including matrix powers, the $p$th root of a matrix, and the matrix exponential. In this poster, after providing the necessary background on transformers and matrix functions, we describe our methodology for approximating matrix functions using transformers and discuss the results of our numerical experiments.
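
The matrix functions named above all have standard dense-linear-algebra references against which a learned model can be trained and evaluated. Below is a minimal sketch of generating (input, target) pairs with NumPy/SciPy; the helper `make_example` is a hypothetical name for illustration, and the poster's token-level encoding scheme for real matrix entries is not reproduced here.

```python
import numpy as np
from scipy.linalg import expm, fractional_matrix_power

def make_example(n, rng, fn="exp", p=2):
    """Generate one (A, f(A)) training pair for a chosen matrix function."""
    A = rng.standard_normal((n, n))
    if fn == "exp":                       # matrix exponential
        return A, expm(A)
    if fn == "power":                     # integer matrix power A^p
        return A, np.linalg.matrix_power(A, p)
    # Otherwise: the p-th root. Use a symmetric positive definite matrix
    # so the principal root is real and well defined.
    A = A @ A.T + n * np.eye(n)
    return A, fractional_matrix_power(A, 1.0 / p)

rng = np.random.default_rng(0)
A, Y = make_example(4, rng, fn="root", p=3)
print(np.allclose(np.linalg.matrix_power(Y, 3), A))   # Y^3 recovers A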


© Canadian Mathematical Society : http://www.cms.math.ca/