Vladimir Druskin (Southern Methodist University) | 5/19/26, 2:00 PM | Advances in Randomized Algorithms and Kernel Methods for Rank-Structured Matrices | Minisymposium Talk
We consider the approximation of $B^T (A+sI)^{-1} B$ for a large s.p.d. matrix $A\in\mathbb R^{n\times n}$ with a dense spectrum and $B\in\mathbb R^{n\times p}$, $p\ll n$, using a block-Lanczos recursion. We target the computation of MIMO transfer functions for large-scale discretizations of problems with continuous spectral measures, such as linear time-invariant (LTI) PDEs on unbounded domains....
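The block-Krylov idea behind such approximations can be illustrated by a plain Galerkin projection: build an orthonormal basis of the block Krylov subspace $\mathrm{span}\{B, AB, \dots, A^{m-1}B\}$ and solve the small projected shifted system. This is a generic sketch, not the speaker's specific recursion; `krylov_transfer` is a hypothetical name, and full reorthogonalization is used in place of a true block three-term recurrence for simplicity.

```python
import numpy as np

def krylov_transfer(A, B, s, m):
    """Approximate B^T (A + s I)^{-1} B by Galerkin projection onto the
    block Krylov subspace span{B, A B, ..., A^{m-1} B}.  Hypothetical
    helper illustrating the generic block-Krylov idea."""
    blocks = []
    Q, _ = np.linalg.qr(B)
    blocks.append(Q)
    for _ in range(m - 1):
        W = A @ blocks[-1]
        for V in blocks:                 # full block reorthogonalization
            W = W - V @ (V.T @ W)
        Qn, _ = np.linalg.qr(W)
        blocks.append(Qn)
    Q = np.hstack(blocks)                # orthonormal basis, n x (m p)
    T = Q.T @ A @ Q                      # small projected operator
    Bp = Q.T @ B
    # Solve the (m p) x (m p) shifted system instead of the n x n one.
    return Bp.T @ np.linalg.solve(T + s * np.eye(T.shape[0]), Bp)
```

For shifts $s$ well separated from the spectrum of $A$, the projected approximation converges rapidly in $m$, which is what makes Krylov methods attractive for transfer-function evaluation.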
Ibrohim Nosirov (Cornell University) | 5/19/26, 2:25 PM | Advances in Randomized Algorithms and Kernel Methods for Rank-Structured Matrices | Minisymposium Talk
Stochastic Lanczos Quadrature (SLQ) is a popular algorithm for approximating the spectral density of a symmetric matrix $A$ using matrix-vector products. We present a variance-reduced implementation of SLQ. This implementation has two key ingredients: a faster problem-specific eigensolver and a carefully implemented selective orthogonalization scheme that we use as a deflation criterion. Our...
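For context, the baseline SLQ that such a variant builds on can be sketched as follows: for each random probe, run Lanczos, and read off Gauss quadrature nodes and weights from the eigendecomposition of the tridiagonal matrix. The variance-reduction, deflation, and selective-orthogonalization ingredients of the talk are not shown; this sketch uses naive full reorthogonalization, and `slq_trace` is an assumed name.

```python
import numpy as np

def slq_trace(A, f, num_probes=10, m=20, seed=0):
    """Baseline stochastic Lanczos quadrature estimate of tr(f(A)) for a
    symmetric matrix A.  Each probe yields an m-point Gauss quadrature
    rule for the spectral measure induced by that probe vector."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    total = 0.0
    for _ in range(num_probes):
        v = rng.standard_normal(n)
        v /= np.linalg.norm(v)
        alphas, betas, Qs = [], [], [v]
        for j in range(m):
            w = A @ Qs[-1]
            alphas.append(Qs[-1] @ w)
            for u in Qs:                 # naive full reorthogonalization
                w = w - (u @ w) * u
            beta = np.linalg.norm(w)
            if beta < 1e-12 or j == m - 1:
                break
            betas.append(beta)
            Qs.append(w / beta)
        T = np.diag(alphas) + np.diag(betas, 1) + np.diag(betas, -1)
        theta, U = np.linalg.eigh(T)     # quadrature nodes theta_i
        total += n * np.sum(U[0, :] ** 2 * f(theta))  # weights U[0,i]^2
    return total / num_probes
```

When $m = n$ the per-probe quadrature is exact, so a single probe recovers $n\, v^T f(A) v$ up to rounding; the stochastic error then comes entirely from averaging over probes, which is exactly what variance reduction targets.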
Rajarshi Bhattacharjee (University of Massachusetts Amherst) | 5/19/26, 2:50 PM | Advances in Randomized Algorithms and Kernel Methods for Rank-Structured Matrices | Minisymposium Talk
We study the problem of approximating the eigenvectors of an $n \times n$ symmetric matrix $A$ with bounded entries using random column sampling. We show that for any eigenvalue $\lambda$ of $A$, one can compute a vector $v$ satisfying $\|Av-\lambda v\|_2 \leq \epsilon n$ by sampling $\tilde{O}(\frac{1}{\epsilon^4})$ columns of $A$. For the eigenvector corresponding to the largest-magnitude eigenvalue, this...
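To make the flavor of an $\epsilon n$-residual guarantee concrete, here is a simple illustrative heuristic in the same spirit (emphatically not the paper's algorithm): take the top eigenpair of a uniformly sampled principal submatrix, rescale the eigenvalue by $n/s$, and lift the eigenvector with one pass over the sampled columns. The function name and the particular rescaling are assumptions for illustration.

```python
import numpy as np

def submatrix_eig(A, s, seed=0):
    """Illustrative heuristic: estimate the top eigenpair of a symmetric
    matrix A with bounded entries from a random s x s principal
    submatrix.  Returns (lam, v) intended to make ||A v - lam v|| small
    relative to epsilon * n."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    S = rng.choice(n, size=s, replace=False)   # uniform column/row sample
    vals, vecs = np.linalg.eigh(A[np.ix_(S, S)])
    lam = (n / s) * vals[-1]                   # rescale to full-matrix scale
    u = vecs[:, -1]
    v = A[:, S] @ u                            # lift using sampled columns only
    v /= np.linalg.norm(v)
    return lam, v
```

On matrices with a genuinely large top eigenvalue (order $n$), the residual $\|Av - \lambda v\|_2$ of this heuristic is far below the trivial bound $2n$, which is the kind of additive-error regime the sampling bounds address.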
Marc Aurèle Gilles (Princeton University) | 5/20/26, 11:00 AM | Advances in Randomized Algorithms and Kernel Methods for Rank-Structured Matrices | Minisymposium Talk
I will present Randomly Pivoted LU (RPLU), a randomized variant of Gaussian elimination with complete pivoting that samples pivots proportional to squared Schur-complement entries, and analyze its low-rank approximation properties. I will highlight two regimes where RPLU is particularly effective at low-rank approximation: (i) memory-limited settings, where a rank-$k$ approximation can be...
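A minimal sketch of the sampling rule described in the abstract, assuming the whole matrix is held in memory and the Schur complement is updated densely (so none of the memory-limited refinements the talk analyzes are shown); `rp_lu` is an assumed name.

```python
import numpy as np

def rp_lu(A, k, rng=None):
    """Randomly Pivoted LU sketch: at each step, sample a pivot (i, j)
    with probability proportional to the squared entries of the current
    Schur complement, then eliminate it with a rank-1 update.
    Returns C, R with A ~= C @ R of rank k."""
    rng = np.random.default_rng(rng)
    S = np.array(A, dtype=float)        # running Schur complement
    m, n = S.shape
    C = np.zeros((m, k))
    R = np.zeros((k, n))
    for t in range(k):
        p = (S ** 2).ravel()
        p = p / p.sum()                 # squared-magnitude pivot distribution
        i, j = divmod(rng.choice(p.size, p=p), n)
        C[:, t] = S[:, j] / S[i, j]     # pivot column, scaled by the pivot
        R[t, :] = S[i, :]               # pivot row
        S = S - np.outer(C[:, t], R[t, :])   # rank-1 Schur update
    return C, R
```

Because each elimination zeroes the pivot row and column exactly, $k$ steps reproduce a rank-$k$ matrix exactly; the squared-entry sampling is what biases the pivots toward large Schur-complement entries, mirroring randomly pivoted Cholesky.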
Mikhail Lepilov (Rensselaer Polytechnic Institute) | 5/20/26, 11:25 AM | Advances in Randomized Algorithms and Kernel Methods for Rank-Structured Matrices | Minisymposium Talk
Due to the size of many kernel matrices that arise in applications, it is often necessary to work with their low-rank approximations in order to perform computations efficiently. Low-rank decompositions of such matrices may be obtained quickly by exploiting the analytic structure of the underlying kernel, for example by using Taylor expansions or an integral representation; such...
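A toy instance of such an analytic expansion, assuming a 1-D Gaussian kernel on $[-1,1]$: factoring $e^{-(x-y)^2} = e^{-x^2} e^{-y^2} e^{2xy}$ and truncating the Taylor series of $e^{2xy}$ yields an explicit low-rank factorization without ever forming the full matrix. The helper name is assumed; production kernel codes use cluster-local expansions with controlled truncation error.

```python
import numpy as np
from math import factorial

def gaussian_kernel_factors(x, y, order):
    """Separable (degenerate) Taylor expansion of the 1-D Gaussian kernel
    K[i, j] = exp(-(x_i - y_j)**2).  Writes K ~= U @ V.T with rank
    `order`, splitting each Taylor term 2^p (x y)^p / p! symmetrically
    between the two factors."""
    p = np.arange(order)
    fact = np.array([factorial(q) for q in range(order)], dtype=float)
    c = np.sqrt(2.0 ** p / fact)            # symmetric split of 2^p / p!
    U = np.exp(-x[:, None] ** 2) * c * x[:, None] ** p
    V = np.exp(-y[:, None] ** 2) * c * y[:, None] ** p
    return U, V                              # K ~= U @ V.T
```

On points in $[-1,1]$ the truncation error decays like $2^{\text{order}}/\text{order}!$, so a modest expansion order already gives near machine-precision accuracy at a cost linear in the number of points.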
Siting Liu (University of California, Riverside) | 5/20/26, 11:50 AM | Advances in Randomized Algorithms and Kernel Methods for Rank-Structured Matrices | Minisymposium Talk
We study mean-field games with nonlocal interactions modeled through kernel-based representations. Using feature-space expansions inspired by kernel methods, the resulting models admit a variational saddle-point formulation that is well suited for efficient primal–dual algorithms such as the primal-dual hybrid gradient method. We also discuss inverse problems in which interaction mechanisms...
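The primal-dual hybrid gradient (PDHG, Chambolle-Pock) step itself can be illustrated on a generic saddle-point toy problem rather than the mean-field-game formulation of the talk: 1-D total-variation denoising, $\min_x \tfrac12\|x-b\|^2 + \lambda\|Dx\|_1 = \min_x \max_{\|y\|_\infty \le \lambda} \tfrac12\|x-b\|^2 + \langle Dx, y\rangle$. The function name, step sizes, and model are illustrative assumptions.

```python
import numpy as np

def pdhg_tv(b, lam=0.5, tau=0.4, sigma=0.4, iters=3000):
    """Primal-dual hybrid gradient for 1-D TV denoising:
    min_x 0.5||x - b||^2 + lam * ||D x||_1, with D the forward
    difference operator.  Step sizes satisfy tau*sigma*||D||^2 < 1
    since ||D||^2 <= 4."""
    n = b.size
    D = np.diff(np.eye(n), axis=0)            # (n-1) x n difference matrix
    x = b.copy()
    x_bar = b.copy()
    y = np.zeros(n - 1)
    for _ in range(iters):
        # Dual step: prox of the indicator of the ell-inf ball = clipping.
        y = np.clip(y + sigma * (D @ x_bar), -lam, lam)
        # Primal step: prox of 0.5||. - b||^2 is a simple averaging.
        x_new = (x - tau * (D.T @ y) + tau * b) / (1 + tau)
        x_bar = 2 * x_new - x                 # extrapolation
        x = x_new
    return x
```

The appeal in the variational saddle-point setting is that each iteration only needs applications of the coupling operator and cheap proximal maps, which is what makes PDHG-type schemes attractive once the nonlocal interaction is expressed through a (feature-space) operator.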