May 18 – 22, 2026
Virginia Tech
America/New_York timezone

Novel Randomized Tensor-Train Sketch and Applications

May 18, 2026, 2:00 PM
25m
Torgersen Hall 1040

Minisymposium Talk: Low-Complexity Data-driven or Classical Algorithms and Applications

Speaker

Paul Cazeaux (Virginia Tech)

Description

The Tensor-Train (TT) or Matrix-Product States (MPS) format provides a compact, low-rank representation for high-dimensional tensors, widely used in many-body quantum physics and quantum chemistry. Its efficiency relies on rounding, which reduces tensor ranks to maintain feasible computational costs.
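To make the compression concrete, here is a minimal, illustrative sketch (not code from the talk) of the TT format: a d-dimensional tensor is stored as a chain of 3-way cores, so storage scales as O(d n r^2) rather than O(n^d). The core shapes and the contraction below follow the standard TT convention; the specific ranks and sizes are arbitrary choices for illustration.

```python
import numpy as np

# A d = 4 tensor of shape (n, n, n, n) stored as TT cores of rank r:
# each core G_k has shape (r_{k-1}, n_k, r_k), with boundary ranks r_0 = r_4 = 1.
n, r = 10, 3
shapes = [(1, n, r), (r, n, r), (r, n, r), (r, n, 1)]
cores = [np.random.default_rng(k).standard_normal(s) for k, s in enumerate(shapes)]

def tt_to_full(cores):
    """Contract a chain of TT cores into the full (dense) tensor."""
    T = cores[0]
    for G in cores[1:]:
        # Sum over the shared rank index between consecutive cores.
        T = np.tensordot(T, G, axes=(-1, 0))
    return T[0, ..., 0]  # strip the trivial boundary ranks r_0 = r_d = 1

full = tt_to_full(cores)
print(full.shape)  # (10, 10, 10, 10)
```

The TT cores here hold 4 * 10 * 9 = 360 entries at most, versus 10^4 for the dense tensor; rounding keeps the ranks r small so that this gap persists through arithmetic operations.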
In this talk, we introduce a novel block-structured randomized sketch exploiting the TT format and provide explicit probabilistic guarantees. We discuss how this sketch enables, in particular, randomized TT-rounding algorithms, extending low-rank matrix approximation techniques to the TT setting. These methods significantly accelerate computations such as summing TT tensors or applying matrix-vector products, which are key operations in Krylov iterative solvers. We present numerical results demonstrating the empirical accuracy and computational advantages of the randomized approach over deterministic methods.
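The core primitive behind sketch-based rounding can be illustrated on a plain matrix. The following is a generic randomized range finder in the Halko–Martinsson–Tropp style, not the block-structured TT sketch of the talk: a random Gaussian test matrix compresses the range of A, and orthonormalizing the sketch yields a low-rank approximation without computing a full SVD. All names and parameters are illustrative.

```python
import numpy as np

def randomized_low_rank(A, rank, oversample=5, seed=None):
    """Return Q with `rank` orthonormal columns such that A ~ Q @ (Q.T @ A)."""
    rng = np.random.default_rng(seed)
    # Sketch: hit A with a random Gaussian test matrix (slightly oversampled).
    Omega = rng.standard_normal((A.shape[1], rank + oversample))
    Y = A @ Omega
    # Orthonormalize the sketch to get an approximate basis for range(A).
    Q, _ = np.linalg.qr(Y)
    return Q[:, :rank]

# Usage: "round" a matrix of exact rank 5 from the sketch alone.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 100))
Q = randomized_low_rank(A, rank=5, seed=0)
err = np.linalg.norm(A - Q @ (Q.T @ A)) / np.linalg.norm(A)
print(err < 1e-10)  # True: the sketch captures the rank-5 range
```

In TT rounding, an analogue of this sketch is applied to the unfoldings along the tensor-train chain, which is where a block structure adapted to the TT cores pays off; the probabilistic guarantees discussed in the talk quantify the accuracy of such sketches.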

This research was supported in part by Simons Travel Support for Mathematicians Award No. MPS-TSM-00966604.

Author

Paul Cazeaux (Virginia Tech)

Co-authors

Dr Mi-Song Dupuy (Sorbonne University), Rodrigo Figueroa Justiniano (Virginia Tech)

Presentation materials

There are no materials yet.