May 18 – 22, 2026
Virginia Tech
America/New_York timezone

Operator learning without the adjoint

May 19, 2026, 4:35 PM
25m
Torgersen Hall 3100

Minisymposium Talk: New Directions and Challenges in Linear Algebra

Speaker

Diana Halikias (New York University)

Description

There is a mystery at the heart of operator learning: how can one recover a non-self-adjoint operator from data without probing its adjoint? Practical approaches suggest that an operator can be accurately recovered using only data generated by its forward action, yet, naively, sampling the action of the adjoint seems essential. We prove that, without querying the adjoint, one can approximate a family of non-self-adjoint, infinite-dimensional compact operators via projection onto a Fourier basis. We then apply this result to recover Green's functions of elliptic partial differential operators and derive an adjoint-free sample complexity bound.
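As a toy illustration of the forward-only, Fourier-projection idea (a minimal sketch, not the talk's construction: the kernel, quadrature, and truncation level below are all illustrative assumptions), one can apply a non-self-adjoint integral operator to Fourier modes and project the outputs back onto the same basis, yielding a finite Galerkin matrix without ever touching the adjoint:

```python
import numpy as np

# Approximate a non-self-adjoint integral operator (Lu)(x) = ∫ K(x,y) u(y) dy
# on [0, 2π] using ONLY its forward action: apply L to the Fourier modes
# e^{ikx}, |k| <= N, and project the results back onto the same basis.
N, M = 8, 256                                      # modes kept, quadrature points
x = 2 * np.pi * np.arange(M) / M

K = np.exp(np.cos(x[:, None] - 2 * x[None, :]))    # smooth, non-symmetric kernel
apply_L = lambda u: (2 * np.pi / M) * (K @ u)      # forward action via quadrature

modes = np.exp(1j * np.outer(np.arange(-N, N + 1), x))   # rows: e^{ikx} on the grid
# Column j of L_hat holds the Fourier coefficients of L applied to mode j,
# so L_hat @ c approximates the coefficients of Lu for u with coefficients c.
L_hat = np.array([modes.conj() @ apply_L(m) / M for m in modes]).T
```

Because the kernel is not symmetric, the resulting matrix `L_hat` is not Hermitian, yet it was assembled from forward applications of the operator alone.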

We also discuss the analogous question in the discrete, transpose-free setting: for which problems in numerical linear algebra is access to matrix-vector products with both a matrix and its transpose necessary? This question also arises in the "unmatched backprojector" setting of certain imaging problems. In particular, we describe the role of the transpose in low-rank approximation, least-squares problems, and several varieties of norm estimation.
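To see where the transpose enters low-rank approximation, consider the standard randomized range finder (a hedged sketch with an explicit test matrix; in the black-box matvec setting of the talk, `A` would only be available through products `A @ x`):

```python
import numpy as np

rng = np.random.default_rng(0)

# A rank-r test matrix, explicit here only so accuracy can be checked;
# in the matvec-query model one would only see the map x -> A @ x.
m, n, r, k = 100, 80, 5, 10
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# Range finder: uses ONLY forward matvecs A @ Omega.
Omega = rng.standard_normal((n, k))
Q, _ = np.linalg.qr(A @ Omega)       # orthonormal basis for an approximate range

# Forming B = Q.T @ A is equivalent to applying A.T to the columns of Q --
# this is exactly the step that needs transpose access in the black-box setting.
B = Q.T @ A
A_k = Q @ B                          # rank-k approximation Q Q^T A

rel_err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(rel_err)                       # tiny here, since rank(A) = r <= k
```

The forward sketch `A @ Omega` is transpose-free, but compressing `A` onto the captured subspace (`Q.T @ A`) implicitly queries the transpose, which is one concrete way the necessity question in the abstract arises.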

Author

Diana Halikias (New York University)

Co-authors

Alex Townsend (Cornell University), Nicolas Boullé (Imperial College London), Sam Otto (Cornell University)

Presentation materials

There are no materials yet.