Streaming randomized techniques for low-rank approximation of tensors with applications

Presenter
February 3, 2026
Abstract
Randomized and streaming techniques have become standard tools for constructing low-rank approximations of large-scale matrices, offering single-pass algorithms with strong computational and memory advantages. In this talk, we first recall the basic principles underlying randomized matrix approximation, highlighting their strengths and limitations in a streaming setting. We then extend these ideas to higher-order tensors and review how randomized streaming techniques can be formulated in several tensor formats, including the Tucker and tensor-train decompositions. Building on this perspective, we discuss how the same principles naturally generalize to more complex structures such as tree tensor networks. The second part of the talk focuses on tensor-train representations and their integration into iterative linear solvers. In particular, we show how streaming randomized approximations can be embedded into Krylov subspace methods such as sketched GMRES, leading to efficient tensor-structured solvers that avoid expensive intermediate tensor contractions. While we focus on the tensor-train format, the proposed framework is sufficiently general to be applied to other tensor network architectures.
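To make the single-pass idea concrete, the following is a minimal sketch (not the speaker's implementation) of a streaming randomized low-rank approximation of a matrix in the sketch-and-solve style: two random sketches Y = AΩ and W = ΨA are accumulated while the rows of A stream by, and a rank-k factorization A ≈ QX is recovered afterwards without a second pass over the data. The function name, block interface, and sketch sizes k and l are illustrative choices, not part of the talk.

```python
import numpy as np

def streaming_randomized_lowrank(row_blocks, n, k, l, seed=0):
    """Single-pass randomized low-rank approximation.

    row_blocks : iterable of row blocks of an (m x n) matrix A
    n          : number of columns of A
    k          : target sketch rank (range sketch width)
    l          : co-range sketch size, typically l > k
    Returns Q (m x k, orthonormal columns) and X (k x n) with A ~ Q @ X.
    """
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((n, k))  # fixed range test matrix
    Y_rows = []                          # chunks of Y = A @ Omega
    W = None                             # co-range sketch W = Psi @ A
    for A_blk in row_blocks:             # single pass over the data
        b = A_blk.shape[0]
        Y_rows.append(A_blk @ Omega)
        # draw the columns of Psi lazily, block by block
        Psi_blk = rng.standard_normal((l, b))
        W = Psi_blk @ A_blk if W is None else W + Psi_blk @ A_blk
    Y = np.vstack(Y_rows)                # m x k
    Q, _ = np.linalg.qr(Y)               # orthonormal basis for range(Y)
    # solve (Psi @ Q) X = W in the least-squares sense; here we reuse the
    # per-block Psi draws implicitly through W, so we only need Psi @ Q,
    # which we rebuild from the same seed-ordered draws:
    rng2 = np.random.default_rng(seed)
    _ = rng2.standard_normal((n, k))     # skip Omega's draw
    PsiQ = np.zeros((l, k))
    row = 0
    for A_blk in []:                     # placeholder: see note below
        pass
    # In practice one stores Psi @ Q incrementally during the pass; for a
    # self-contained demo we instead return a closure-free variant where
    # the caller accumulates PsiQ alongside W (see usage below).
    return Q, Omega

def streaming_lowrank_full(row_blocks, n, k, l, seed=0):
    """Variant that also accumulates Psi @ Q_pre sketches in one pass by
    keeping Psi columns; memory cost is l x m for Psi, fine for a demo."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((n, k))
    Y_rows, Psi_cols, W = [], [], None
    for A_blk in row_blocks:
        b = A_blk.shape[0]
        Y_rows.append(A_blk @ Omega)
        Psi_blk = rng.standard_normal((l, b))
        Psi_cols.append(Psi_blk)
        W = Psi_blk @ A_blk if W is None else W + Psi_blk @ A_blk
    Y, Psi = np.vstack(Y_rows), np.hstack(Psi_cols)
    Q, _ = np.linalg.qr(Y)
    X = np.linalg.lstsq(Psi @ Q, W, rcond=None)[0]
    return Q, X                          # A ~ Q @ X
```

For an exactly rank-r matrix with k ≥ r, the recovered Q @ X reproduces A up to floating-point error with high probability; for general matrices the error is governed by the tail singular values, which is the kind of guarantee the randomized-approximation literature makes precise.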