Matrix-Mimetic Tensor Algebra: Optimal Decompositions and Equivariant Learning

Presenter
February 4, 2026
Abstract
High-dimensional data often contain complex multi-dimensional correlations that traditional matrix-based algorithms fail to capture. While numerous tensor algebra frameworks have been proposed over the years, each lacked critical properties inherent to matrix algebra. The tensor-tensor algebra formalism preserves these matrix-mimetic properties while resolving a decades-long open problem in tensor analysis: providing an Eckart-Young-like representation theorem for tensors. This framework delivers provably optimal, yet computationally feasible, algorithms for revealing intricate high-dimensional correlations through tensor decompositions. The mimetic properties of the tensor-tensor formalism enable seamless retrofitting of existing computational workflows into tensorial ones. We demonstrate this by tensorizing specialized neural network structures and extending the framework to dynamic graphs through tensor Graph Convolutional Neural Networks, achieving effective representation learning for time-evolving structures. We conclude by exploring recent developments in tensor group symmetry theory, which generalize the algebraic framework to equivariant domains.
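To make the abstract's claims concrete, the sketch below illustrates one well-known instance of a matrix-mimetic tensor-tensor algebra: the t-product of Kilmer and Martin, in which third-order tensors are multiplied facewise in the Fourier domain, and a truncated t-SVD gives the Eckart-Young-like optimal low-tubal-rank approximation. This is an illustrative NumPy sketch under the DFT transform, not the speaker's specific implementation; function names here are my own.

```python
import numpy as np

def t_product(A, B):
    # t-product: FFT along the third axis, facewise matrix multiply,
    # then inverse FFT back to the spatial domain.
    Ah = np.fft.fft(A, axis=2)
    Bh = np.fft.fft(B, axis=2)
    Ch = np.einsum('ijk,jlk->ilk', Ah, Bh)
    return np.fft.ifft(Ch, axis=2).real

def truncated_tsvd(A, k):
    # Rank-k truncated t-SVD: SVD each frontal slice in the Fourier
    # domain, keep the k leading terms, and transform back. By the
    # tensor Eckart-Young theorem, this is the best approximation of
    # tubal rank k in the Frobenius norm.
    Ah = np.fft.fft(A, axis=2)
    out = np.zeros_like(Ah)
    for i in range(A.shape[2]):
        U, s, Vt = np.linalg.svd(Ah[:, :, i], full_matrices=False)
        out[:, :, i] = (U[:, :k] * s[:k]) @ Vt[:k, :]
    return np.fft.ifft(out, axis=2).real

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 5, 4))
A2 = truncated_tsvd(A, 2)   # optimal tubal-rank-2 approximation
```

Because the t-product is matrix-mimetic (it has an identity element, transposes, and orthogonality), matrix workflows such as truncated SVD compression carry over with the slice loop above as the only structural change.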