
General Probabilistic Theories, tensor products, and projective transformations

Presenter
October 17, 2022
Abstract
Generalized Probabilistic Theories (GPTs) are theories of nature that allow for random features. A GPT must specify the set of states purported to represent physical reality, the allowable measurements, the rules governing the outcome statistics of those measurements, and the composition rules describing what happens when subsystems are merged into a larger system. Examples include classical probability theory and quantum theory. The composition rules alluded to above usually involve tensor products: of normed spaces, of convex sets, and of cones. Among the tensor products that have operational meaning in the GPT context, the projective and the injective products are the extreme ones, which leads to the natural question "How much do they differ?", considered already by Grothendieck (in the 1950s) and Pisier (in the 1980s). We report on quantitative results concerning the projective/injective discrepancy for finite-dimensional normed spaces. Some of the results are essentially optimal, while others can likely be improved. The methods involve a wide range of techniques from the geometry of Banach spaces and random matrices. We also report on parallel results in the context of cones. Finally, we encourage a more systematic study of convex bodies whose allowed morphisms are projective transformations. Joint work with G. Aubrun, L. Lami, C. Palazuelos, and A. Winter (and parallel work by a subset of the co-authors and M. Plavala).
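As a concrete illustration of the projective/injective discrepancy (a sketch, not from the talk itself): for Euclidean spaces, a tensor in $\ell_2^m \otimes \ell_2^n$ can be identified with an $m \times n$ matrix, whose injective tensor norm is the operator (spectral) norm and whose projective tensor norm is the nuclear (trace) norm. The ratio of the two is then easy to compute numerically; the matrix size and random model below are illustrative choices.

```python
# Sketch: projective vs. injective tensor norms on l2^m (x) l2^n.
# Identifying a tensor with an m x n matrix M, the injective norm is the
# spectral norm of M and the projective norm is the nuclear norm of M,
# so their ratio measures the projective/injective discrepancy.
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))  # random Gaussian matrix (illustrative)

injective = np.linalg.norm(M, ord=2)       # largest singular value
projective = np.linalg.norm(M, ord="nuc")  # sum of singular values

# The nuclear norm always dominates the spectral norm, and for a random
# matrix the ratio is typically large, reflecting the discrepancy.
print(f"projective/injective ratio: {projective / injective:.2f}")
```

For a random Gaussian matrix this ratio grows with the dimension, which is one elementary way to see that the two extreme tensor products can differ substantially.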