Dear colleagues,

On Wednesday at 4pm, Liza Rebrova (UCLA) will speak about

Random matrices, low-rank tensors and beyond: using high-dimensional probability to study complex data

Abstract. It is no secret that the probabilistic viewpoint in general, and random matrix theory in particular, provides powerful tools to understand, process, and learn from large high-dimensional data. However, in many cases one has to go beyond “simple” matrix models to correctly represent and treat the data. For example, inherently multimodal data is better represented with a tensor, that is, a higher-order generalization of a matrix.

The transition to more advanced data structures can sometimes be managed by re-using old algorithms; however, developing special tools that honor the full structure within the data pays off by making the algorithms both much more efficient and more interpretable. At the same time, it presents many interesting and challenging non-trivialities from a mathematical point of view. In this talk, I will focus on our new provable methods for modewise (that is, structure-preserving) tensor dimension reduction. I will also discuss connections to interpretable learning from multimodal data through tensor decompositions, and to our new randomized algorithms for solving linear systems with corrupted equations.

Zoom coordinates for the talk can be found on the department website: