# talks

These are my talks, at varying levels of formality, with abstracts (click to expand) and links to video where they exist.

This list is mainly for personal reference, but to those who want to watch a talk of mine and don’t know which: generally, the more recent ones should be better (in that they more closely resemble how present-day Ewin thinks). If you want a talk on my work in quantum(-inspired) algorithms, maybe try the STOC 2020 talk. If that’s too technical, maybe try the Simons boot camp talk or my blog post.

Last updated November 13, 2020.

## 2020-11-13. On Lipschitz continuity of matrix functions. UW theory lunch talk.

This was a casual 30-minute talk around a cute result I’ve tweeted about before: For \(A, B \in \mathbb{R}^{n\times n}\), \(\| |A| - |B| \| = O(\log(n)\|A - B\|)\), where \(|\cdot|\) denotes taking the absolute value of the eigenvalues of the matrix and \(\|\cdot\|\) denotes spectral norm. The log factor is tight!
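A quick numerical sketch of the statement (my own illustration, not part of the talk): for a symmetric matrix, \(|A|\) can be computed from the eigendecomposition, so the ratio \(\| |A| - |B| \| / \|A - B\|\) can be checked directly.

```python
import numpy as np

def matrix_abs(A):
    """Matrix absolute value of a symmetric A: |A| = V diag(|lambda|) V^T."""
    eigvals, eigvecs = np.linalg.eigh(A)
    return (eigvecs * np.abs(eigvals)) @ eigvecs.T

rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n)); A = (A + A.T) / 2  # random symmetric matrix
E = rng.standard_normal((n, n)); E = (E + E.T) / 2
B = A + 1e-3 * E / np.linalg.norm(E, 2)             # nearby symmetric perturbation

lhs = np.linalg.norm(matrix_abs(A) - matrix_abs(B), 2)
rhs = np.linalg.norm(A - B, 2)
print(lhs / rhs)  # the worst-case ratio over all A, B grows like log(n)
```

Random perturbations like this one typically give a ratio near 1; the \(\log n\) growth only shows up for adversarially chosen pairs.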

## 2020-11-05. An improved quantum-inspired algorithm for linear regression. UW Allen School theory group colloquium.

Can quantum computers be used to perform machine learning faster? This still-open research question is crucial to understanding the future applications of quantum computing. I’ll briefly introduce one type of algorithm—quantum linear algebra—that is a leading candidate for quantum speedups in machine learning, and discuss the issues such algorithms have. In this talk, I will focus on the task of solving low-rank systems of linear equations, which can be done by a quantum linear algebra algorithm. For this setting, in joint work with András Gilyén and Zhao Song, I show that a classical algorithm can perform just as well as this quantum algorithm, with a runtime “only” an 8th power greater. This improves on prior work, which gave a classical algorithm with runtime a 28th power greater. Our classical algorithm is, in fact, just stochastic gradient descent, giving some hope that the gap between quantum and classical is narrower in practice than this 1-to-8 power comparison suggests. (No prior knowledge of quantum computing will be assumed for this talk.)
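As an illustrative sketch (mine, not the algorithm from the paper): randomized Kaczmarz is a form of stochastic gradient descent on \(\|Ax - b\|^2\) that samples rows with probability proportional to their squared norms, and it solves a consistent linear system along these lines.

```python
import numpy as np

def kaczmarz_regression(A, b, steps=5000, seed=0):
    """Randomized Kaczmarz: SGD on ||Ax - b||^2 where row i is sampled with
    probability ||A_i||^2 / ||A||_F^2 (the "length-squared" distribution)."""
    m, n = A.shape
    row_sq = np.einsum("ij,ij->i", A, A)
    p = row_sq / row_sq.sum()
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for _ in range(steps):
        i = rng.choice(m, p=p)
        # project x onto the hyperplane {y : A_i . y = b_i}
        x += (b[i] - A[i] @ x) / row_sq[i] * A[i]
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 20))
x_true = rng.standard_normal(20)
b = A @ x_true                         # consistent system
x_hat = kaczmarz_regression(A, b)
print(np.linalg.norm(x_hat - x_true))  # converges to x_true
```

The length-squared sampling distribution used here is the same one the quantum-inspired input model assumes access to.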

## 2020-09-28. On quantum linear algebra vs classical linear algebra. UT quantum group meeting.

## 2020-06-23. Sampling-based sublinear low-rank matrix arithmetic framework for dequantizing quantum machine learning. STOC 2020.

## 2020-02-25. A classical algorithm framework for dequantizing quantum machine learning. Simons Institute, Quantum Algorithms workshop.

We discuss a classical analogue to the singular value transformation framework for quantum linear algebra algorithms developed by Gilyén et al. The proofs underlying this classical framework have natural connections to well-known ideas in randomized numerical linear algebra. Namely, our proofs are based on one key observation: approximating matrix products is easy in the quantum-inspired input model. This primitive’s simplicity makes finding classical analogues to quantum machine learning algorithms straightforward and intuitive. We present sketches of the key proofs of our framework, as well as of its applications to dequantizing low-rank quantum machine learning.

Joint work with Nai-Hui Chia, András Gilyén, Tongyang Li, Han-Hsuan Lin, and Chunhao Wang.
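The key observation above can be made concrete with a standard length-squared sampling estimator for matrix products (a generic sketch in the spirit of the framework, not code from the paper): sampling rows of \(A\) with probability proportional to their squared norms gives an unbiased approximation of \(A^\mathsf{T} B\) from few samples.

```python
import numpy as np

def approx_matmul(A, B, s, seed=0):
    """Unbiased estimate of A^T B from s row samples, drawn with probability
    proportional to A's squared row norms (length-squared sampling)."""
    row_sq = np.einsum("ij,ij->i", A, A)
    p = row_sq / row_sq.sum()
    rng = np.random.default_rng(seed)
    idx = rng.choice(A.shape[0], size=s, p=p)
    scale = 1.0 / np.sqrt(s * p[idx])   # rescale so the estimator is unbiased
    return (A[idx] * scale[:, None]).T @ (B[idx] * scale[:, None])

rng = np.random.default_rng(2)
A = rng.standard_normal((20000, 20))
B = rng.standard_normal((20000, 20))
exact = A.T @ B
approx = approx_matmul(A, B, s=5000)
# expected Frobenius-norm error is at most ||A||_F ||B||_F / sqrt(s)
rel_err = np.linalg.norm(approx - exact) / (np.linalg.norm(A) * np.linalg.norm(B))
print(rel_err)
```

Note the runtime depends on the number of samples \(s\), not on the (large) outer dimension, which is what makes the primitive sublinear.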

## 2020-01-28. Quantum-inspired classical linear algebra. Simons Institute, The Quantum Wave in Computing Boot Camp.

We discuss a recent line of work on classical linear algebra algorithms inspired by quantum machine learning. Such algorithms show that their corresponding quantum algorithms do not admit exponential speedups. Further, they reveal an interesting connection between QML and the classical sketching and sampling literature, which helps clarify where exactly quantum algorithms gain an advantage over classical ones. We discuss some lemmas central to this line of work, algorithms based on these lemmas, and some open questions in this space.

## 2020-01-06. Quantum-inspired algorithms for recommendation systems, principal component analysis, and supervised clustering. QIP plenary talk.

Slides. (possibly dead link)

## 2019-07-30. Explaining some recent quantum-inspired algorithms. Santa Fe Institute “New Algorithms for Optimization, Sampling, Learning, and Quantum Simulations” workshop.

## 2019-06-03. Building a classical framework to analyze quantum machine learning speedups. TQC plenary talk.

We cover some recent work on “dequantizing” QML algorithms and the set of classical length-squared sampling techniques used to achieve these results. First, we motivate the correspondence between QML with state preparation assumptions and classical ML with sampling assumptions. In the latter setting, we describe an algorithmic toolkit that bears resemblance to performing quantum operations on classical distributions. Second, we describe the capabilities and limitations of these techniques in comparison with quantum computing, by considering their application to low-rank matrix inversion (1811.04909 and 1811.04852), supervised clustering (1811.00414), and recommendation systems (1807.04271). This theory of quantum-like classical sampling algorithms formalizes some intuition informing our evaluation of polylogarithmic-time QML algorithms as “good” or “bad”. By comparing QML algorithms to classical ML algorithms strengthened with sampling assumptions, we can distinguish the speedups artificially created from quantum state preparation assumptions from the speedups that truly exploit the quantumness of the algorithm.

## 2019-05-16. Quantum-inspired classical linear algebra. CIFAR QIS Meeting.

## 2019-05-15. Quantum-inspired classical linear algebra algorithms: Why and how? TCS+.

Abstract: Over the past ten years, the field of quantum machine learning (QML) has produced many polylogarithmic-time procedures for linear algebra routines, assuming certain “state preparation” assumptions. Though such algorithms are formally incomparable with classical computing, a recent line of work uses an analogous classical model of computation as an effective point of comparison to reveal speedups (or lack thereof) gained by QML. The resulting “dequantized” algorithms assume sampling access to input to speed up runtimes to polylogarithmic in input size.

In this talk, we will discuss the motivation behind this model and its relation to the existing randomized linear algebra literature. Then, we will delve into an example quantum-inspired algorithm: Gilyén, Lloyd, and Tang’s algorithm for low-rank matrix inversion. This dequantizes a variant of Harrow, Hassidim, and Lloyd’s matrix inversion algorithm, a seminal work in QML. Finally, we will consider the implications of this work for exponential speedups in QML. No background in quantum computing is assumed for this talk.

## 2019-04-18. Quantum 101: Grover’s Algorithm. UW theory lunch talk.

## 2018-12-11. A Quantum-inspired Recommendation Systems Algorithm. (R,B) MSR AI Seminar.

Over the past ten years, the field of quantum machine learning (QML) has produced many polylogarithmic-time procedures for linear algebra routines, assuming certain “state preparation” assumptions. One of the most prominent examples of these is Kerenidis and Prakash’s quantum recommendation system, which takes user-product preference data in an efficient data structure and outputs recommendations for users in time polylogarithmic in input size, exponentially faster than all previous classical recommendation systems. We discuss a “de-quantization” of their algorithm: a classical algorithm that performs just as well as the quantum algorithm, up to polynomial slowdown. Thus, we exponentially improve the runtime over previous algorithms with a purely classical algorithm. We describe the algorithm’s toolkit of classical length-squared sampling techniques, motivated by a new quantum-inspired paradigm. Finally, we consider the implications of our work for exponential speedups in QML. No background in quantum computing is assumed for this talk.

## 2018-11-29. A classical toolkit for quantum machine learning algorithms. MSR QuArC.

We cover some recent work on “dequantizing” QML algorithms, and the set of classical length-squared sampling techniques used to achieve these results. First, we describe the toolkit, which bears resemblance to performing quantum operations on classical distributions. Second, we describe the capabilities and limitations of these techniques in comparison with quantum computing, by considering their application to low-rank matrix inversion (1811.04909 and 1811.04852), supervised clustering (1811.00414), and recommendation systems (1807.04271). This theory of quantum-like classical sampling algorithms formalizes some intuition informing our evaluation of polylogarithmic-time QML algorithms as “good” or “bad”. By comparing QML algorithms to classical ML algorithms strengthened with sampling assumptions, we can distinguish the speedups created from quantum state preparation assumptions from the speedups that rely in part on the quantumness of the algorithm.