On the Role of Entanglement and Statistics in Learning

Event Type: Seminar/Symposium
Sponsor: Illinois Computer Science
Location: Siebel Center 3401
Date: Oct 2, 2023, 10:00 am
Speaker: Louis Schatzki
Contact: Abigail Barrett
E-Mail: abigailb@illinois.edu
Phone: 217-244-0337
Originating Calendar: Computer Science Speakers Calendar

On Monday the theory seminar will be given by our own Louis Schatzki, who will be speaking on quantum learning theory.

The repair work on our usual room, Siebel 3401, is finished, so we will resume meeting there on Monday and for the rest of the term.

In this work we make progress in understanding the relationship between learning models with access to entangled measurements, separable measurements, and statistical measurements in the quantum statistical query (QSQ) model. To this end, we show the following results.
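
As background, and roughly stated: in the QSQ model as it is usually defined, the learner does not measure quantum examples directly but instead queries an oracle. On a query specified by an observable M with \|M\| <= 1 and a tolerance \tau > 0, the oracle returns some value \alpha satisfying

|\alpha - <\psi_f| M |\psi_f>| <= \tau,

where |\psi_f> = 1/\sqrt{2^n} \sum_x |x, f(x)> is the quantum example state appearing in the first item below.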

Entangled versus separable measurements. The goal here is to learn an unknown f from the concept class C \subseteq {f : {0,1}^n -> [k]}, given copies of the state 1/\sqrt{2^n} \sum_x |x, f(x)>. We show that, if T copies suffice to learn f using entangled measurements, then O(nT^2) copies suffice to learn f using just separable measurements.
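
As a concrete illustration (a minimal sketch, not code from the paper), the following NumPy snippet builds the 2^n * k dimensional vector for this quantum example state for a toy concept; the helper example_state and the parity concept toy_f are made up for illustration.

import numpy as np

def example_state(f, n, k):
    # Build 1/\sqrt{2^n} \sum_x |x, f(x)> as a real vector of length 2^n * k.
    dim_x = 2 ** n
    psi = np.zeros(dim_x * k)
    for x in range(dim_x):
        # In the tensor-product basis, |x, f(x)> is the basis vector at index x*k + f(x).
        psi[x * k + f(x)] = 1.0
    return psi / np.sqrt(dim_x)

def toy_f(x):
    # Toy concept with k = 2: the parity of the bits of x.
    return bin(x).count("1") % 2

psi = example_state(toy_f, n=3, k=2)
print(np.dot(psi, psi))  # ~1.0, i.e. the state is normalized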

Entangled versus statistical measurements. The goal here is to learn a function f \in C given access to separable and statistical measurements. We exhibit a class C that gives an exponential separation between QSQ learning and quantum learning with entangled measurements (even in the presence of noise). This proves the "quantum analogue" of the seminal result of Blum et al. [BKW'03] that separates classical SQ learning from PAC learning with classification noise.

QSQ lower bounds for learning states. We introduce a quantum statistical query dimension (QSD), which we use to give lower bounds on QSQ learning. With this, we prove superpolynomial QSQ lower bounds for testing purity, shadow tomography, the Abelian hidden subgroup problem, degree-2 functions, planted bi-clique states, and output states of Clifford circuits of depth polylog(n).

Further applications. We give an unconditional separation between weak and strong error mitigation and prove lower bounds for learning distributions in the QSQ model. Prior works by Quek et al. [QFK+'22], Hinsche et al. [HIN+'22], and Nietner et al. [NIS+'23] proved analogous results assuming diagonal measurements; our work removes this assumption.

Joint work with Srinivasan Arunachalam and Vojtech Havlicek: https://arxiv.org/abs/2306.03161

-Michael

https://publish.illinois.edu/theory-cs/theory-seminar/theory-seminar-fall-2023/
