BARC Talk by Yi Li

The $\ell_p$-Subspace Sketch Problem

Suppose that one is given a $d$-dimensional subspace, represented as the column span of an $n \times d$ matrix $A$ with $O(\log(nd))$-bit entries, and would like to compress it, in an arbitrary way, into a data structure $Q_p$ such that, simultaneously for every query $x \in \mathbb{R}^d$, one has $Q_p(x) = (1 \pm \epsilon) \|Ax\|_p$. How many bits are required to store $Q_p$? This problem has applications to the memory required by streaming algorithms for regression in the row-update model, to SVM point queries, and to embedding subspaces of $L_p$ in functional analysis. A major open question is the dependence of the size on the approximation factor $\epsilon$.
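To make the query guarantee concrete, here is a minimal NumPy sketch of the problem setup (the names, sizes, and parameters are illustrative and not from the talk). It computes the exact value $\|Ax\|_p$ that any valid data structure $Q_p$ must approximate within a $(1 \pm \epsilon)$ factor; of course, storing $A$ itself and answering exactly is the trivial, uncompressed solution.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, p, eps = 100, 5, 1.5, 0.1  # illustrative parameters

# The subspace is given as the column span of an n x d matrix A
# with small integer (hence O(log(nd))-bit) entries.
A = rng.integers(-10, 11, size=(n, d)).astype(float)

def exact_query(x, p=p):
    """Exact ||Ax||_p; any valid sketch Q_p must return a value
    within (1 - eps) and (1 + eps) times this quantity."""
    return np.sum(np.abs(A @ x) ** p) ** (1.0 / p)

x = rng.standard_normal(d)
val = exact_query(x)
# Any valid answer q of Q_p must satisfy:
#   (1 - eps) * val <= q <= (1 + eps) * val
```

The question studied in the talk is how few bits a structure achieving this guarantee can occupy, as a function of $d$ and $\epsilon$.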

We assume that $p\geq 1$ is not an even integer, since otherwise $\|Ax\|_p^p$ is a polynomial of degree $p$ in $x$ and the size of $Q_p$ can be made independent of $\epsilon$. When $d = \Omega(\log(1/\epsilon))$, we show a lower bound of $\widetilde{\Omega}(\epsilon^{-2} d)$ bits, which is optimal up to logarithmic factors. When $d$ is a constant, we show a lower bound of $\Omega(\epsilon^{-\frac{2(d-1)}{d+2p}})$ bits, which is optimal up to logarithmic factors for odd integers $p$.
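The exclusion of even $p$ can be checked directly: for $p = 4$, say, $\|Ax\|_4^4 = \sum_i \langle a_i, x\rangle^4$ is a fixed degree-$4$ polynomial in $x$, so storing its $O(d^4)$ coefficients answers every query exactly, with no dependence on $\epsilon$. A small NumPy sketch of this observation (illustrative code, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 50, 4
A = rng.integers(-5, 6, size=(n, d)).astype(float)

# For p = 4, ||Ax||_4^4 = sum_i <a_i, x>^4 is a degree-4 polynomial
# in x, determined by the d^4-entry coefficient tensor below; its size
# depends on d but not on any approximation parameter eps.
T = np.einsum('ia,ib,ic,id->abcd', A, A, A, A)

def query_from_tensor(x):
    """Answer ||Ax||_4 exactly using only the stored tensor T."""
    return np.einsum('abcd,a,b,c,d->', T, x, x, x, x) ** 0.25

x = rng.standard_normal(d)
assert np.isclose(query_from_tensor(x), np.linalg.norm(A @ x, ord=4))
```

For non-even $p$, no such finite polynomial representation exists, which is what makes the dependence on $\epsilon$ the central question.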

This talk is based on joint work with Honghao Lin, Ruosong Wang and David Woodruff.