Fast, deterministic and sparse dimensionality reduction

Daniel Dadush, Cristóbal Guzmán, Neil Olver

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review


We provide a deterministic construction of the sparse Johnson-Lindenstrauss transform of Kane & Nelson (J. ACM 2014) which runs, under a mild restriction, in the time necessary to apply the sparse embedding matrix to the input vectors. Specifically, given a set of n vectors in R^d and target error ϵ, we give a deterministic algorithm to compute a {-1, 0, 1} embedding matrix of rank O((ln n)/ϵ^2) with O((ln n)/ϵ) entries per column which preserves the norms of the vectors to within 1 ± ϵ. If NNZ, the number of non-zero entries in the input set of vectors, is Ω(d^2), our algorithm runs in time O(NNZ · (ln n)/ϵ). One ingredient in our construction is an extremely simple proof of the Hanson-Wright inequality for subgaussian random variables, which is more amenable to derandomization. As an interesting byproduct, we are able to derive the essentially optimal form of the inequality in terms of its functional dependence on the parameters.
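To illustrate the shape of the embedding described above, here is a minimal sketch of the standard *randomized* sparse Johnson-Lindenstrauss transform in the style of Kane & Nelson: each column of the m × d matrix carries s = O((ln n)/ϵ) nonzero entries from {-1, 1}, and applying the matrix scaled by 1/√s preserves Euclidean norms up to 1 ± ϵ with high probability. This is not the paper's deterministic construction (which derandomizes such a matrix); all dimensions and the helper name `sparse_jl_matrix` are illustrative choices.

```python
import numpy as np

def sparse_jl_matrix(d, m, s, rng):
    """Random sparse JL matrix: each of the d columns gets s nonzero
    entries, each uniformly ±1, in s distinct rows chosen at random.
    (Illustrative randomized variant, not the deterministic construction.)"""
    S = np.zeros((m, d))
    for j in range(d):
        rows = rng.choice(m, size=s, replace=False)
        S[rows, j] = rng.choice([-1.0, 1.0], size=s)
    return S

rng = np.random.default_rng(0)
d, n, eps = 1000, 50, 0.25
m = int(np.ceil(np.log(n) / eps**2))   # target rank O((ln n)/eps^2)
s = int(np.ceil(np.log(n) / eps))      # O((ln n)/eps) entries per column

S = sparse_jl_matrix(d, m, s, rng)
X = rng.standard_normal((d, n))        # n input vectors in R^d
Y = (S @ X) / np.sqrt(s)               # apply the scaled sparse embedding

# Ratio of embedded norm to original norm; close to 1 for each vector.
ratios = np.linalg.norm(Y, axis=0) / np.linalg.norm(X, axis=0)
```

Note that applying S costs O(NNZ · s) arithmetic operations, since each of the NNZ nonzeros of the input meets s nonzeros per matrix column; the paper's contribution is producing such a matrix deterministically within (essentially) this application time.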

Original language: English
Title of host publication: 29th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018
Publisher: Association for Computing Machinery
Number of pages: 15
ISBN (Electronic): 9781611975031
Publication status: Published - 2018
Event: 29th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018 - New Orleans, United States
Duration: 7 Jan 2018 to 10 Jan 2018


Conference: 29th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018
Country/Territory: United States
City: New Orleans


