
Department of Computer Science and Technology

Date: 
Friday, 19 September, 2025 - 10:00 to 11:00
Speaker: 
Justin Oh (UT Austin)
Venue: 
Computer Laboratory, William Gates Building, Room SS03

A natural model of the sources of randomness one might engineer or obtain in nature is the samplable distribution. Such a distribution is generated by an efficient algorithm or circuit, but may not contain much randomness overall, i.e. may have low entropy. Trevisan and Vadhan [TV00] first introduced this notion and showed, under strong complexity-theoretic assumptions, that it is possible to *extract* from such sources. That is, there is an algorithm (an *extractor*) that converts such a distribution into a nearly uniform one, which can then be used in applications such as randomized algorithms and cryptography.
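For concreteness, the standard formalisation reads roughly as follows (the parameters n, m, k and ε below are generic notation, not taken from the abstract): an extractor for samplable distributions is a function

\[
\mathrm{Ext}\colon \{0,1\}^n \to \{0,1\}^m
\quad\text{such that}\quad
\Delta\bigl(\mathrm{Ext}(X),\, U_m\bigr) \le \varepsilon
\]

for every distribution X on {0,1}^n that is the output distribution of a polynomial-size circuit on a uniformly random input and has min-entropy H_\infty(X) \ge k, where \Delta denotes statistical distance and U_m the uniform distribution on {0,1}^m.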

There has recently been renewed interest in this problem, with various results building on the initial techniques of [TV00]. The two main questions to address are:
What is the weakest complexity theoretic assumption required to construct the extractor?
What is the smallest amount of entropy required of the original distribution?
In this work, we present a new technique for constructing extractors for samplable distributions that works for essentially the lowest possible entropy of the original distribution, and under a "qualitatively" weaker (albeit technically incomparable) assumption than previous works. Our work introduces a connection between this problem and the breakthrough construction of two-source extractors by Chattopadhyay and Zuckerman. The approach also provides a novel and direct application of pseudorandom generators, a ubiquitous tool in areas such as derandomization.
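As a point of reference (standard definition, not taken from the abstract), a two-source extractor in the sense of Chattopadhyay and Zuckerman is a function

\[
\mathrm{2Ext}\colon \{0,1\}^n \times \{0,1\}^n \to \{0,1\}^m
\quad\text{with}\quad
\Delta\bigl(\mathrm{2Ext}(X,Y),\, U_m\bigr) \le \varepsilon
\]

for all independent distributions X and Y, each with min-entropy at least k; their breakthrough construction achieves this for min-entropy polylogarithmic in n.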

Seminar series: 
Algorithms and Complexity Seminar
