Ben-Sasson, Eli, Oded Goldreich, Prahladh Harsha, Madhu Sudan, and Salil Vadhan. “Robust PCPs of proximity, shorter PCPs, and applications to coding.” SIAM Journal on Computing: Special Issue on Randomness and Complexity 36, no. 4 (2006): 889–974.
Version History: Extended abstract in STOC '04.
We continue the study of the tradeoff between the length of probabilistically checkable proofs (PCPs) and their query complexity, establishing the following main results (which refer to proofs of satisfiability of circuits of size \(n\)):

We present PCPs of length \(\exp(o(\log \log n)^2) \cdot n\) that can be verified by making \(o(\log \log n)\) Boolean queries.

For every \(\varepsilon > 0\), we present PCPs of length \(\exp(\log^{\varepsilon} n) \cdot n\) that can be verified by making a constant number of Boolean queries.
In both cases, false assertions are rejected with constant probability (which may be set to be arbitrarily close to 1). The multiplicative overhead on the length of the proof, introduced by transforming a proof into a probabilistically checkable one, is just quasi-polylogarithmic in the first case (of query complexity \(o(\log \log n)\)), and is \(2^{(\log n)^{\varepsilon}}\), for any \(\varepsilon > 0\), in the second case (of constant query complexity). Our techniques include the introduction of a new variant of PCPs that we call “robust PCPs of proximity.” These new PCPs facilitate proof composition, which is a central ingredient in the construction of PCP systems. (A related notion and its composition properties were discovered independently by Dinur and Reingold.) Our main technical contribution is a construction of a “length-efficient” robust PCP of proximity. While the new construction uses many of the standard techniques used in PCP constructions, it does differ from previous constructions in fundamental ways, and in particular does not use the “parallelization” step of Arora et al. [J. ACM, 45 (1998), pp. 501–555]. The alternative approach may be of independent interest. We also obtain analogous quantitative results for locally testable codes. In addition, we introduce a relaxed notion of locally decodable codes and present such codes mapping \(k\) information bits to codewords of length \(k^{1+\varepsilon}\) for any \(\varepsilon > 0\).
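The constant-query regime rests on local testing: a verifier reads only a few positions and rejects false claims with constant probability. As a toy illustration of that paradigm (not this paper's construction), the classic Blum-Luby-Rubinfeld linearity test checks a Boolean function with three queries per trial; a minimal Python sketch, with all names hypothetical:

```python
import random

def blr_test(f, n, trials=100, rng=None):
    """Blum-Luby-Rubinfeld linearity test: three queries per trial.

    Accepts iff f(x) XOR f(y) == f(x XOR y) on every sampled pair;
    a function far from linear is rejected with constant probability
    per trial, so the error drops exponentially in `trials`.
    """
    rng = rng or random.Random(0)
    for _ in range(trials):
        x, y = rng.getrandbits(n), rng.getrandbits(n)
        if f(x) ^ f(y) != f(x ^ y):
            return False  # caught an inconsistency: reject
    return True

def parity(mask):
    """A truly linear function over GF(2): parity of the bits in `mask`."""
    return lambda x: bin(mask & x).count("1") % 2

majority = lambda x: int(bin(x).count("1") > 8)  # far from linear on 16 bits

print(blr_test(parity(0b1011), 16))  # True: linear functions always pass
print(blr_test(majority, 16))        # False: rejected with overwhelming probability
```

The test queries only three bits of the truth table per trial, which is the "constant number of Boolean queries" flavor that full PCP constructions achieve for arbitrary circuit satisfiability.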
Healy, Alex, Salil Vadhan, and Emanuele Viola. “Using nondeterminism to amplify hardness.” SIAM Journal on Computing: Special Issue on STOC '04 35, no. 4 (2006): 903–931.
We revisit the problem of hardness amplification in \(\mathcal{NP}\) as recently studied by O’Donnell [J. Comput. System Sci., 69 (2004), pp. 68–94]. We prove that if \(\mathcal{NP}\) has a balanced function \(f\) such that any circuit of size \(s(n)\) fails to compute \(f\) on a \(1/\mathrm{poly}(n)\) fraction of inputs, then \(\mathcal{NP}\) has a function \(f'\) such that any circuit of size \(s'(n) = s(\sqrt{n})^{\Omega(1)}\) fails to compute \(f'\) on a \(1/2 - 1/s'(n)\) fraction of inputs. In particular,

if \(s(n) = n^{\omega(1)}\), we amplify to hardness \(1/2 - 1/n^{\omega(1)}\);

if \(s(n) = 2^{n^{\Omega(1)}}\), we amplify to hardness \(1/2 - 1/2^{n^{\Omega(1)}}\);

if \(s(n) = 2^{\Omega(n)}\), we amplify to hardness \(1/2 - 1/2^{\Omega(\sqrt{n})}\).
Our results improve those of O’Donnell, which amplify to hardness \(1/2 - 1/\sqrt{n}\). O’Donnell also proved that no construction of a certain general form could amplify beyond \(1/2 - 1/n\). We bypass this barrier by using both derandomization and nondeterminism in the construction of \(f'\). We also prove impossibility results demonstrating that both our use of nondeterminism and the hypothesis that \(f\) is balanced are necessary for “black-box” hardness amplification procedures (such as ours).
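For intuition about why combining many independent copies pushes hardness toward \(1/2\), consider the classic XOR-lemma setting (background intuition only, not this paper's nondeterministic construction): if each of \(k\) independent guesses agrees with its bit with probability \(1/2 + \varepsilon\), their XOR agrees with the XOR of the bits with probability only \(1/2 + (2\varepsilon)^k/2\). A small seeded simulation, with all names hypothetical:

```python
import random

def xor_advantage(eps, k):
    """Predicted advantage over 1/2 for the XOR of k independent guesses,
    each correct with probability 1/2 + eps: the correlations 2*eps multiply."""
    return (2 * eps) ** k / 2

def simulated_advantage(eps, k, trials=200_000, seed=0):
    """Empirical advantage: the XOR of the guesses matches the XOR of the
    bits iff an even number of individual guesses are wrong."""
    rng = random.Random(seed)
    correct = sum(
        sum(rng.random() >= 0.5 + eps for _ in range(k)) % 2 == 0
        for _ in range(trials)
    )
    return correct / trials - 0.5

print(xor_advantage(0.2, 3))   # 0.032 = (0.4)**3 / 2
print(simulated_advantage(0.2, 3))
```

A 20% per-copy advantage already collapses to about 3% after XOR-ing three copies; this exponential decay is what drives amplification toward hardness \(1/2\).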
Vadhan, Salil. “An unconditional study of computational zero knowledge.” SIAM Journal on Computing: Special Issue on Randomness and Complexity 36, no. 4 (2006): 1160–1214.
Version History: Extended abstract in FOCS '04.
We prove a number of general theorems about \(\mathbf{ZK}\), the class of problems possessing (computational) zero-knowledge proofs. Our results are unconditional, in contrast to most previous works on \(\mathbf{ZK}\), which rely on the assumption that one-way functions exist. We establish several new characterizations of \(\mathbf{ZK}\) and use these characterizations to prove results such as the following:

Honest-verifier \(\mathbf{ZK}\) equals general \(\mathbf{ZK}\).

Public-coin \(\mathbf{ZK}\) equals private-coin \(\mathbf{ZK}\).

\(\mathbf{ZK}\) is closed under union.

\(\mathbf{ZK}\) with imperfect completeness equals \(\mathbf{ZK}\) with perfect completeness.

Any problem in \(\mathbf{ZK}\) \(\cap\) \(\mathbf{NP} \) can be proven in computational zero knowledge by a \(\mathbf{BPP^{NP}}\)prover.

\(\mathbf{ZK}\) with black-box simulators equals \(\mathbf{ZK}\) with general, non-black-box simulators.
The above equalities refer to the resulting class of problems (and do not necessarily preserve other efficiency measures such as round complexity). Our approach is to combine the conditional techniques previously used in the study of \(\mathbf{ZK}\) with the unconditional techniques developed in the study of \(\mathbf{SZK}\), the class of problems possessing statistical zero-knowledge proofs. To enable this combination, we prove that every problem in \(\mathbf{ZK}\) can be decomposed into a problem in \(\mathbf{SZK}\) together with a set of instances from which a one-way function can be constructed.
Reingold, Omer, Luca Trevisan, and Salil Vadhan. “Pseudorandom walks in regular digraphs and the RL vs. L problem.” In Proceedings of the 38th Annual ACM Symposium on Theory of Computing (STOC '06), 457–466. ACM, 2006.
Version History: Preliminary version as ECCC TR05-022, February 2005 (https://eccc.weizmann.ac.il/report/2005/022/; attached as ECCC2005.pdf).
We revisit the general \(\mathbf{RL}\) vs. \(\mathbf{L}\) question, obtaining the following results.

Generalizing Reingold’s techniques to directed graphs, we present a deterministic, logspace algorithm that, given a regular directed graph \(G\) (or, more generally, a digraph with Eulerian connected components) and two vertices \(s\) and \(t\), finds a path between \(s\) and \(t\) if one exists.

If we restrict ourselves to directed graphs that are regular and consistently labelled, then we are able to produce pseudorandom walks for such graphs in logarithmic space (this result has already found an independent application).

We prove that if the second result above could be generalized to all regular directed graphs (including ones that are not consistently labelled), then \(\mathbf{L} = \mathbf{RL}\). We do so by exhibiting a new complete promise problem for \(\mathbf{RL}\), and showing that such a problem can be solved in deterministic logarithmic space given a logspace pseudorandom walk generator for regular directed graphs.
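To make the consistent-labelling condition concrete: in a \(d\)-regular digraph whose out-edges at each vertex carry labels \(1, \dots, d\), the labelling is consistent when every fixed label induces a permutation of the vertices, so a fixed label sequence moves walks from all start vertices in parallel without collisions. A toy check on a hypothetical 2-regular example (not a graph from the paper):

```python
# Hypothetical 2-regular digraph on 4 vertices; G[v][i] is the head of
# the out-edge of v carrying label i.
G = {
    0: [1, 2],
    1: [2, 3],
    2: [3, 0],
    3: [0, 1],
}

def consistently_labelled(G):
    """Consistent labelling: each fixed label i maps v -> G[v][i]
    injectively, i.e. induces a permutation of the vertex set."""
    degree = len(next(iter(G.values())))
    return all(len({G[v][i] for v in G}) == len(G) for i in range(degree))

def walk(G, start, labels):
    """Follow a fixed sequence of edge labels from `start`."""
    v = start
    for i in labels:
        v = G[v][i]
    return v

print(consistently_labelled(G))  # True
# One label sequence drives the four starts to four distinct endpoints:
print(sorted(walk(G, s, [0, 1, 1, 0]) for s in G))  # [0, 1, 2, 3]
```

A pseudorandom walk generator exploits exactly this structure: a short, deterministically generated label sequence can replace truly random walk choices, which is what breaks down when the labelling is inconsistent.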
Micciancio, Daniele, Shien Jin Ong, Amit Sahai, and Salil Vadhan. “Concurrent zero knowledge without complexity assumptions.” In S. Halevi and T. Rabin, eds., Proceedings of the Third Theory of Cryptography Conference (TCC '06), 3876:1–20. New York, NY, USA: Springer Verlag, Lecture Notes in Computer Science, 2006.
Version History: Full version available at https://eccc.weizmann.ac.il/eccc-reports/2005/TR05-093/ (attached as ECCC2005.pdf).
We provide unconditional constructions of concurrent statistical zero-knowledge proofs for a variety of nontrivial problems (not known to have probabilistic polynomial-time algorithms). The problems include Graph Isomorphism, Graph Nonisomorphism, Quadratic Residuosity, Quadratic Nonresiduosity, a restricted version of Statistical Difference, and approximate versions of the (\(\mathsf{coNP}\) forms of the) Shortest Vector Problem and Closest Vector Problem in lattices. For some of the problems, such as Graph Isomorphism and Quadratic Residuosity, the proof systems have provers that can be implemented in polynomial time (given an \(\mathsf{NP}\) witness) and have \(\tilde{O}(\log n)\) rounds, which is known to be essentially optimal for black-box simulation. To the best of our knowledge, these are the first constructions of concurrent zero-knowledge proofs in the plain, asynchronous model (i.e., without setup or timing assumptions) that do not require complexity assumptions (such as the existence of one-way functions).
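For reference, the stand-alone (non-concurrent) zero-knowledge proof for Graph Isomorphism underlying such results is the classic three-move Goldreich-Micali-Wigderson protocol: the prover sends a random relabelling of one input graph, and the verifier asks it to open the relabelling against either graph. A toy sketch with a hypothetical example instance (this is the textbook protocol, not the paper's concurrent one):

```python
import random

def apply_perm(edges, pi):
    """Relabel each edge {u, v} to {pi[u], pi[v]}."""
    return {frozenset({pi[u], pi[v]}) for u, v in map(tuple, edges)}

def gi_round(G0, G1, pi, n, rng):
    """One round of the classic GMW proof for Graph Isomorphism, where
    G1 = apply_perm(G0, pi). Returns True iff the verifier accepts.
    A cheating prover (no such pi exists) survives each round with
    probability at most 1/2, so soundness error decays per round."""
    sigma = list(range(n))
    rng.shuffle(sigma)
    H = apply_perm(G1, sigma)      # prover commits to a random copy of G1
    b = rng.randrange(2)           # verifier's coin: open against G0 or G1?
    tau = sigma if b == 1 else [sigma[pi[v]] for v in range(n)]
    return apply_perm(G1 if b == 1 else G0, tau) == H

rng = random.Random(0)
n = 5
G0 = {frozenset({i, (i + 1) % n}) for i in range(n)}   # a 5-cycle
pi = [2, 4, 1, 3, 0]                                   # the secret witness
G1 = apply_perm(G0, pi)
print(all(gi_round(G0, G1, pi, n, rng) for _ in range(50)))  # True
```

The verifier only ever sees a random isomorphic copy of a graph it already knows, which is why the protocol leaks nothing; handling many interleaved verifiers at once is the extra difficulty the concurrent setting adds.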
Nguyen, Minh, and Salil Vadhan. “Zero knowledge with efficient provers.” In Proceedings of the 38th Annual ACM Symposium on Theory of Computing (STOC '06), 287–295. ACM, 2006.
We prove that every problem in NP that has a zero-knowledge proof also has a zero-knowledge proof where the prover can be implemented in probabilistic polynomial time given an NP witness. Moreover, if the original proof system is statistical zero knowledge, so is the resulting efficient-prover proof system. An equivalence of zero knowledge and efficient-prover zero knowledge was previously known only under the assumption that one-way functions exist (whereas our result is unconditional), and no such equivalence was known for statistical zero knowledge. Our results allow us to translate the many general results and characterizations known for zero knowledge with inefficient provers to zero knowledge with efficient provers.
Gradwohl, Ronen, Salil Vadhan, and David Zuckerman. “Random selection with an adversarial majority.” In Advances in Cryptology—CRYPTO '06, C. Dwork, ed., 4117:409–426. Springer Verlag, Lecture Notes in Computer Science, 2006.
Version History: Full version published as ECCC TR06-026, February 2006. Updated full version published June 2006.
We consider the problem of random selection, where \(p\) players follow a protocol to jointly select a random element of a universe of size \(n\). However, some of the players may be adversarial and collude to force the output to lie in a small subset of the universe. We describe essentially the first protocols that solve this problem in the presence of a dishonest majority in the full-information model (where the adversary is computationally unbounded and all communication is via non-simultaneous broadcast). Our protocols are nearly optimal in several parameters, including the round complexity (as a function of \(n\)), the randomness complexity, the communication complexity, and the tradeoffs between the fraction of honest players, the probability that the output lies in a small subset of the universe, and the density of this subset.
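To see why a dishonest majority (indeed, even one rushing player) is hard: the naive one-round protocol, where everyone broadcasts a string and the output is their XOR, collapses under non-simultaneous broadcast, since the last player to speak sees all other messages and fixes the output at will. A toy demonstration of that attack (names hypothetical; the paper's protocols are multi-round and far more involved):

```python
import random

def xor_select(contributions):
    """Naive selection: output the XOR of all broadcast strings."""
    out = 0
    for c in contributions:
        out ^= c
    return out

def rushing_adversary(seen, target):
    """A last mover who has seen every other message can choose its own
    contribution so the XOR comes out to any `target` it likes."""
    return xor_select(seen) ^ target

rng = random.Random(0)
honest = [rng.getrandbits(16) for _ in range(4)]   # honest players' messages
adv = rushing_adversary(honest, target=42)
print(xor_select(honest + [adv]))  # 42: the output is fully controlled
```

This is exactly the failure mode that the round-complexity and subset-density trade-offs above are designed to defeat.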
Nguyen, Minh-Huyen, Shien Jin Ong, and Salil Vadhan. “Statistical zero-knowledge arguments for NP from any one-way function.” In Proceedings of the 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS '06), 3–13. IEEE, 2006.
Version History: Merged with the STOC '07 paper of Haitner and Reingold. Also available as a journal version. Full version invited to the SIAM Journal on Computing Special Issue on FOCS '06.
We show that every language in NP has a statistical zero-knowledge argument system under the (minimal) complexity assumption that one-way functions exist. In such protocols, even a computationally unbounded verifier cannot learn anything other than the fact that the assertion being proven is true, whereas a polynomial-time prover cannot convince the verifier to accept a false assertion except with negligible probability. This resolves an open question posed by Naor et al. (1998). Departing from previous works on this problem, we do not construct standard statistically hiding commitments from any one-way function. Instead, we construct a relaxed variant of commitment schemes called "1-out-of-2-binding commitments," recently introduced by Nguyen et al. (2006).