
My Research

As a theoretical computer scientist I maintain broad interests spanning algorithms, complexity theory, coding theory, combinatorics and cryptography. Of these, cryptography --- the attempt to reduce nebulous questions of security to concrete problems in computational complexity theory --- has caught my imagination the most.

Below is an introduction to my work. It does not include papers currently under submission, or work in progress.

Network-Aware Security

Other Works in Cryptography

Works outside Cryptography

Network-Aware/Universally Composable Security

Network-Aware/Universally Composable Security is the most advanced notion of protocol security that theoretical cryptography has developed. It takes into account the fact that the protocols being analyzed are executed in a network consisting of many parties running multiple other protocols and programs. This is in contrast to earlier notions, which were developed for simplistic "stand-alone" models. My most important works deal with, and indeed develop, some of the notions of Network-Aware Security.

(A word about the terminology: in order to distinguish two separate notions that were being considered together under the name "Universally Composable security," in [PS04,PS05] we introduced a new name, "environmental security." Network-Aware security is synonymous with environmental security, and is used here only for further clarity.)

Network-Aware/Universally Composable Secure Multi-Party Computation. Multi-Party Computation (MPC) is a powerful generalization of virtually all interesting distributed cryptographic tasks. The main contribution I have made during my Ph.D. is to reach a milestone in modern cryptography by giving the first secure Multi-Party Computation protocol in the standard model of cryptography that remains secure simultaneously with other secure protocols (universally composable), and even when deployed in an arbitrary network environment (Network-Aware secure) [PS04]. Previous works were either secure only in a stand-alone setting, or required a non-standard model -- "trusted setups" (like a common random string sampled by a universally trusted entity) -- or restrictions on the number of participants who could be dishonest (an honest majority). Further, the prior definition used for Network-Aware security -- introduced by Canetti (under the name Universally Composable Security) -- was provably unachievable in the standard model. We develop alternative definitions which capture Network-Aware security, and under which we demonstrate secure MPC protocols in the standard model. Our work also illustrates the usefulness of some novel complexity-theoretic assumptions.

Weaker Network-Aware Security. In a related work [PS05], we introduce weaker notions of Network-Aware security which are sufficient for some tasks. We show that such a level of security can be achieved using much more efficient protocols. The complexity assumptions required in the proof of security are also weaker.

Network-Aware/Universally Composable Secure MPC using Clocks. In [KLP05], we show that if all parties are given access to clocks (with reasonable bounds on their relative drift), then secure MPC protocols can be achieved in the standard model under Canetti's original definition of Network-Aware/Universally Composable security, provided a delay is imposed on all communication other than the communication within a core protocol. The key tool in this work is the use of delays and time-outs to rule out certain kinds of man-in-the-middle attacks.
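To give a concrete feel for this timing discipline, here is a minimal sketch in Python, with purely hypothetical bounds: a reply is accepted only if it arrives within a known time bound (padded for clock drift), so that a message relayed through an extra party arrives too late and is rejected. This only illustrates the idea of delays and time-outs; it is not the [KLP05] protocol.

    import time

    # Illustration only -- not the [KLP05] protocol. The bounds are hypothetical;
    # the point is that honest replies meet the deadline, while replies relayed
    # by a man-in-the-middle arrive too late and are rejected.
    ROUND_TRIP_BOUND = 0.5   # assumed bound on an honest round trip (seconds)
    DRIFT_ALLOWANCE = 0.05   # assumed bound on relative clock drift

    def timed_exchange(send, receive):
        start = time.monotonic()
        send("challenge")
        reply = receive()                     # blocks until a reply arrives
        elapsed = time.monotonic() - start
        if elapsed > ROUND_TRIP_BOUND + DRIFT_ALLOWANCE:
            raise TimeoutError("reply arrived too late; rejecting it")
        return reply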

Other Works in Cryptography

Concurrent Zero Knowledge. Earlier, we investigated questions arising out of more restricted, traditional notions of composition (as opposed to Universal Composition) for a specific but vital cryptographic tool called zero knowledge. In the late 1990s, Dwork, Naor and Sahai introduced a limited composition setting called concurrent composition, and addressed the problem of designing and proving the security of concurrent zero knowledge proofs. A sequence of works gave increasingly efficient protocols for Concurrent Zero Knowledge proofs. We improved on the previous works and gave the most efficient protocol [PRS02] (an earlier version appears as [PS02]); the previous best protocol used quadratically more rounds than ours. Further, under the currently well-understood methods of proving security, our result is nearly optimal, and improving beyond this limit could be considered a breakthrough result.

Software Obfuscation. Software obfuscation refers to making the internals of a program unintelligible to a hacker. The motivation is to prevent a hacker from learning anything from the program other than what he or she could legitimately learn by using it. It is an important problem which relates to software piracy and reverse-engineering on the one hand, and on the other offers the possibility of new secure protocols for various cryptographic tasks. In practice, various heuristics for software obfuscation are widely employed in applications.

Recently, the security of obfuscation was defined formally and examined theoretically. Unfortunately, that work showed that not all software can be obfuscated (under a very natural definition). This leaves us with the challenge of identifying useful software which can be obfuscated, and developing ways to carry out this obfuscation. We have shown the first positive results in obfuscation [LPS04], by obfuscating a variety of access control mechanisms, albeit in a somewhat unrealistic model called the random oracle model. In the process we develop some tools and techniques for designing and analyzing obfuscations. These results provide insights into some of the elementary properties of obfuscation. We believe that these first results in theoretically provable obfuscation will lead to more significant ones in the future.
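To convey the flavor of such access-control obfuscations, here is a minimal Python sketch of obfuscating a password check in the random-oracle idiom, with SHA-256 standing in for the random oracle. It is only an illustration of the kind of object involved, not the construction or security analysis of [LPS04].

    import hashlib
    import os

    # Illustration only -- SHA-256 stands in for a random oracle; this is not
    # the [LPS04] construction. The "obfuscated" checker stores a salted hash
    # of the password rather than the password itself, yet decides access
    # exactly as the original program would.
    def obfuscate_password_check(password: str):
        salt = os.urandom(16)
        digest = hashlib.sha256(salt + password.encode()).digest()
        def check(candidate: str) -> bool:
            return hashlib.sha256(salt + candidate.encode()).digest() == digest
        return check

    guard = obfuscate_password_check("s3cret")
    print(guard("guess"))   # False
    print(guard("s3cret"))  # True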

Imperfect Randomness in Cryptography. Theoretical analysis of cryptographic protocols assumes the use of perfectly random sources. In practice, such sources are not available. In a recent work [DOPS04], we explore this gap. Our results show that the intuitive justification for replacing perfect random sources with high entropy-rate sources is not theoretically sound: we rule out protocols for even basic cryptographic tasks like encryption (let alone MPC) that can guarantee security when the only source of randomness is a high entropy-rate source.

Works outside Cryptography

Grammar Representation of a String. In [C+02] we developed and analyzed an efficient algorithm to compactly represent a string of text by a context-free grammar. We showed that the representation generated by our algorithm is nearly optimal in terms of its size. The algorithm has potential applications in string compression, pattern recognition, etc. It can be used to obtain a "universal source coding" scheme, which is of great interest in information theory.
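To illustrate what a grammar representation of a string looks like, here is a simple greedy stand-in in Python (not the algorithm analyzed in [C+02]): it repeatedly replaces the most frequent adjacent pair of symbols with a fresh nonterminal, so a repetitive string collapses to a short start string plus a handful of grammar rules.

    from collections import Counter

    # Illustrative stand-in, not the algorithm of [C+02]: repeatedly replace the
    # most frequent adjacent pair of symbols with a fresh nonterminal, yielding
    # a context-free grammar that derives exactly the input string.
    def grammar_compress(text: str):
        seq = list(text)
        rules = {}
        while True:
            pairs = Counter(zip(seq, seq[1:]))
            if not pairs:
                break
            pair, count = pairs.most_common(1)[0]
            if count < 2:
                break
            nt = "R%d" % len(rules)           # fresh nonterminal name
            rules[nt] = pair
            new_seq, i = [], 0
            while i < len(seq):
                if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                    new_seq.append(nt)
                    i += 2
                else:
                    new_seq.append(seq[i])
                    i += 1
            seq = new_seq
        return seq, rules

    start, rules = grammar_compress("abcabcabcabc")
    print(start, rules)   # a two-symbol start string and three small rules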

Broadcasting and Gossiping in a Radio Network. In another work [LP02], we investigated a model of communication called the radio network. The abstraction provides a robust model for developing algorithms for networks with unknown (or frequently changing) topology and little or no collision detection in the communication channels. It is an old and well-studied area, with some recent results that answer outstanding open problems. We showed how one of the most important lower-bound results in this model can be proven using simpler arguments, thus providing a proof which is easier to understand and verify. In the same work, we improved on previous results and gave what was at the time the fastest (randomized) algorithm for gossiping in such networks; it has since been improved upon.

Earlier Research. I have also worked on interesting problems in computational geometry [P00a], Monte Carlo methods [P00b] and linear programming [P01].

