A view from the cheap seats on the second workshop (April 10–12, 2019)
The first ZKProof standards workshop in May 2018 had around 75 participants and used a workshop format throughout. It had three tracks: one on the security of the protocols, another on implementations, and a third on applications. Standards documents from these three tracks were produced and were discussed at some length during the second workshop. This article is about the second workshop.
In the meantime, the participants did not stay idle. They created a corpus of basic standards material and are working on a process for advancing the standards work, including governance of attribution, patenting guidelines, persuasion strategies to encourage adoption, and concrete proposals addressing some of the sub-problems thrown up during the first workshop. These activities culminated in a second workshop, held April 10 through 12, 2019.
We had presentations from many people: Shafi Goldwasser, Ran Canetti, Jens Groth, Yael Kalai, Alessandro Chiesa, Mariana Raykova, Amit Sahai, Yuval Ishai, abhi shelat, Muthu Venkatasubramaniam, Sean Bowe, Madars Virza, Eran Tromer, Dan Boneh, Dario Fiore, Howard Wu, Nick Sullivan, Shiri Lemel, Henry de Valence, Cathie Yun, Andrew Poelstra, Izaak Meckler, Jordi Baylina, Hitarshi Buch, Ori Wallenstein, Shashank Agrawal, Barry Whitehat, Nick Spooner, Rene Peralta, Burt Kaliski, Eduardo Moraes, Carlos Kuchovsky, Mike Hearn, Jonathan Levi, Mary Maller, Luís Brandão, Stefan Deml, Ahmed Kosba, Ariel Gabizon, Hugo Krawczyk, Justin Thaler, Daniel Benarroch, Aviv Zohar. This is just the list of official speakers; there were also many informed comments from the audience.
Each of these presentations could take weeks of study, and in some cases years, before one understood the details; hence, in my mind, I grouped them into fundamental results, tooling and utilities, implementations, and finally applications. In addition, there is the process of standardization, mainly driven by Luís Brandão (NIST) and Daniel Benarroch, ably joined by the moderators of the proposal sessions. All of the material from the workshop is expected to be made public; many slides are already available, and the recordings are available now; a valuable resource.
As a non-cryptographer, I often do not understand the mathematical details of what is being discussed. I do feel a tremendous sense of wonder, and I do understand the basics of zkp; it is the nuances that often escape me. Nonetheless I persist, mostly in thinking about the applicability of zkp in the real world and in probing further the attributes, security guarantees and evolution of the protocols. With the implementations I am more comfortable, once I take the math for granted. I also like to help spread the word and build a community around zkp. To that end, I keep working to understand the concepts better so that I can communicate them to the lay person when I can.
Zero Knowledge Proofs
Zero knowledge proofs now enjoy wide acceptance and recognition, but when they were first revealed in a paper by Goldwasser, Micali and Rackoff, the “founding paper” of the discipline, they were noticed by only a few people in the academic community, and not always positively, as is often the case with paradigm-shifting theories. The abstract of the paper gives this definition: “Zero-knowledge proofs are defined as those proofs that convey no additional knowledge other than the correctness of the proposition in question.” The intriguing combination of privacy and proof embodied in this definition is what attracts most people.
The field took a while to take off. Shafi Goldwasser gave a presentation at the first ZKProof standards workshop recounting the history of the growth and setbacks of the field, from the writing of the first paper in 1985 to its acceptance in 1989 and the numerous revisions it underwent along the way.
There are many expositions of the ideas of zkp appropriate to a layperson. The defiles, dells, mountain ranges, rivers and secret bosky woods of a new continent are revealed as the more intrepid explorers, enchanted by this new land, spread out and push the frontiers of insight. New papers continue to gush forth. As we see below, they are aimed at the various attributes of zero knowledge proof systems, and at making them more tractable in general through techniques such as recursion and composition.
Zero knowledge proofs, as laid out in the initial paper, allow a Prover, Peggy, to convey the truth of a statement to a Verifier, Victor, without revealing anything more than the truth of the statement. (Peggy and Victor are the conventional names for the Prover and the Verifier.) The three qualities of such a protocol are completeness, soundness and zero-knowledgeness. The initial probabilistic proofs proceed through a series of interactions between Peggy and Victor. The soundness and completeness properties of classical mathematical logic are mapped onto the zero knowledge protocol. In ZKP, the completeness property makes it possible for an honest Peggy, who knows a “witness” to the statement to be proven, to convince Victor of it. By the soundness property, Peggy cannot convince Victor if the statement is false. Zero-knowledgeness asserts that Victor learns nothing beyond the fact that the statement is true. Interactivity is the process by which Victor produces a series of challenges that Peggy must respond to; each round of successful challenge and response increases Victor's confidence in the truth of the statement to be proven.
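The challenge-response loop can be made concrete with a toy sigma protocol. The sketch below is my own illustration, not anything presented at the workshop: Peggy proves knowledge of a discrete log x with y = g^x, Victor's challenge is a single coin flip, and repeating the round drives a cheater's success probability down by half each time. The parameters and helper names (`run_round`, the Mersenne prime modulus) are illustrative only; a real protocol uses a carefully chosen group.

```python
import secrets

# Toy interactive proof of knowledge of a discrete log (a Schnorr-style
# sigma protocol with 1-bit challenges). Parameters are illustrative,
# NOT secure: a real deployment uses a large prime-order group.
p = 2**61 - 1                  # a Mersenne prime, our toy modulus
g = 3                          # assumed generator, for illustration
x = secrets.randbelow(p - 1)   # Peggy's secret witness
y = pow(g, x, p)               # public statement: "I know x with g^x = y"

def run_round():
    # Commit: Peggy picks a fresh random nonce and sends t = g^r.
    r = secrets.randbelow(p - 1)
    t = pow(g, r, p)
    # Challenge: Victor flips a coin.
    c = secrets.randbelow(2)
    # Response: Peggy answers s = r + c*x (mod p-1).
    s = (r + c * x) % (p - 1)
    # Verify: g^s must equal t * y^c, which only holds if Peggy can
    # answer both possible challenges, i.e. actually knows x.
    return pow(g, s, p) == (t * pow(y, c, p)) % p

# Each successful round halves a cheating prover's chance of surviving:
# after n rounds the soundness error is at most 2^-n.
assert all(run_round() for _ in range(40))
```

Completeness is the honest run always passing; soundness is the 2^-n bound; zero-knowledgeness comes from the fact that the transcript (t, c, s) can be simulated without x.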
Each of these properties can be further analyzed by varying the abilities of the participants, the truth to be conveyed, the size of the proof data, and the burden of computation. The completeness property, for example, has been analyzed for a Peggy who is computationally bounded through to a Peggy who has unlimited resources, and so it is with the other properties. Victor's abilities also range from computationally limited to unbounded, since the verifiers are the ones who can break zero-knowledgeness. Many of these combinations have been studied and solutions proposed, aided by advances in related fields like homomorphic encryption.
Interactivity is usually removed using the Fiat-Shamir heuristic. Non-interactivity is very important in blockchain-based approaches, where the Proof Carrying Data is deposited on the blockchain and there is no interaction between Peggy and Victor; in fact, Victor is everyperson (Victoria/Victor). This, in addition to the succinctness of the proofs and the lighter demands on the Provers and the Verifiers, resulted in the Zcash protocol for private crypto-currency. The onus on the Provers is much greater, including the proper provisioning of the Common Reference String (CRS), which is a key element in removing interactivity. This also opens up the verifier set to unknown and malicious verifiers with high computational and other powers.
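The Fiat-Shamir idea itself is small enough to sketch: Victor's random challenge is replaced by a hash of the transcript, so Peggy can produce the whole proof alone and anyone can check it later. This is my own minimal sketch of the heuristic applied to a Schnorr-style proof; the function names and tiny parameters are illustrative, not a secure construction.

```python
import hashlib
import secrets

# Fiat-Shamir sketch: the verifier's challenge becomes a hash of the
# commitment and the statement, making the proof non-interactive.
p = 2**61 - 1   # toy prime modulus; NOT a secure parameter choice
g = 3

def challenge(t, y):
    # The "random oracle": hash the transcript so far.
    h = hashlib.sha256(f"{t}:{y}".encode()).digest()
    return int.from_bytes(h, "big") % (p - 1)

def prove(x):
    y = pow(g, x, p)
    r = secrets.randbelow(p - 1)
    t = pow(g, r, p)
    c = challenge(t, y)          # no verifier needed to pick c
    s = (r + c * x) % (p - 1)
    return y, (t, s)             # the proof (t, s) can sit on a chain

def verify(y, proof):
    t, s = proof
    c = challenge(t, y)          # any verifier recomputes the challenge
    return pow(g, s, p) == (t * pow(y, c, p)) % p

y, proof = prove(secrets.randbelow(p - 1))
assert verify(y, proof)
```

Because the challenge is derived from the commitment, Peggy cannot choose it after the fact; that is what stands in for Victor's coin flips once the interaction is gone.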
Helping Provers with their daunting task is DIZK; Howard Wu gave an excellent presentation on the topic. The Prover, which had previously been monolithic, can now be run on a cluster, enabling circuits with many more gates (about 100x over prior art) on commodity clusters. This is the right direction for scaling, in line with infrastructural advances and the move towards cloud computing. Along with R1CS, modularization and other techniques, proofs will become more general and tractable, making them applicable beyond the exchange of value.
Jens Groth laid out the general themes in his presentation, a survey of the field presented as the evolutionary history of pairing-based zkps. There were some other standouts for me among the survey presentations, like Yuval Ishai's Modular Approach to the Efficiency of zkps, where the information-theoretic parts were used as a pivot to analyze the systems.
Nick Sullivan's presentation on the Cloudflare Privacy Pass, a lightweight zkp to be used in a browser, described work that could have tremendous impact, as Cloudflare is used by more than 11 million sites. It solves the puzzle of sharing the fact that a client has completed a captcha at one site with another site, without revealing anything else, thus removing the need to do captchas repeatedly.
I have to mention some other presentations before concluding. One was by Henry de Valence on Merlin, which he created to compose implementations of zkps through the Fiat-Shamir heuristic in software; he built Merlin on the STROBE framework. Besides ensuring the security of the resulting construction, Merlin lets interaction be used to compose non-interactive proofs, and it brings many other advantages to an implementation of zk-SNARKs.
Another excellent presentation, on the work to turn Bulletproofs into ZkVM, a smart contract language based on Bulletproofs, was given by Cathie Yun from Interstellar. The exposition was crystal clear, and the slides and diagrams led us through the construction of the tooling to create a solution.
Another standout for me was a presentation by Dan Boneh on DLOG-based Zero-Knowledge Proofs. It was the construction of quickly updatable accumulators that hit home. On blockchains like Bitcoin, the paper proposes separating data management from mining: a “proof service” provides short proofs of inclusion for the UTXOs to be spent, so that miners can concentrate on mining rather than on data management and searching for the UTXOs to validate transactions. I wonder whether these accumulators, plus the proofs, could be useful for revocation-list management in projects like Hyperledger Indy.
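The core accumulator idea behind such short inclusion proofs can be sketched in a few lines. What follows is my own toy RSA-style accumulator, not the construction from the talk (which is richer, with batching and efficient updates); the modulus is tiny and insecure, and the function names are mine.

```python
# Toy accumulator: elements are mapped to distinct primes, and the
# accumulator is g raised to their product. A membership witness is
# the accumulator computed WITHOUT the member's exponent, so raising
# the witness to the member recovers the accumulator.
N = 3233          # 61 * 53; a real scheme uses a modulus of unknown factorization
g = 2             # base of the accumulator

def accumulate(elements):
    acc = g
    for e in elements:
        acc = pow(acc, e, N)
    return acc

def witness(elements, member):
    # Leave out the member's exponent; this short value is the proof.
    w = g
    for e in elements:
        if e != member:
            w = pow(w, e, N)
    return w

def verify(acc, member, w):
    return pow(w, member, N) == acc

utxos = [3, 5, 7, 11]          # stand-ins for UTXOs mapped to primes
acc = accumulate(utxos)
w = witness(utxos, 7)
assert verify(acc, 7, w)       # an included element verifies
assert not verify(acc, 13, w)  # a non-member does not
```

The appeal for the UTXO (or revocation-list) setting is that the verifier holds only the constant-size accumulator, while the proof service maintains the element set and hands out the short witnesses.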
There were a few real-world use cases beyond shielded crypto-currency. The investigation of Bulletproofs by ING bank, as detailed by Eduardo Moraes, was one of them. The shortness of the proofs and the proving times got ING research interested; there are no production use cases yet, but the bank ran PoCs in its labs. This is more than can be said of other major banks, although we hear even the big banks are interested in this technology; interest that they might back with a few coins from their sackfuls of cash.
Shiri Lemel talked about a use case involving Cuban cigars that had some of us convinced that zkps were operating in the Cuban tobacco farms in the countryside. The solution combined private (shielded) issuance of tokens with privacy-preserving supply chains, from farm to factory rolling, to control counterfeit cigars. It was more of an NDA-preserving technique, as she wove a beautiful story around the search for the real Cohibas and a zkp-based solution to the provenance of the product. Clearly, there is real interest in productionizing zkps in enterprises.
The field is in ferment. Advances are happening on all fronts. Cryptographers are setting up startups and real-world projects; they are no longer in the backwaters of theory. Startups and projects like QEDIT, Algorand, DFINITY, Blockstream and Zcash have paved the way.
What can be expressed in zkps is moving towards generalization: the proofs can be applied to more problems without bespoke, brittle circuit creation. This is helped by tools like Merlin, techniques like modular design and composition, novel ways of dealing with the SRS, and so on. Formal verification and other techniques are increasing the trustworthiness of zero knowledge proofs; we need to communicate this sense of security to the early adopters, regulators and the general public. Efficiency improvements using commit-and-prove, recursion and nesting are making the technology available to a wider range of problems.
Standards efforts are a sign of a maturing field, arising when applications built on top of theory need certainty in order to spur adoption, and when implementations must respect what are called non-functional aspects: namely scale, speed, deployability, interoperation and controlled change. The construction of standards is a balancing act: act too prematurely and decisions get baked into implementations too early (over-specification has the same effect); act too late and adoption becomes problematic as existing bad implementations take over the industry. A good standard has just enough extensibility, and it can nudge regulators towards adopting technologies, since they will be more inclined to back the opinions of objective experts. The combination of the academic cryptography community, the tool and utility builders, early adopters from industry, and bodies like NIST, whose procedural and objective backing is required for a true standard to emerge, is heartening.