Charles River Crypto Day - The Power of Negations in Cryptography

Subscribers: 349,000
Video Link: https://www.youtube.com/watch?v=K-tXhUyEu9s
Duration: 1:12:19
493 views


The study of monotonicity and negation complexity for Boolean functions has been prevalent in complexity theory as well as in computational learning theory, but little attention has been given to it in the cryptographic context. Recently, Goldreich and Izsak (2012) initiated a study of whether cryptographic primitives can be monotone, and showed that one-way functions can be monotone (assuming they exist), but a pseudorandom generator cannot.

In this work, we start by filling in the picture and proving that many other basic cryptographic primitives cannot be monotone. We then initiate a quantitative study of the power of negations, asking how many negations are required. We provide several lower bounds, some of them tight, for various cryptographic primitives and building blocks including one-way permutations, pseudorandom functions, small-bias generators, hard-core predicates, error-correcting codes, and randomness extractors. Among our results, we highlight the following:

i) Unlike one-way functions, one-way permutations cannot be monotone.
ii) We prove that pseudorandom functions require log n − O(1) negations (which is optimal up to the additive term).
iii) Error-correcting codes with optimal distance parameters require log n − O(1) negations (again, optimal up to the additive term).
iv) We prove a general result for monotone functions, showing a lower bound on the depth of any circuit with t negations on the bottom that computes a monotone function f in terms of the monotone circuit depth of f. This result addresses a question posed by Koroth and Sarma (2014) in the context of the circuit complexity of the Clique problem.

Joint work with Siyao Guo, Igor Carboni Oliveira, and Alon Rosen.
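The central notion in the abstract, a monotone Boolean function (one computable by a circuit with no NOT gates, equivalently one where flipping an input bit from 0 to 1 can never flip the output from 1 to 0), can be illustrated with a small brute-force check. This sketch is purely illustrative and not from the talk; the function name `is_monotone` and the example functions are assumptions made here for demonstration:

```python
from itertools import product

def is_monotone(f, n):
    """Brute-force monotonicity check (exponential in n, so only
    suitable for tiny n): f is monotone iff raising any single
    input bit from 0 to 1 never lowers the output."""
    for x in product((0, 1), repeat=n):
        for i in range(n):
            if x[i] == 0:
                y = x[:i] + (1,) + x[i + 1:]  # x with bit i raised
                if f(x) > f(y):
                    return False
    return True

# Majority is a classic monotone function; parity (XOR) is the
# canonical non-monotone one and needs negations to compute.
majority = lambda x: int(sum(x) > len(x) // 2)
parity = lambda x: sum(x) % 2

print(is_monotone(majority, 5))  # True
print(is_monotone(parity, 5))    # False
```

Results such as item ii) above quantify how far beyond this binary distinction a primitive must go: not merely non-monotone, but requiring close to log n negations.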




Other Videos By Microsoft Research


2016-06-21Designing Technology for Mental Health and Wellbeing
2016-06-21DNN-Based Online Speech Enhancement Using Multitask Learning and Suppression Rule Estimation
2016-06-21Computing Reliably with Molecular Walkers
2016-06-21Challenges in automated verification and synthesis for molecular programming
2016-06-21Decompositions of Natural Signals Still Need Fresh Ideas
2016-06-21Provable Non-convex Projections for High-dimensional Learning Problems -Part2
2016-06-21Optimal Design for Social Learning
2016-06-21Introduction to Machine Learning in Python with Scikit-Learn
2016-06-21Lab tutorial: Learning about Shape
2016-06-21Improving Behavioral Health Intervention Technologies: Harnessing the Human Side of Apps
2016-06-21Charles River Crypto Day - The Power of Negations in Cryptography
2016-06-21Scalable Kernel Methods via Doubly Stochastic Gradients
2016-06-21Molecular Tumor Boards: What they are; What they do; What they need
2016-06-21The Contextual Bandits Problem: A New, Fast, and Simple Algorithm
2016-06-21Cell cycle switching by an algorithm
2016-06-21Provable Non-convex Projections for High-dimensional Learning Problems - Part1
2016-06-21Northwest Probability Seminar 2014 - Sampling from the Fomin-Kirillov distribution
2016-06-21Computer Vision - StAR Lecture Series: Object Recognition
2016-06-21Local Deep Kernel Learning for Efficient Non-linear SVM Prediction
2016-06-21An Introduction to Concentration Inequalities and Statistical Learning Theory
2016-06-21Intelligible Machine Learning Models for HealthCare



Tags:
microsoft research
machine learning
deep neural networks