Abstract
A recent breakthrough of Liu and Pass (FOCS'20) shows that one-way functions exist if and only if the (polynomial-)time-bounded Kolmogorov complexity, K^t, is bounded-error hard on average to compute. In this paper, we strengthen this result and extend it to other complexity measures:
 We show, perhaps surprisingly, that the KT complexity is bounded-error average-case hard if and only if there exist one-way functions in constant parallel time (i.e., NC⁰). This result crucially relies on the idea of randomized encodings. Previously, a seminal work of Applebaum, Ishai, and Kushilevitz (FOCS'04; SICOMP'06) used the same idea to show that NC⁰-computable one-way functions exist if and only if logspace-computable one-way functions exist.
 Inspired by the above result, we present randomized average-case reductions among the NC¹-versions and logspace-versions of K^t complexity, and the KT complexity. Our reductions preserve both bounded-error average-case hardness and zero-error average-case hardness. To the best of our knowledge, this is the first reduction between the KT complexity and a variant of K^t complexity.
 We prove tight connections between the hardness of K^t complexity and the hardness of (the hardest) one-way functions. In analogy with the Exponential-Time Hypothesis and its variants, we define and motivate the Perebor Hypotheses for complexity measures such as K^t and KT. We show that a Strong Perebor Hypothesis for K^t implies the existence of (weak) one-way functions of near-optimal hardness 2^{n-o(n)}. To the best of our knowledge, this is the first construction of one-way functions of near-optimal hardness based on a natural complexity assumption about a search problem.
 We show that a Weak Perebor Hypothesis for MCSP implies the existence of one-way functions, and establish a partial converse. This is the first unconditional construction of one-way functions from the hardness of MCSP over a natural distribution.
 Finally, we study the average-case hardness of MKtP. We show that it characterizes cryptographic pseudorandomness in one natural regime of parameters, and complexity-theoretic pseudorandomness in another natural regime.
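To make the Perebor ("exhaustive search") paradigm concrete, here is a hedged toy sketch, not the paper's formal definition: descriptions are Python expressions over a tiny alphabet, `eval` plays the role of the universal machine, and the time bound t of the real K^t definition is omitted. The names `ALPHABET` and `kt_toy` are illustrative inventions.

```python
from itertools import product

# Toy "Perebor" (exhaustive search) for a Kolmogorov-style complexity.
# Descriptions are Python expressions over a tiny alphabet, and eval()
# stands in for the universal machine.  The real K^t definition fixes
# a universal Turing machine and charges for its running time t; both
# are simplified away here.

ALPHABET = "'01*8"  # a deliberately tiny description language


def kt_toy(x, max_len=8):
    """Brute-force the shortest program (over ALPHABET) evaluating to x."""
    for ell in range(1, max_len + 1):
        for chars in product(ALPHABET, repeat=ell):
            prog = "".join(chars)
            try:
                if eval(prog) == x:   # run the candidate description
                    return ell, prog
            except Exception:
                continue              # ill-formed program; skip it
    return None
```

For example, `kt_toy("00000000")` finds a 5-character description (such as `'0'*8`) rather than the 10-character literal. The search over descriptions of length ℓ takes |ALPHABET|^ℓ evaluations, i.e., time exponential in the description length; the Perebor Hypotheses assert, roughly, that no algorithm does much better than such exhaustive search on average.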
BibTeX Entry
@InProceedings{ren_et_al:LIPIcs.CCC.2021.35,
author = {Ren, Hanlin and Santhanam, Rahul},
title = {{Hardness of KT Characterizes Parallel Cryptography}},
booktitle = {36th Computational Complexity Conference (CCC 2021)},
pages = {35:1--35:58},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-193-1},
ISSN = {1868-8969},
year = {2021},
volume = {200},
editor = {Kabanets, Valentine},
publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
address = {Dagstuhl, Germany},
URL = {https://drops.dagstuhl.de/opus/volltexte/2021/14309},
URN = {urn:nbn:de:0030-drops-143091},
doi = {10.4230/LIPIcs.CCC.2021.35},
annote = {Keywords: one-way function, meta-complexity, KT complexity, parallel cryptography, randomized encodings}
}
Keywords: one-way function, meta-complexity, KT complexity, parallel cryptography, randomized encodings
Collection: 36th Computational Complexity Conference (CCC 2021)
Issue Date: 2021
Date of publication: 08.07.2021