2022
Authors
Buhrman, H; Loff, B; Patro, S; Speelman, F;
Publication
13th Innovations in Theoretical Computer Science Conference, ITCS 2022, January 31 - February 3, 2022, Berkeley, CA, USA.
Abstract
Many computational problems are subject to a quantum speed-up: one might find that a problem having an O(n³)-time or O(n²)-time classical algorithm can be solved by a known O(n^1.5)-time or O(n)-time quantum algorithm. The question naturally arises: how much quantum speed-up is possible? The area of fine-grained complexity allows us to prove optimal lower bounds on the complexity of various computational problems, based on the conjectured hardness of certain natural, well-studied problems. This theory has recently been extended to the quantum setting, in two independent papers by Buhrman, Patro and Speelman [7], and by Aaronson, Chia, Lin, Wang, and Zhang [1]. In this paper, we further extend the theory of fine-grained complexity to the quantum setting. A fundamental conjecture in the classical setting states that the 3SUM problem cannot be solved by (classical) algorithms in time O(n^(2−ε)), for any ε > 0. We formulate an analogous conjecture, the Quantum-3SUM-Conjecture, which states that there exist no sublinear O(n^(1−ε))-time quantum algorithms for the 3SUM problem. Based on the Quantum-3SUM-Conjecture, we show new lower bounds on the time complexity of quantum algorithms for several computational problems. Most of our lower bounds are optimal, in that they match known upper bounds, and hence they imply tight limits on the quantum speed-up that is possible for these problems. These results are proven by adapting known classical fine-grained reductions from the 3SUM problem to the quantum setting. This adaptation is not trivial, however, since the original classical reductions require pre-processing the input in various ways, e.g. by sorting it according to some order, and this pre-processing (provably) cannot be done in sublinear quantum time. We overcome this bottleneck by combining a quantum walk with a classical dynamic data structure having a certain "history-independence" property.
This type of construction has been used in the past to prove upper bounds, and here we use it for the first time as part of a reduction. This general proof strategy allows us to prove tight lower bounds on several computational-geometry problems, on Convolution-3SUM and on the 0-Edge-Weight-Triangle problem, conditional on the Quantum-3SUM-Conjecture. We believe this proof strategy will be useful in proving tight (conditional) lower-bounds, and limits on quantum speed-ups, for many other problems.
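For readers unfamiliar with the problem, the classical quadratic-time 3SUM algorithm that the conjecture refers to can be sketched as follows. This is the standard textbook sort-and-scan approach, shown only for context; it is not a construction from the paper.

```python
# 3SUM: given a list of n integers, decide whether some triple sums to zero.
# Classical O(n^2)-time algorithm: sort once, then for each element run a
# two-pointer scan over the remaining elements. The Quantum-3SUM-Conjecture
# discussed above posits that no O(n^(1-eps))-time quantum algorithm exists.

def three_sum(nums):
    a = sorted(nums)
    n = len(a)
    for i in range(n - 2):
        lo, hi = i + 1, n - 1
        while lo < hi:
            s = a[i] + a[lo] + a[hi]
            if s == 0:
                return True      # found a zero-sum triple
            if s < 0:
                lo += 1          # sum too small: advance left pointer
            else:
                hi -= 1          # sum too large: retreat right pointer
    return False
```

Note that the initial sort is exactly the kind of pre-processing that, as the abstract explains, cannot be carried out in sublinear quantum time, which is why the reductions in the paper need a history-independent data structure instead.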
2008
Authors
Beggs, E; Costa, JF; Loff, B; Tucker, JV;
Publication
UNCONVENTIONAL COMPUTATION, PROCEEDINGS
Abstract
In this paper we will try to understand how oracles and advice functions, which are mathematical abstractions in the theory of computability and complexity, can be seen as physical measurements in Classical Physics. First, we consider how physical measurements are a natural external source of information to an algorithmic computation, using a simple and engaging case study, namely: Hoyle's algorithm for calculating eclipses at Stonehenge. Next, we argue that oracles and advice functions can help us understand how the structure of space and time has information content that can be processed by Turing machines. Using an advanced case study from Newtonian kinematics, we show that non-uniform complexity is an adequate framework for classifying feasible computations by Turing machines interacting with an oracle in Nature, and that by classifying the information content of such a natural oracle, using Kolmogorov complexity, we obtain a hierarchical structure based on measurements, advice classes and information.
2010
Authors
Buhrman, H; Fortnow, L; Koucký, M; Loff, B;
Publication
Proceedings of the 25th Annual IEEE Conference on Computational Complexity, CCC 2010, Cambridge, Massachusetts, June 9-12, 2010
Abstract
In this paper we show that BPP is truth-table reducible to the set of Kolmogorov random strings RK. It was previously known that PSPACE, and hence BPP, is Turing-reducible to RK. The earlier proof relied on the adaptivity of the Turing reduction to find a Kolmogorov-random string of polynomial length using the set RK as oracle. Our new non-adaptive result relies on a new fundamental fact about the set RK, namely, that each initial segment of the characteristic sequence of RK has high Kolmogorov complexity. As a partial converse to our claim, we show that strings of very high Kolmogorov complexity, when used as advice, are not much more useful than randomly chosen strings.
2012
Authors
Allender, E; Buhrman, H; Friedman, L; Loff, B;
Publication
Mathematical Foundations of Computer Science 2012 - 37th International Symposium, MFCS 2012, Bratislava, Slovakia, August 27-31, 2012. Proceedings
Abstract
This paper is motivated by a conjecture [All12, ADF+13] that BPP can be characterized in terms of polynomial-time nonadaptive reductions to the set of Kolmogorov-random strings. In this paper we show that an approach laid out in [ADF+13] to settle this conjecture cannot succeed without significant alteration, but that it does bear fruit if we consider time-bounded Kolmogorov complexity instead. We show that if a set A is reducible in polynomial time to the set of time-t-bounded Kolmogorov-random strings (for all large enough time bounds t), then A is in P/poly, and that if in addition such a reduction exists for any universal Turing machine one uses in the definition of Kolmogorov complexity, then A is in PSPACE.
2012
Authors
Ben Amram, AM; Loff, B; Oitavem, I;
Publication
JOURNAL OF LOGIC AND COMPUTATION
Abstract
A celebrated contribution of Bellantoni and Cook was a function algebra to capture FPTIME. This algebra uses recursion on notation. Later, Oitavem showed that by including primitive recursion, an algebra is obtained that captures FPSPACE. The main results of this article concern variants of the latter algebra. First, we show that iteration can replace primitive recursion. Then, we consider the results of imposing a monotonicity constraint on the primitive recursion or iteration. We find that in the case of iteration, the power of the algebra shrinks to FPTIME. More interestingly, with primitive recursion, we obtain a new implicit characterization of the polynomial hierarchy (FPH). The idea to consider these monotonicity constraints arose from the results on write-once tapes for Turing machines. We review this background and also note a new machine characterization of Δ^p_2 that, similarly to our function algebras, arises by combining monotonicity constraints with a known characterization of PSPACE.
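The distinction between recursion on notation and primitive recursion can be illustrated with a toy example (ours, not from the paper): recursion on notation recurses on the binary digits of the input, taking O(log x) steps, whereas primitive recursion takes one step per unit, i.e. O(x) steps.

```python
# Recursion on notation: the recursive call receives x with its last
# binary digit removed (x >> 1), so the recursion depth is the bit-length
# of x, not its magnitude. Toy example: counting the 1-bits of x.

def popcount(x):
    if x == 0:
        return 0                       # base case of the recursion
    return (x & 1) + popcount(x >> 1)  # peel off the last binary digit

# By contrast, a primitive-recursive definition of the same kind would
# recurse on x - 1, taking a number of steps proportional to x itself.
```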
2007
Authors
Costa, JF; Loff, B; Mycka, J;
Publication
COMPUTATION AND LOGIC IN THE REAL WORLD, PROCEEDINGS
Abstract
We show that, using our by now fairly well-established framework for the inductive definition of real-valued functions (work started by Cristopher Moore in [9]), together with ideas and concepts from standard computability theory, we can prove theorems of Analysis. We then consider our ideas as a bridging tool between the standard Theory of Computability (and Complexity) on one side and Mathematical Analysis on the other, making real recursive functions a possible branch of Descriptive Set Theory. What follows is an Extended Abstract directed to a large audience of CiE 2007, Special Session on Logic and New Paradigms of Computability. (Proofs of the statements can be found in a detailed long paper at the address http://fgc.math.ist.utl.pt/papers/iiierarchy.pdf.)