License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.ITCS.2020.53
URN: urn:nbn:de:0030-drops-117382

Lincoln, Andrea ; Polak, Adam ; Vassilevska Williams, Virginia

Monochromatic Triangles, Intermediate Matrix Products, and Convolutions



The most studied linear algebraic operation, matrix multiplication, has surprisingly fast O(n^ω) time algorithms for ω < 2.373. On the other hand, the (min,+) matrix product, which is at the heart of many fundamental graph problems such as All-Pairs Shortest Paths, has received only n^{o(1)}-factor improvements over its brute-force cubic running time and is widely conjectured to require n^{3-o(1)} time. There is a plethora of matrix products and graph problems whose complexity seems to lie in the middle of these two problems. For instance, the Min-Max matrix product, the Minimum Witness matrix product, All-Pairs Shortest Paths in directed unweighted graphs, and determining whether an edge-colored graph contains a monochromatic triangle can all be solved in Õ(n^{(3+ω)/2}) time. While slight improvements are sometimes possible using rectangular matrix multiplication, if ω=2, the best runtimes for these "intermediate" problems are all Õ(n^2.5).
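To make the products concrete, here is a minimal sketch of the cubic brute-force definitions of the (min,+) and Min-Max matrix products discussed above. The function names are illustrative, not from the paper; the Õ(n^{(3+ω)/2})-time algorithms are substantially more involved.

```python
def min_plus_product(A, B):
    """(min,+) product: C[i][j] = min over k of A[i][k] + B[k][j].

    Cubic brute force; this is the product underlying All-Pairs
    Shortest Paths on weighted graphs.
    """
    n = len(A)
    return [[min(A[i][k] + B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def min_max_product(A, B):
    """Min-Max product: C[i][j] = min over k of max(A[i][k], B[k][j]).

    One of the "intermediate" products: solvable in O~(n^{(3+omega)/2})
    time, but no substantially faster algorithm is known.
    """
    n = len(A)
    return [[min(max(A[i][k], B[k][j]) for k in range(n)) for j in range(n)]
            for i in range(n)]
```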
A similar phenomenon occurs for convolution problems. Here, using the FFT, the usual (+,×)-convolution of two n-length sequences can be computed in O(n log n) time, while the (min,+) convolution is conjectured to require n^{2-o(1)} time, the brute-force running time for convolution problems. There are analogous intermediate problems that can be solved in O(n^1.5) time, but seemingly not much faster: Min-Max convolution, Minimum Witness convolution, etc.
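The convolution analogues admit the same kind of quadratic brute force. A minimal sketch (names illustrative, not from the paper):

```python
def min_plus_convolution(a, b):
    """(min,+) convolution: c[k] = min over i+j=k of a[i] + b[j].

    Quadratic brute force; conjectured to require n^{2-o(1)} time.
    """
    n = len(a)
    return [min(a[i] + b[k - i]
                for i in range(max(0, k - n + 1), min(k, n - 1) + 1))
            for k in range(2 * n - 1)]

def min_max_convolution(a, b):
    """Min-Max convolution: c[k] = min over i+j=k of max(a[i], b[j]).

    An "intermediate" convolution: solvable in O(n^1.5) time, but the
    brute force below is quadratic.
    """
    n = len(a)
    return [min(max(a[i], b[k - i])
                for i in range(max(0, k - n + 1), min(k, n - 1) + 1))
            for k in range(2 * n - 1)]
```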
Can one improve upon the running times for these intermediate problems, in either the matrix product or the convolution world? Or, alternatively, can one relate these problems to each other and to other key problems in a meaningful way?
This paper makes progress on these questions by providing a network of fine-grained reductions. We show for instance that APSP in directed unweighted graphs and Minimum Witness product can be reduced to both the Min-Max product and a variant of the monochromatic triangle problem, so that a significant improvement over n^{(3+ω)/2} time for any of the latter problems would result in a similar improvement for both of the former problems. We also show that a natural convolution variant of monochromatic triangle is fine-grained equivalent to the famous 3SUM problem. As this variant is solvable in O(n^1.5) time and 3SUM is in O(n^2) time (and is conjectured to require n^{2-o(1)} time), our result gives the first fine-grained equivalence between natural problems of different running times. We also relate 3SUM to monochromatic triangle, and a coin change problem to monochromatic convolution, and thus to 3SUM.
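For reference, the O(n^2) upper bound for 3SUM cited above is achieved by the classic sort-and-two-pointers algorithm; a self-contained sketch (this is the textbook algorithm, not a construction from the paper):

```python
def has_3sum(nums):
    """Return True iff some triple of entries sums to zero.

    Classic O(n^2) algorithm: sort, then for each index i scan the
    remaining suffix with two pointers. 3SUM is conjectured to
    require n^{2-o(1)} time.
    """
    a = sorted(nums)
    n = len(a)
    for i in range(n - 2):
        lo, hi = i + 1, n - 1
        while lo < hi:
            s = a[i] + a[lo] + a[hi]
            if s == 0:
                return True
            if s < 0:
                lo += 1   # sum too small: move left pointer right
            else:
                hi -= 1   # sum too large: move right pointer left
    return False
```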

BibTeX Entry

@InProceedings{lincoln_et_al:LIPIcs.ITCS.2020.53,
  author =	{Andrea Lincoln and Adam Polak and Virginia Vassilevska Williams},
  title =	{{Monochromatic Triangles, Intermediate Matrix Products, and Convolutions}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{53:1--53:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Thomas Vidick},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{},
  URN =		{urn:nbn:de:0030-drops-117382},
  doi =		{10.4230/LIPIcs.ITCS.2020.53},
  annote =	{Keywords: 3SUM, fine-grained complexity, matrix multiplication, monochromatic triangle}
}

Keywords: 3SUM, fine-grained complexity, matrix multiplication, monochromatic triangle
Collection: 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)
Issue Date: 2020
Date of publication: 06.01.2020
