Complexity Theory for MapReduce Algorithms

Published 19 August 2016, 19:14
For many problems, algorithms implemented in MapReduce or similar two-ranks-of-parallel-tasks systems exhibit a tradeoff between memory size and communication. More precisely, the tradeoff is between "reducer size" (the number of inputs received by the second rank of parallel tasks) and the "replication rate" (the average number of key-value pairs generated by the first rank in response to a single input). We begin with the simple but common "all-pairs" problem, where there is some output associated with every pair of inputs. For this problem, the reducer size and replication rate are seen to vary inversely. We then look at the different relationships that exist between these two parameters for a number of other problems, including (dense) matrix multiplication, similarity joins, and computing marginals.
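The inverse relationship for the all-pairs problem can be made concrete with the standard grouping construction: partition the n inputs into groups of size g and dedicate one reducer to each pair of groups. Each reducer then receives q = 2g inputs, and each input is replicated to the n/g - 1 reducers whose key names its group, so the replication rate r = 2n/q - 1 falls as the reducer size q grows. The sketch below (my own illustration, not code from the talk) simulates this construction and checks that every pair of inputs meets at some reducer:

```python
from itertools import combinations

def all_pairs_mapreduce(inputs, group_size):
    """Partition `inputs` into groups of `group_size`; assign one reducer
    per pair of groups.  Returns the reducer contents, the reducer size q,
    and the replication rate r."""
    groups = [inputs[i:i + group_size]
              for i in range(0, len(inputs), group_size)]
    # Map phase: an input in group gi is sent to every reducer whose key
    # (gi, gj) mentions gi -- that is n/g - 1 reducers per input.
    reducers = {}
    for gi, gj in combinations(range(len(groups)), 2):
        reducers[(gi, gj)] = groups[gi] + groups[gj]
    reducer_size = 2 * group_size        # q = 2g inputs at each reducer
    replication_rate = len(groups) - 1   # r = n/g - 1 = 2n/q - 1
    return reducers, reducer_size, replication_rate

# n = 12 inputs, groups of 3: q = 6 and r = 3.
reducers, q, r = all_pairs_mapreduce(list(range(12)), 3)

# Every one of the C(12, 2) = 66 pairs is covered by some reducer
# (pairs inside one group meet at any reducer containing that group).
covered = set()
for members in reducers.values():
    covered.update(frozenset(p) for p in combinations(members, 2))
assert len(covered) == 12 * 11 // 2
```

Doubling the group size halves the number of groups, roughly halving r while doubling q, which is the inverse variation the abstract describes.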