Information and Interactive Communication

Published August 17, 2016, 0:45
Notions of entropy and information, pioneered by Shannon, have been very powerful tools in coding theory. Coding theory aims to solve the problem of one-way communication: sending a message from Alice to Bob using as little communication as possible, sometimes over a noisy channel. Communication complexity aims to solve the problem of two-way communication: Alice and Bob aim to implement a functionality f that depends on both parties' inputs. We will discuss several extensions of information-theoretic notions to the two-way communication setting. We will use them to prove a direct sum theorem for randomized communication complexity, showing that implementing k copies of a functionality requires substantially more communication than a single copy, partially settling a long-standing open problem. More generally, we will show that the information cost I(f) can be defined as a natural, fundamental property of a functionality f. We will describe several new tight connections between I(f), direct sum theorems, interactive compression schemes, and amortized communication complexity.
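For reference, the information cost I(f) mentioned above is usually formalized through the internal information cost of a protocol. The display below is a sketch of the standard definitions used in this line of work (the precise quantification over input distributions and error parameters varies between papers); the last line is the "information equals amortized communication" characterization, due to Braverman and Rao, which makes the connection to amortized communication complexity precise.

\[
\begin{aligned}
\mathrm{IC}_\mu(\pi) &= I(\Pi ; X \mid Y) + I(\Pi ; Y \mid X),\\
\mathrm{IC}_\mu(f,\varepsilon) &= \inf_{\pi\ \text{computing}\ f\ \text{with error}\ \varepsilon}\ \mathrm{IC}_\mu(\pi),\\
\mathrm{IC}_\mu(f,\varepsilon) &= \lim_{n\to\infty}\frac{D_\mu(f^n,\varepsilon)}{n},
\end{aligned}
\]

where (X, Y) ~ μ are Alice's and Bob's inputs, Π is the protocol transcript (including public randomness), and D_μ(f^n, ε) denotes the distributional communication complexity of computing n independent copies of f with error at most ε per copy. The two conditional mutual information terms measure what each party learns about the other's input from the transcript.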