=== Distributional complexity ===
One approach to studying randomized communication complexity is through distributional complexity. Given a joint distribution <math>\mu</math> on the inputs of both players, the corresponding distributional complexity of a function <math>f</math> is the minimum cost of a ''deterministic'' protocol <math>R</math> such that <math>\Pr[f(x,y) = R(x,y)] \ge 2/3</math>, where the inputs are sampled according to <math>\mu</math>. Yao's minimax principle<ref>{{cite conference |url=https://ieeexplore.ieee.org/document/4567946 |title=Probabilistic computations: Toward a unified measure of complexity |last=Yao |first=Andrew Chi-Chih |author-link=Andrew Yao |date=1977 |publisher=IEEE |book-title=18th Annual Symposium on Foundations of Computer Science (sfcs 1977) |issn=0272-5428 |doi=10.1109/SFCS.1977.24}}</ref> (a special case of [[John von Neumann|von Neumann]]'s [[minimax theorem]]) states that the randomized communication complexity of a function equals its maximum distributional complexity, where the maximum is taken over all joint distributions of the inputs (not necessarily product distributions). Yao's principle can be used to prove lower bounds on the randomized communication complexity of a function: design an appropriate joint distribution, and prove a lower bound on the distributional complexity. Since distributional complexity concerns deterministic protocols, this can be easier than proving a lower bound on randomized protocols directly. As an example, consider the ''disjointness'' function DISJ: each of the inputs is interpreted as a subset of <math>\{1,\dots,n\}</math>, and DISJ({{mvar|x}},{{mvar|y}})=1 if the two sets are disjoint.
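To make the definitions concrete, here is a small illustrative Python sketch (not from the article's sources) of the DISJ function and of estimating the distributional success probability <math>\Pr[f(x,y) = R(x,y)]</math> for a toy zero-communication "protocol" under the uniform product distribution. The constant protocol and the choice of the uniform distribution are illustrative assumptions; they show why the maximum over distributions in Yao's principle matters, since an easy distribution can be handled by a trivial protocol.

```python
import random

n = 8  # small universe {1, ..., n} for the illustration

def disj(x, y):
    """DISJ(x, y) = 1 iff the subsets x, y of {1..n} are disjoint."""
    return 1 if not (x & y) else 0

def success_probability(protocol, samples=20_000, seed=0):
    """Estimate Pr[DISJ(x, y) = protocol(x, y)] when x, y are
    independent uniformly random subsets of {1..n}."""
    rng = random.Random(seed)
    universe = range(1, n + 1)
    hits = 0
    for _ in range(samples):
        x = {i for i in universe if rng.random() < 0.5}
        y = {i for i in universe if rng.random() < 0.5}
        hits += protocol(x, y) == disj(x, y)
    return hits / samples
```

Under the uniform product distribution, two random subsets are disjoint with probability <math>(3/4)^n</math> (each element avoids the intersection with probability 3/4), which is about 0.1 for <math>n=8</math>. So the constant protocol that always outputs 0 already achieves success probability around 0.9 with no communication at all, which is why a lower bound via Yao's principle requires a carefully chosen hard distribution.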
Razborov<ref>{{cite journal |last=Razborov |first=Alexander |author-link=Alexander Razborov |date=1992 |title=On the distributional complexity of disjointness |journal=Theoretical Computer Science |volume=106 |issue=2 |pages=385–390 |doi=10.1016/0304-3975(92)90260-M |doi-access=free }}</ref> proved an <math>\Omega(n)</math> lower bound on the randomized communication complexity by considering the following distribution: with probability 3/4, sample two random disjoint sets of size <math>n/4</math>, and with probability 1/4, sample two random sets of size <math>n/4</math> with a unique intersection.
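The sampling procedure described above can be sketched in Python as follows. This is a minimal sketch of the distribution as stated in this section (two random <math>n/4</math>-sets, disjoint with probability 3/4 and sharing exactly one element with probability 1/4); it does not attempt to reproduce the finer structure used in Razborov's proof.

```python
import random

def sample_razborov(n, rng=None):
    """Sample (x, y): with probability 3/4, two random disjoint
    n/4-subsets of {1..n}; with probability 1/4, two random
    n/4-subsets intersecting in exactly one element.
    Assumes n is divisible by 4."""
    rng = rng or random.Random()
    k = n // 4
    universe = list(range(1, n + 1))
    if rng.random() < 3 / 4:
        # Disjoint case: pick 2k distinct elements, split them.
        chosen = rng.sample(universe, 2 * k)
        return set(chosen[:k]), set(chosen[k:])
    # Intersecting case: one shared element plus disjoint remainders.
    chosen = rng.sample(universe, 2 * k - 1)
    shared = chosen[0]
    x = {shared} | set(chosen[1:k])
    y = {shared} | set(chosen[k:])
    return x, y
```

Note that the distribution is far from a product distribution: the two sets are heavily correlated (disjoint or almost disjoint by construction), which is consistent with Yao's principle allowing arbitrary joint distributions.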