Functional inequalities

From Boolean Zoo

== General ==

Boolean functions are, as their name suggests, functions, and many results from functional analysis and geometry carry over to the Boolean setting. That is, one can relate various integrals, derivatives, and point evaluations of a function to one another, resulting in a functional inequality. Quantities which tend to recur are the expectation, variance, and [[influence]] of the function.

== Isoperimetric inequality ==

=== The precise statement ===

Let <math>f</math> be a Boolean function on <math>n</math> bits, and suppose that the support <math>A</math> of <math>f</math> has size <math>m</math>. Then the total influence of <math>f</math> (which, up to normalization, counts the number of edges from <math>A</math> to <math>\{-1,1\}^n \backslash A</math>) is at least the total influence of the function whose support consists of the <math>m</math> smallest elements of <math>\{-1,1\}^n</math> in lexicographic order. In other words, among all functions with support of size <math>m</math>, an initial segment of the lexicographic order has the smallest total influence.
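
As an illustration (a sanity check, not part of the theorem): when <math>m = 2^{n-1}</math>, the first <math>m</math> strings in lexicographic order are exactly those whose first coordinate takes the smaller of the two values, i.e. a subcube of codimension one. Its edge boundary consists of the <math>2^{n-1}</math> edges in direction <math>1</math>, and by the theorem no other support of size <math>2^{n-1}</math> yields a smaller total influence.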

=== An approximation ===

A good approximation to the above theorem is the following '''isoperimetric inequality'''. Denoting by <math>\mu</math> the uniform measure on <math>\{-1,1\}^n</math>,

<math>\mathrm{Inf}(f) \geq 2 \mu(A) \log_2\left(\frac{1}{\mu(A)}\right) = 2\frac{|A|}{2^n}\log_2\left(\frac{2^n}{|A|}\right)</math>.

Equality is attained only when the support of <math>f</math> is a [[cube|subcube]].
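
As a quick check of the equality case (using the convention <math>\mathrm{Inf}(f) = \sum_i \Pr[f(x) \neq f(x^{\oplus i})]</math>, which is consistent with the formula above): if <math>A</math> is a subcube of codimension <math>k</math>, then <math>\mu(A) = 2^{-k}</math>, each of the <math>k</math> fixed coordinates has influence <math>2^{-(k-1)}</math> and every other coordinate has influence <math>0</math>, so <math>\mathrm{Inf}(f) = k \cdot 2^{-(k-1)} = 2\mu(A)\log_2\left(\frac{1}{\mu(A)}\right)</math>.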

== Poincaré inequality ==

Let <math>f</math> be a Boolean function on <math>n</math> bits. Then

<math>\mathrm{Var}(f) \leq 2 \sum_{i=1}^n \mathrm{Inf}_i(f)</math>.
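
For example (assuming <math>f</math> is <math>\pm 1</math>-valued and <math>\mathrm{Inf}_i(f) = \Pr[f(x) \neq f(x^{\oplus i})]</math>): the dictator <math>f(x) = x_1</math> has <math>\mathrm{Var}(f) = 1</math> and <math>\sum_i \mathrm{Inf}_i(f) = 1</math>, so the inequality is tight up to the constant, while the parity <math>f(x) = x_1 x_2 \cdots x_n</math> has <math>\mathrm{Var}(f) = 1</math> but <math>\sum_i \mathrm{Inf}_i(f) = n</math>, so the inequality can be far from tight.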

== KKL inequality ==

=== Maximum influence formulation ===

Let <math>f</math> be a Boolean function on <math>n</math> bits. Then there exists an input bit with large influence:

<math>\max_i \mathrm{Inf}_i(f) \geq C\cdot \mathrm{Var}(f) \frac{\log(n)}{n}</math>

for some universal constant <math>C > 0</math>.
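
A standard example showing that the <math>\frac{\log(n)}{n}</math> factor cannot be improved is the tribes function of Ben-Or and Linial (an OR of ANDs over disjoint blocks of size roughly <math>\log_2 n - \log_2 \ln n</math>): its variance is bounded away from zero while every coordinate has influence <math>\Theta\!\left(\frac{\log n}{n}\right)</math>.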

=== Influence inequality formulation ===

This inequality generalizes Poincaré's inequality. Let <math>f</math> be a Boolean function on <math>n</math> bits. Then

<math>\mathrm{Var}(f) \leq C \frac{\sum_i \mathrm{Inf}_i(f)}{\log \left(1 / \max_i \mathrm{Inf}_i(f) \right)}</math>

for some universal constant <math>C > 0</math>.
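
To see how this formulation yields the maximum influence formulation (a sketch, using natural logarithms and assuming <math>f</math> is <math>\pm 1</math>-valued, so that <math>\mathrm{Var}(f) \leq 1</math>): write <math>t = \max_i \mathrm{Inf}_i(f)</math>. If <math>t \geq 1/\sqrt{n}</math> then <math>t \geq \frac{\log n}{n} \geq \mathrm{Var}(f)\frac{\log n}{n}</math> directly. Otherwise <math>\log(1/t) > \frac{1}{2}\log n</math>, and since <math>\sum_i \mathrm{Inf}_i(f) \leq nt</math> the inequality above gives <math>t \geq \frac{\mathrm{Var}(f)\log(1/t)}{Cn} > \frac{\mathrm{Var}(f)\log n}{2Cn}</math>.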

== Talagrand's influence inequality ==

This inequality generalizes the KKL inequality. Let <math>f</math> be a Boolean function on <math>n</math> bits. Then

<math>\mathrm{Var}(f) \leq C \sum_{i=1}^n\frac{\mathrm{Inf}_i(f)}{1+\log \left(1 / \mathrm{Inf}_i(f) \right)}</math>

for some universal constant <math>C > 0</math>.
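
To see the claimed generalization (a one-line check, assuming <math>\max_j \mathrm{Inf}_j(f) < 1</math>): since <math>\mathrm{Inf}_i(f) \leq \max_j \mathrm{Inf}_j(f)</math> for every <math>i</math>, we have <math>1 + \log\left(1/\mathrm{Inf}_i(f)\right) \geq \log\left(1/\max_j \mathrm{Inf}_j(f)\right)</math>, and comparing the sums term by term recovers the influence inequality formulation of KKL; using instead <math>1 + \log\left(1/\mathrm{Inf}_i(f)\right) \geq 1</math> recovers Poincaré's inequality up to the value of the constant.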

== Talagrand's surface area inequality ==

Let <math>f</math> be a Boolean function on <math>n</math> bits, and let <math>h(x)</math> be the local [[sensitivity]] of <math>f</math> at <math>x</math> (the number of neighbors of <math>x</math> on which <math>f</math> obtains a different value than <math>f(x)</math>). Then

<math>\int \sqrt{h(x)}\, d\mu(x) \geq C \cdot \mathrm{Var}(f) \left(\log \left(2 + \frac{e}{\sum_i \mathrm{Inf}_i(f)^2} \right) \right)^{1/2}</math>

for some universal constant <math>C > 0</math>.
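
Two remarks that may help parse the statement (using the convention <math>\mathrm{Inf}_i(f) = \Pr[f(x) \neq f(x^{\oplus i})]</math>): first, <math>\int h(x)\, d\mu(x) = \sum_i \mathrm{Inf}_i(f)</math>, so by the Cauchy–Schwarz inequality the left-hand side is at most <math>\left(\sum_i \mathrm{Inf}_i(f)\right)^{1/2}</math>. Second, for the <math>\pm 1</math>-valued dictator <math>f(x) = x_1</math> one has <math>h \equiv 1</math>, <math>\mathrm{Var}(f) = 1</math> and <math>\sum_i \mathrm{Inf}_i(f)^2 = 1</math>, so the inequality reads <math>1 \geq C\left(\log(2+e)\right)^{1/2}</math>, which constrains how large the universal constant <math>C</math> can be.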