Avi Wigderson: Randomness and pseudorandomness
Is the universe inherently deterministic or probabilistic? Perhaps more importantly, can we tell the difference between the two?
Humanity has pondered the meaning and utility of randomness for millennia.
There is a remarkable variety of ways in which we utilize perfect coin tosses to our advantage: in statistics, cryptography, game theory, algorithms, gambling... Indeed, randomness seems indispensable!
Which of these applications would survive if the universe had no randomness in it at all? Which would survive if only poor-quality randomness were available, e.g. randomness arising from "unpredictable" phenomena like the weather or the stock market?
Pseudorandomness is the study, by mathematicians and computer scientists, of deterministic structures which share some properties of random ones.
Understanding pseudorandom objects and constructing them efficiently leads to surprisingly positive answers to the questions above, namely that much can be done with poor-quality randomness, or even without any randomness at all. I plan to explain key aspects of this theory, and to mention some of Endre Szemerédi's contributions to pseudorandomness.
The talk is aimed at a general audience, and no particular background will be assumed.
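As a toy illustration (not from the abstract) of a deterministic structure sharing a property of random ones, consider a linear congruential generator: it is fully predictable, yet with suitable parameters one full period visits every residue exactly once, mimicking the uniform frequencies of true random draws. The parameters below are illustrative, chosen to satisfy the Hull–Dobell full-period conditions.

```python
def lcg(seed, n, a=5, c=3, m=16):
    """Linear congruential generator: x_{k+1} = (a*x_k + c) mod m.
    A classic toy pseudorandom sequence -- deterministic, but with
    full-period parameters it is perfectly equidistributed mod m."""
    out, x = [], seed
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x)
    return out

# With a=5, c=3, m=16 the Hull-Dobell conditions hold, so one full
# period of 16 steps visits every residue 0..15 exactly once.
period = lcg(seed=7, n=16)
```

The sequence shares one property of a random one (uniform frequencies) while failing others (it is entirely predictable); quantifying exactly which properties can be imitated deterministically is the business of pseudorandomness.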
Endre Szemerédi: In every chaos there is an order
The chaos and order will be defined relative to three problems.
1. Arithmetic progressions
This part is connected to a problem of Erdős and Turán from the 1930s. Motivated by the van der Waerden theorem, they asked whether the density version of that result also holds:
Is it true that every infinite sequence of integers of positive (lower) density contains arbitrarily long arithmetic progressions?
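In symbols (notation is the standard one, not from the abstract), the question asks whether positive lower density alone forces progressions of every length:

```latex
% A \subseteq \mathbb{N}; \underline{d}(A) is the lower asymptotic density of A.
\underline{d}(A) \;=\; \liminf_{n \to \infty} \frac{|A \cap \{1,\dots,n\}|}{n} \;>\; 0
\;\stackrel{?}{\Longrightarrow}\;
A \text{ contains a $k$-term arithmetic progression for every } k \ge 1.
```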
The first result in this direction was due to K. F. Roth, who proved that any sequence of integers of positive (lower) density contains a three-term arithmetic progression.
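Roth's statement concerns infinite sets, but the finitary phenomenon is easy to experiment with. The following is an illustrative brute-force sketch (nothing like Roth's actual argument), together with the classical example of a progression-free set:

```python
def has_three_term_ap(a):
    """Brute-force test: does the set contain x, x+d, x+2d with d > 0?"""
    s = set(a)
    elems = sorted(s)
    for i, x in enumerate(elems):
        for y in elems[i + 1:]:
            # x < y determine the candidate third term y + (y - x)
            if 2 * y - x in s:
                return True
    return False

def base3_digits(n):
    """Digits of n in base 3, least significant first."""
    digits = []
    while n:
        digits.append(n % 3)
        n //= 3
    return digits

# Classical progression-free set: integers whose base-3 expansion
# uses only the digits 0 and 1 contain no 3-term progression.
ap_free = [n for n in range(1, 200) if 2 not in base3_digits(n)]
```

Such progression-free sets exist, but by Roth's theorem their density must tend to zero.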
I will give a short history of the generalizations of Roth's result and explain some ideas behind the "easiest" proof.
2. Long arithmetic progressions in subset sums
I will give an exact bound for the size of the longest arithmetic progression in subset sums. In addition, I shall describe the structure of the subset sums and give applications in number theory and probability theory.
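To make the objects concrete, here is an illustrative brute-force sketch (the exact bounds of the talk are far beyond such experiments): compute all subset sums of a set and measure the longest arithmetic progression they contain.

```python
def subset_sums(a):
    """All sums of subsets of `a`, including the empty sum 0."""
    sums = {0}
    for x in a:
        sums |= {s + x for s in sums}
    return sums

def longest_ap_in(s):
    """Length of the longest arithmetic progression contained in the
    finite set `s` (brute force over start and common difference)."""
    elems = sorted(s)
    best = 1 if elems else 0
    for start in elems:
        for step in range(1, elems[-1] - start + 1):
            length = 1
            while start + length * step in s:
                length += 1
            best = max(best, length)
    return best

# Example: the subset sums of {1, 2, ..., k} fill the whole interval
# [0, k(k+1)/2], a single long arithmetic progression of step 1.
```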
3. Embedding sparse graphs into large graphs
László Lovász: The many facets of the Regularity Lemma
The Regularity Lemma of Szemerédi, first obtained in the context of his theorem on arithmetic progressions in dense sequences, has become one of the most important and most powerful tools in graph theory. It is basic in extremal graph theory and in the theory of property testing. Weaker versions with better bounds (Frieze and Kannan) and stronger versions (Alon, Fischer, Krivelevich and Szegedy) have been proved and used. However, its significance goes way beyond graph theory: it can be viewed as a statement in approximation theory, as a compactness result for the completion of the space of finite graphs, as a result about the dimensionality of a metric space associated with a graph, and as a statement in information theory. It serves as the archetypal example of the dichotomy between structure and randomness, as pointed out by Tao. Its extensions to hypergraphs, a difficult problem solved by Gowers and by Rödl, Skokan and Schacht, connect with higher-order Fourier analysis.
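The lemma's basic notions can be sketched in code (the definitions are the standard ones; the implementation is an illustrative brute force, usable only on tiny graphs): the edge density d(X, Y) between two vertex sets, and the ε-regularity condition that all large sub-pairs have nearly the same density.

```python
import math
from itertools import combinations

def density(edges, X, Y):
    """Edge density d(X, Y) = e(X, Y) / (|X||Y|) between disjoint vertex sets."""
    e = sum(1 for x in X for y in Y if (x, y) in edges or (y, x) in edges)
    return e / (len(X) * len(Y))

def is_eps_regular(edges, X, Y, eps):
    """Brute-force eps-regularity check: every pair of subsets X' of X,
    Y' of Y with |X'| >= eps|X| and |Y'| >= eps|Y| must satisfy
    |d(X', Y') - d(X, Y)| <= eps.  Exponential in |X| + |Y|."""
    d = density(edges, X, Y)
    min_x = max(1, math.ceil(eps * len(X)))
    min_y = max(1, math.ceil(eps * len(Y)))
    for i in range(min_x, len(X) + 1):
        for Xp in combinations(X, i):
            for j in range(min_y, len(Y) + 1):
                for Yp in combinations(Y, j):
                    if abs(density(edges, Xp, Yp) - d) > eps:
                        return False
    return True
```

In these terms, the lemma says that the vertex set of any graph can be partitioned into a bounded number of parts so that almost all pairs of parts are ε-regular.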
Timothy Gowers: The afterlife of Szemerédi’s theorem
Szemerédi's theorem asserts that every set of integers of positive upper density contains arbitrarily long arithmetic progressions. This result has been extraordinarily influential, partly because of the tools that Szemerédi introduced in order to prove it, and partly because subsequent efforts to understand the result more fully have led to progress in many other areas of mathematics, including combinatorics, ergodic theory, harmonic analysis and number theory. I shall discuss some of these later developments, to which Szemerédi himself made several essential contributions.