## Outline of CMSC858C, Randomized Algorithms, Fall 2011

Instructor: Aravind Srinivasan
Class Venue and Time: CSI 3118, 12:30-1:45PM Tue, Thu

The following is an approximate outline of the main topics covered in CMSC858C, Fall 2011.

High-Level Picture:

• The power of randomized algorithms: abundance of witnesses and zero-sum games, applications of the probabilistic method (e.g., when there is no obvious best choice for a combinatorial structure), symmetry-breaking in networking and distributed computing, laws of large numbers & concentration (e.g., in sampling and streaming-type algorithms).
• Pseudorandom generators (ad hoc and rigorous), computing with weak random sources, "weakly-random" inputs as plausible explanations for the success of algorithms in practice; randomized algorithms can often be derandomized; the "P = BPP" conjecture.

Some Basic Concepts:
• Concavity and convexity, Jensen's inequality, Stirling's approximation, binomial coefficients and binomial sums, matroids and submodularity.
• Basic information theory: properties and applications of entropy and the Kullback-Leibler divergence.
• Basic facts about convex optimization and linear programming; existence of fractional solutions with good properties based on the rank of the tight constraints (e.g., flow decomposition); the polyhedral approach to discrete optimization.
• A simple "master method" for solving non-standard inequalities.
• (Finite) fields, polynomials, and Vandermonde matrices.
• The expectation: two interpretations and alternative formula; expectation vs. median; the linearity of expectation.
• The union bound and truncated inclusion-exclusion.
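As a small illustration of the linearity of expectation above, here is a minimal sketch (function name my own) estimating the number of fixed points of a uniformly random permutation. The n indicator variables are far from independent, yet linearity gives an expectation of exactly 1 for every n.

```python
import random

def expected_fixed_points(n, trials=20000):
    """Empirically estimate the expected number of fixed points of a
    uniformly random permutation of {0, ..., n-1}."""
    total = 0
    for _ in range(trials):
        perm = list(range(n))
        random.shuffle(perm)
        # Count positions i with perm[i] == i.
        total += sum(1 for i, v in enumerate(perm) if i == v)
    return total / trials

# By linearity of expectation, E[#fixed points] = n * (1/n) = 1 for every n,
# even though the indicators are dependent.
```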

Tail Bounds and Correlation Inequalities:
• Tail bounds: Markov, Chebyshev, Chebyshev-Cantelli, moment inequalities, Bellare-Rompel, Schmidt-Siegel-Srinivasan.
• The Bernstein-Chernoff-Hoeffding bounds and the role of convexity; approaches to "small" and "very large" deviations; lower-bounds on tail probabilities and the Poisson approximation.
• The simplifying role of the elementary symmetric polynomials: Schmidt-Siegel-Srinivasan.
• Random sampling (using "few" random bits).
• The Hajnal-Szemeredi theorem and application to tail bounds; Janson (2004)'s tail bounds parametrized by the fractional chromatic number of the dependency graph.
• Negative correlation and negative association; FKG; Dubhashi-Ranjan; tail bounds under negative correlation.
• Janson's lower-tail inequalities from the late 1980s.
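The flavor of the Bernstein-Chernoff-Hoeffding bounds above can be checked numerically; the sketch below (names my own) compares the empirical upper tail of a sum of fair coin flips against the Hoeffding-type bound P(S_n >= n/2 + a) <= exp(-2a^2/n).

```python
import math
import random

def empirical_upper_tail(n, t, trials=5000):
    """Fraction of trials in which the sum of n fair coin flips is >= t."""
    hits = sum(1 for _ in range(trials)
               if sum(random.getrandbits(1) for _ in range(n)) >= t)
    return hits / trials

def hoeffding_bound(n, t):
    """Hoeffding/Chernoff-type tail bound for n fair coins:
    P(S_n >= n/2 + a) <= exp(-2 a^2 / n), where a = t - n/2."""
    a = t - n / 2
    return math.exp(-2 * a * a / n)
```

For n = 100 and t = 60, the bound is exp(-2) ≈ 0.135, while the true tail probability is roughly 0.03; the bound is loose but decays at the right exponential rate.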

Derandomization and Computing with Weak Random Sources:
• The methods of conditional probabilities and pessimistic estimators.
• Advice strings and their explicit construction; BPP contained in P/poly.
• Randomized algorithms wherein a weak randomness property of the random source is sufficient:
• k-wise independence: connection to linear independence, constructions, lower bounds.
• Almost k-wise independence and Fourier analysis.
• Other applications of small sample spaces: hashing, document filtering, min-wise independence.
• Existence proofs for small sample spaces: probabilistic, or based on linear programming.
• Computing with weak random sources and connection to highly unbalanced "random-like" bipartite graphs.
• Brief discussion of deterministic amplification by random walks on expanders; (strong) randomness extractors and privacy amplification.
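The method of conditional probabilities listed above has a classic textbook instance: derandomizing the random-assignment algorithm for MAX-CUT. A uniformly random side assignment cuts each edge with probability 1/2, so the expected cut is m/2; fixing vertices one at a time while never letting the conditional expectation drop recovers this guarantee deterministically. A minimal sketch (names my own):

```python
def derandomized_max_cut(n, edges):
    """Method of conditional expectations for MAX-CUT on vertices 0..n-1.
    Returns a side assignment and a cut of size at least len(edges)/2."""
    side = {}
    for v in range(n):
        # For each side choice of v, count edges to already-fixed
        # neighbors that would be cut.
        gain = {0: 0, 1: 0}
        for (a, b) in edges:
            u = None
            if a == v and b in side:
                u = b
            elif b == v and a in side:
                u = a
            if u is not None:
                gain[1 - side[u]] += 1
        # Greedily keep the conditional expectation from decreasing.
        side[v] = 0 if gain[0] >= gain[1] else 1
    cut = sum(1 for (a, b) in edges if side[a] != side[b])
    return side, cut
```

Each edge is examined once, when its later endpoint is fixed, and at that moment at least half the edges to fixed neighbors are cut, so the final cut has at least m/2 edges.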

The Probabilistic Method and Random Graphs:
• See this very interesting blog post by Terry Tao on the crossing number, and a memorable quote from there: "However, there is an amazing (and unintuitive) principle in combinatorics which states that when there is no obvious "best" choice for some combinatorial object (such as a set of vertices to delete), then often trying a random choice will give a reasonable answer, if the notion of "random" is chosen carefully. (See this paper of Gowers for some further discussion of this principle.) The application of this principle is known as the probabilistic method, first introduced by Erdos."
• Graph Ramsey Theory, random graphs, and Ramsey graphs.
• Alteration and random permutations as integral to the probabilistic method.
• Probabilistic existence proofs via large-deviation bounds.
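The role of random permutations mentioned above is nicely captured by the Caro-Wei argument: order the vertices uniformly at random and keep every vertex that precedes all of its neighbors. The kept set is always independent, and its expected size is the sum of 1/(deg(v)+1). A minimal sketch (names my own, adjacency given as a dict of neighbor lists):

```python
import random

def random_permutation_independent_set(adj):
    """One run of the Caro-Wei random-permutation argument: keep each
    vertex that comes before all of its neighbors in a random order.
    The returned set is independent in every run; its expected size
    is sum over v of 1/(deg(v)+1)."""
    order = list(adj)
    random.shuffle(order)
    rank = {v: i for i, v in enumerate(order)}
    return {v for v in adj
            if all(rank[v] < rank[u] for u in adj[v])}
```

The vertex of minimum rank is always kept, so the set is nonempty, and no two adjacent vertices can both precede each other.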

Applications:
• Connections between the basic randomized complexity classes.
• Randomized Quicksort and selection.
• Hashing and communication complexity.
• Graphs and hypergraphs: dominating sets, independent sets, coloring, crossing numbers, simultaneously large girth and chromatic number.
• Symmetry-breaking and distributed algorithms: Luby's algorithm for (Δ + 1)-coloring (self-reading by students).
• Balls and bins.
• Randomized geometric algorithms and backwards analysis: smallest enclosing disks.
• Algorithmic game theory and economics: incentive compatibility, mechanism design and VCG, social choice functions, Arrow's Theorem and the Gibbard-Satterthwaite Theorem; auctions, coverage functions, and (approximately) truthful-in-expectation randomized mechanisms.
• Metric embeddings and distortion: Bourgain, Johnson-Lindenstrauss, and embeddings into tree-metrics as initiated by Bartal.
• Approximation algorithms, packing/covering, and fractional relaxations; deterministic vs. randomized rounding ("scale fractional solution up/down and keep many random variables close to their means simultaneously"): independent rounding with/without alteration, the addition of valid constraints.
• Various notions of dependent rounding, with connections to matroids and submodularity.
• The Lovász Local Lemma, its iterative ("nibble-like") application, and algorithmic aspects.
• The Isolating Lemma and matchings.
• PAC learning and the definition of the VC-dimension.
• The multiplicative-weights-update method: basic ideas and applications (including learning large-margin linear classifiers and Boosting).
• Introduction to parallel, online, and streaming computation, and to the role of randomness therein.
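Of the applications above, randomized selection admits the shortest self-contained illustration: a uniformly random pivot gives expected linear time. A minimal sketch (names my own):

```python
import random

def quickselect(xs, k):
    """Randomized selection: return the k-th smallest element
    (0-indexed) of xs, in expected O(n) time via a random pivot."""
    assert 0 <= k < len(xs)
    pivot = random.choice(xs)
    less = [x for x in xs if x < pivot]
    equal = [x for x in xs if x == pivot]
    if k < len(less):
        return quickselect(less, k)
    if k < len(less) + len(equal):
        return pivot
    greater = [x for x in xs if x > pivot]
    return quickselect(greater, k - len(less) - len(equal))
```

Since the pivot is uniform, the recursion shrinks the list by a constant factor in expectation, giving the expected O(n) bound regardless of the input order.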

Some key related topics not covered due to time constraints:
• Karger et al.'s approach to sampling cuts in graphs.
• Martingales.
• The "power of two choices" in balls-and-bins.
• The Goldreich-Levin Theorem.
• Color-coding.
• Property testing.
• Connections to statistical physics. ("The notion that Math Physics ideas, particularly percolation, are useful in studying computer generated random processes is something I myself picked up from Christian Borgs and Jennifer Chayes and I now recognize as an extremely powerful idea." -- Joel Spencer in his Potpourri.)
• Further applications in learning, networks, and distributed computing.
