Prologue

Project CZ.1.07/2.2.00/28.0041 - Centre of interactive and multimedia study materials for innovative teaching and effective learning, co-funded by the European Social Fund and the Ministry of Education, Youth and Sports of the Czech Republic (Operational Programme Education for Competitiveness - Investments in Education Development).

"You should spend most of your time thinking about what you should think about most of your time."

RANDOMIZED ALGORITHMS AND PROTOCOLS - 2020

WEB PAGE of the LECTURE
RANDOMIZED ALGORITHMS AND PROTOCOLS - 2020
Prof. Jozef Gruska, DrSc
Wednesday, 10.00-11.40, B410
http://www.fi.muni.cz/usr/gruska/random20

FINAL EXAM: You need to answer four questions out of five given to you.
CREDIT (ZÁPOČET): You need to answer three questions out of five given to you.

EXERCISES/TUTORIALS
EXERCISES/TUTORIALS: Thursdays 14.00-15.40, C525
TEACHER: RNDr. Matej Pivoluška, PhD
LANGUAGE: English
NOTE: Exercises/tutorials are not obligatory.

CONTENTS - preliminary
Basic concepts and examples of randomized algorithms
Types and basic design methods for randomized algorithms
Basics of probability theory
Simple methods for design of randomized algorithms
Game theory and analysis of randomized algorithms
Basic techniques I: moments and deviations
Basic techniques II: tail probabilities inequalities
Probabilistic method I
Markov chains - random walks
Algebraic techniques - fingerprinting
Fooling the adversary - examples
Randomized cryptographic protocols
Randomized proofs
Probabilistic method II
Quantum algorithms

LITERATURE
R. Motwani, P. Raghavan: Randomized Algorithms, Cambridge University Press, UK, 1995
J. Gruska: Foundations of Computing, International Thomson Computer Press, USA, 715 pages, 1997
J. Hromkovič: Design and Analysis of Randomized Algorithms, Springer, 275 pages, 2005
N. Alon, J. H. Spencer: The Probabilistic Method, Wiley-Interscience, 2008

Part I

Chapter 4. SIMPLE METHODS for DESIGN of RANDOMIZED ALGORITHMS

PROLOGUE
In this chapter we present a new way to see randomized algorithms and an application of some simple basic techniques for the design of randomized algorithms. Especially we deal with:
- a way to see the basics of deterministic, randomized and quantum computations and their differences - a unified approach to deterministic, randomized and quantum algorithms;
- application of the linearity of expectations method;
- design of randomized algorithms for game trees.

MATHEMATICAL VIEWS of COMPUTATION 1/3
Let us consider a set of n-bit strings S ⊆ {0,1}^n. To describe a deterministic computation on S we need to specify:
- an initial state - an n-bit string, say s_0;
- an evolution (computation) mapping E : S → S, which can be described by a vector of length 2^n, the elements and indices of which are n-bit strings.
A computation step is then an application of the evolution mapping E to the current state, represented by an n-bit string s.
However, for any at least a bit significant task, the number of bits needed to describe such an evolution mapping, n·2^n, is much too big. The task of programming is therefore to replace an application of such an enormously huge mapping by an application of a much shorter circuit/program.

MATHEMATICAL VIEWS of COMPUTATION 2/3
To describe a randomized computation we need:
1) to specify an initial probability distribution on all n-bit strings; that can be done by a vector of length 2^n, indexed by n-bit strings, the elements of which are non-negative numbers that sum up to 1;
2) to specify a randomized evolution, which has to be done, in the case of a homogeneous evolution, by a 2^n × 2^n matrix A of conditional probabilities of obtaining a new state/string from an old state/string.
The matrix A has to be stochastic - all columns have to sum up to one - and A[i,j] is the probability of going from the string representing j to the string representing i.
To perform a computation step, one then needs to multiply the 2^n-element vector specifying the current probability distribution on the 2^n states by the matrix A.
However, for any nontrivial problem the number 2^n is larger than the number of particles in the universe. Therefore, the task of programming is to design a small circuit/program that can implement such a multiplication by a matrix of an enormous size.
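To make the matrix view above concrete, here is a minimal sketch (my own illustration, not part of the lecture) of one step of a randomized computation for n = 2; the particular stochastic matrix A and the starting distribution are made-up values.

```python
# One step of a randomized computation for n = 2, i.e. 2^n = 4 states.
# Column j of the stochastic matrix A holds the probabilities of moving
# from state j to each state i, so every column sums to 1.

STATES = ["00", "01", "10", "11"]

# A[i][j] = probability of going from state j to state i (illustrative values).
A = [
    [0.5, 0.0, 0.25, 0.0],
    [0.5, 0.5, 0.25, 0.0],
    [0.0, 0.5, 0.25, 1.0],
    [0.0, 0.0, 0.25, 0.0],
]

# Initial distribution: the computation starts deterministically in state 00.
p = [1.0, 0.0, 0.0, 0.0]

def step(A, p):
    """One computation step: multiply the current distribution vector p by A."""
    return [sum(A[i][j] * p[j] for j in range(len(p))) for i in range(len(A))]

p = step(A, p)
for state, prob in zip(STATES, p):
    print(state, prob)   # the new entries are non-negative and sum up to 1
```

A real randomized circuit/program realizes exactly such a multiplication without ever writing the 2^n × 2^n matrix down.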
MATHEMATICAL VIEWS of COMPUTATION 3/3
In the case of a quantum computation on n quantum bits:
1) The initial state has to be given by a 2^n vector of complex numbers (probability amplitudes), the sum of the squares of whose absolute values is one.
2) A homogeneous quantum evolution has to be described by a 2^n × 2^n unitary matrix of complex numbers - one in which the inner products of any two different columns and of any two different rows are 0. [1]
Concerning a computation step, this is again a multiplication of the vector of probability amplitudes representing the current state by a very huge 2^n × 2^n unitary matrix, which has to be realized by a "small" quantum circuit (program).

[1] A matrix A is usually called unitary if its inverse matrix can be obtained from A by transposition around the main diagonal and replacement of each element by its complex conjugate.

LINEARITY OF EXPECTATIONS
A very simple, but very often very useful, fact is that for any random variables X_1, X_2, ... it holds

E[Σ_i X_i] = Σ_i E[X_i],

even if the X_i are dependent and the dependencies among the X_i are very complex.

Example: A ship arrives at a port, and all 40 sailors on board go ashore to have fun. At night, all sailors return to the ship and, being drunk, each chooses randomly a cabin to sleep in. Now comes the question: What is the expected number of sailors sleeping in their own cabins?

Solution: Let X_i be a random variable (a so-called indicator variable) which has value 1 if the i-th sailor chooses his own cabin, and 0 otherwise. The expected number of sailors who get to their own cabin is

E[Σ_{i=1}^{40} X_i] = Σ_{i=1}^{40} E[X_i].

Since cabins are chosen randomly, E[X_i] = 1/40 and E[Σ_{i=1}^{40} X_i] = 40 · (1/40) = 1.
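The sailors example is easy to check empirically. The following sketch (my own illustration, not from the lecture) simulates the random choice of cabins many times; by linearity of expectations the printed average should be close to 1. The number of trials is an arbitrary choice.

```python
import random

SAILORS = 40
TRIALS = 100_000  # illustrative choice

def sailors_in_own_cabin() -> int:
    """One night: sailor i picks a cabin uniformly at random; count the matches."""
    return sum(1 for i in range(SAILORS) if random.randrange(SAILORS) == i)

# Average over many simulated nights; linearity of expectations predicts 1.
# (The prediction would be the same if the sailors ended up in distinct cabins,
# i.e. a random permutation, even though the X_i are then dependent.)
average = sum(sailors_in_own_cabin() for _ in range(TRIALS)) / TRIALS
print(f"average number of sailors in their own cabin: {average:.3f}")
```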
EXAMPLE - BINARY PARTITION of a SET of LINE SEGMENTS 1/3
Problem: Given a set S = {s_1, ..., s_n} of non-intersecting line segments, find a partition of the plane such that every region will contain at most one line segment (or at most a part of one line segment).

(Figure: a small example with segments s_1, s_2, s_3 and a partitioning line L_1.)

A (binary) partition will be described by a binary tree plus additional information (about nodes). With each node v a region r_v of the plane will be associated (the whole plane will be represented by the root), and also a line L_v intersecting r_v. Each line L_v will partition the region r_v into two regions r_{l,v} and r_{r,v} which correspond to the two children of v - to the left and the right one.

EXAMPLE - BINARY PARTITION of a SET of LINE SEGMENTS 2/3
Notation: l(s_i) will denote the line-extension of the segment s_i.
Autopartitions will use only line-extensions of the given segments.

Algorithm RandAuto:
Input: A set S = {s_1, ..., s_n} of non-intersecting line segments.
Output: A binary autopartition P_π of S.
1: Pick a permutation π of {1, ..., n} uniformly at random.
2: While there is a region R that contains more than one segment, cut it with l(s_i), where i is the first element in the ordering induced by π such that l(s_i) cuts the region R.

Theorem: The expected size of the autopartition P_π of S produced by the above RandAuto algorithm is O(n ln n).

Proof: Notation (for line segments u, v):
index(u, v) = i if l(u) intersects i - 1 segments before hitting v; index(u, v) = ∞ if l(u) does not hit v.
u ⊣ v will denote the event that l(u) cuts v in the constructed (autopartition) tree.

EXAMPLE - BINARY PARTITION of a SET of LINE SEGMENTS 3/3
Probability: Let u and v be segments, let index(u, v) = i, and let u_1, ..., u_{i-1} be the segments the line l(u) intersects before hitting v. The event u ⊣ v happens, during an execution of RandAuto, only if u occurs before any of {u_1, ..., u_{i-1}, v} in the permutation π. Therefore the probability that the event u ⊣ v happens is at most 1/(i + 1) = 1/(index(u, v) + 1).

Notation: Let C_{u,v} be the indicator variable that has value 1 if u ⊣ v and 0 otherwise. Then

E[C_{u,v}] = Pr[u ⊣ v] ≤ 1/(index(u, v) + 1).

Clearly, the size of the created partition P_π equals n plus the number of intersections due to cuts. Its expected value is therefore

n + E[Σ_u Σ_{v≠u} C_{u,v}] = n + Σ_u Σ_{v≠u} E[C_{u,v}] ≤ n + Σ_u Σ_{v≠u} 1/(index(u, v) + 1).

For any line segment u and any integer i there are at most two segments v, w such that index(u, v) = index(u, w) = i. Hence

Σ_{v≠u} 1/(index(u, v) + 1) ≤ Σ_{i=1}^{n-1} 2/(i + 1),

and therefore

n + E[Σ_u Σ_{v≠u} C_{u,v}] ≤ n + Σ_u Σ_{i=1}^{n-1} 2/(i + 1) ≤ n + 2nH_n = O(n ln n).
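The key probabilistic step of this proof can be checked by a small experiment (my own illustration, not from the lecture): if index(u, v) = i, the event u ⊣ v requires u to precede the i segments u_1, ..., u_{i-1}, v in the random permutation π, which happens with probability exactly 1/(i + 1). The segment names and the number of padding segments below are arbitrary.

```python
import random

def estimate_cut_probability(i: int, extra: int = 5, trials: int = 200_000) -> float:
    """Estimate the probability that u comes before all of u_1, ..., u_{i-1}, v
    in a uniformly random permutation; the proof bounds Pr[u -| v] by this value.
    The 'extra' segments are irrelevant to the event and only pad the permutation."""
    segments = (["u"] + [f"u{j}" for j in range(1, i)] + ["v"]
                + [f"w{j}" for j in range(extra)])
    relevant = {"u", "v"} | {f"u{j}" for j in range(1, i)}
    hits = 0
    for _ in range(trials):
        random.shuffle(segments)
        first = next(s for s in segments if s in relevant)
        hits += (first == "u")
    return hits / trials

i = 4
print("estimated probability:", estimate_cut_probability(i))
print("1/(i+1)              :", 1 / (i + 1))
```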
GAME TREE EVALUATION
Game trees are trees with the operations max and min alternating in internal nodes and with values assigned to their leaves. In case all such values are Boolean - 0 or 1 - the Boolean operations OR and AND are considered instead of max and min.

(Figure: a game tree of depth 4 with leaves x_1, ..., x_8 and y_1, ..., y_8.)

T_k - the binary game tree of depth 2k.
The goal is to evaluate the tree - that is, its root.

GAME TREE EVALUATION - II. WORST CASE COMPLEXITY
T_k will denote the binary game tree of depth 2k.
Evaluation of game trees plays a crucial role in AI, in various game-playing programs.
Assumption: An evaluation algorithm chooses at each step (somehow) a leaf, reads its value and performs all evaluations of internal nodes it can perform. The cost of an evaluation algorithm is the number of leaves inspected. The task is to determine the total number of such steps needed.
Every deterministic algorithm can be forced to inspect all leaves. The worst-case complexity of a deterministic algorithm to evaluate T_k is therefore n = 4^k = 2^{2k}.

A RANDOMIZED ALGORITHM - BASIC IDEA
To evaluate an AND-node v, the algorithm chooses randomly one of its children and evaluates it. If 1 is returned, the algorithm proceeds to evaluate the other child's subtree and returns as the value of v the value of that subtree. If 0 is returned, the algorithm returns immediately 0 for v (without evaluating the other subtree).
To evaluate an OR-node v, the algorithm chooses randomly one of its children and evaluates it. If 0 is returned, the algorithm proceeds to evaluate the other subtree and returns as the value of v the value of that subtree. If 1 is returned, the algorithm returns 1 for v.

RANDOMIZED ALGORITHMS - SUMMARY of THE BASIC IDEA
Start at the root and, in order to evaluate a node, evaluate (recursively) a random child of the current node. If this does not determine the value of the current node, evaluate the other child of the node.

Theorem: Given any instance of T_k, the expected number of steps of the above randomized algorithm is at most 3^k.

Proof by induction:
Base step: The case k = 1 is easy - verify by computation for all possible inputs.
Inductive step: Assume that the expected cost of the evaluation of any instance of T_{k-1} is at most 3^{k-1}.
Consider first an OR-node tree T whose both children are T_{k-1}-trees.
If the root of T were to return 1, at least one of its T_{k-1}-subtrees has to return 1. With probability 1/2 this child is chosen first, giving on average at most 3^{k-1} leaf-evaluations; with probability 1/2 both subtrees have to be evaluated. The expected cost of determining the value of T is therefore at most

(1/2) × 3^{k-1} + (1/2) × 2 × 3^{k-1} = (3/2) × 3^{k-1}.

If the root of T were to return 0, both subtrees have to be evaluated, giving the cost at most 2 × 3^{k-1}.

Consider now the root of T_k, an AND-node whose two children are OR-node trees T as above.
If the root evaluates to 1, both of its OR-subtrees have to evaluate to 1. The expected cost is therefore at most

2 × (3/2) × 3^{k-1} = 3^k.

If the root evaluates to 0, at least one of the OR-subtrees evaluates to 0. With probability 1/2 such a subtree is chosen first, costing at most 2 × 3^{k-1}; otherwise both subtrees are evaluated, costing at most (3/2) × 3^{k-1} + 2 × 3^{k-1}. The expected cost is therefore at most

(1/2) × 2 × 3^{k-1} + (1/2) × (3/2 + 2) × 3^{k-1} = (11/4) × 3^{k-1} ≤ 3^k.

Since n = 4^k, we have 3^k = n^{log_4 3} ≈ n^{0.793}.

Our algorithm is therefore a Las Vegas algorithm. Its expected running time (number of leaf evaluations) is n^{0.793}.
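A minimal sketch (my own illustration, not from the lecture) of the randomized evaluation strategy just analyzed, for Boolean game trees; the nested-tuple tree representation and the name evaluate are illustrative choices.

```python
import random

# Randomized evaluation of an AND/OR game tree: at each internal node a random
# child is evaluated first, and the second child is evaluated only if the first
# one does not already decide the node's value (0 decides AND, 1 decides OR).
# A tree is either a leaf value (0 or 1) or a tuple (op, left, right).

def evaluate(tree):
    if tree in (0, 1):                                  # a leaf
        return tree
    op, left, right = tree
    first, second = random.sample([left, right], 2)     # random evaluation order
    value = evaluate(first)
    if (op == "AND" and value == 0) or (op == "OR" and value == 1):
        return value                                    # short-circuit
    return evaluate(second)

# T_1: an AND-node over two OR-nodes with four leaves.
t1 = ("AND", ("OR", 1, 0), ("OR", 0, 1))
print(evaluate(t1))   # always prints 1; the number of inspected leaves varies
```

The random choice of evaluation order is what lets the algorithm escape the worst case that every fixed deterministic order can be forced into.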
APPENDIX
The concept of the number of wisdom, introduced in the following, and related results helped to show that randomness is deeply rooted even in arithmetic.
In order to define numbers of wisdom, the concept of self-delimiting programs is needed.
A program, represented by a binary word p, is self-delimiting for a computer C if, for any input pw, the computer C can recognize where p ends after reading p only.
Another way to see self-delimiting programs is to consider only such programming languages L that no program in L is a prefix of another program in L.

Ω - numbers of wisdom
For a universal computer C with only self-delimiting programs, the number of wisdom Ω_C is the probability that a randomly constructed program for C halts. More formally,

Ω_C = Σ_{p halts} 2^{-|p|},

where the p are the (self-delimiting) halting programs for C.
Ω_C is therefore the probability that a self-delimiting computer program for C, generated at random by choosing each of its bits using an independent toss of a fair coin, will eventually halt.

Properties of numbers of wisdom
■ 0 < Ω_C < 1.
■ Ω_C is an uncomputable and random real number.
■ An at least n-bits-long theory is needed to determine n bits of Ω_C.
■ An at least n-bits-long program is needed to determine n bits of Ω_C.
■ Bits of Ω can be seen as mathematical facts that are true for no reason.

■ Greg Chaitin, who introduced numbers of wisdom, designed a specific universal computer C and a two-hundred-pages-long Diophantine equation E, with 17,000 variables and with one parameter k, such that for a given k the equation E has a finite (respectively infinite) number of solutions if and only if the k-th bit of Ω_C is 0 (respectively 1). As a consequence, we have that randomness, unpredictability and uncertainty occur even in the theory of Diophantine equations of elementary arithmetic.
■ Knowing the value of Ω_C with n bits of precision allows one to decide which programs for C with at most n bits halt.
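A purely illustrative sketch (not from the lecture): although Ω_C itself is uncomputable, the defining sum can be bounded from below whenever we are handed a finite prefix-free set of programs that are known to halt. The bit strings below are made up and stand for such hypothetical halting programs of a hypothetical computer C.

```python
from fractions import Fraction

# Toy illustration of the sum Omega_C = sum over halting programs p of 2^(-|p|).
# We only add up the contributions of a made-up, finite, prefix-free set of
# programs assumed to halt, which yields a lower bound on Omega_C.
halting_programs = ["0", "10", "1100", "1101"]   # hypothetical self-delimiting programs

# Sanity check: no program is a prefix of another (the prefix-free property).
assert not any(p != q and q.startswith(p)
               for p in halting_programs for q in halting_programs)

lower_bound = sum(Fraction(1, 2 ** len(p)) for p in halting_programs)
print("lower bound on Omega_C:", lower_bound, "=", float(lower_bound))
# By Kraft's inequality such a sum over a prefix-free set never exceeds 1.
```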