Probability and Computing
Course
- T-755 PRCO
Description:
Greatly expanded, this new edition requires only an elementary background in discrete mathematics and offers a comprehensive introduction to the role of randomization and probabilistic techniques in modern computer science. Newly added chapters and sections cover topics including normal distributions, sample complexity, VC dimension, Rademacher complexity, power laws and related distributions, cuckoo hashing, and the Lovász Local Lemma.
Material relevant to machine learning and big data analysis enables students to learn modern techniques and applications. Among the many new exercises and examples are programming-related exercises that provide students with excellent training in solving relevant problems. This book provides an indispensable teaching tool to accompany a one- or two-semester course for advanced undergraduate students in computer science and applied mathematics.
Other details
- Authors: Michael Mitzenmacher, Eli Upfal
- Edition: 2
- Publication date: 2017-07-03
- No printing restrictions
- No copying restrictions
- Format: ePub
- ISBN 13: 9781108105958
- Print ISBN: 9781107154889
- ISBN 10: 1108105955
Table of Contents
- Cover
- Half title
- Title
- Copyright
- Dedication
- Table of Contents
- Preface to the Second Edition
- Preface to the First Edition
- 1 Events and Probability
- 1.1 Application: Verifying Polynomial Identities
- 1.2 Axioms of Probability
- 1.3 Application: Verifying Matrix Multiplication
- 1.4 Application: Naïve Bayesian Classifier
- 1.5 Application: A Randomized Min-Cut Algorithm
- 1.6 Exercises
- 2 Discrete Random Variables and Expectation
- 2.1 Random Variables and Expectation
- 2.1.1 Linearity of Expectations
- 2.1.2 Jensen’s Inequality
- 2.2 The Bernoulli and Binomial Random Variables
- 2.3 Conditional Expectation
- 2.4 The Geometric Distribution
- 2.4.1 Example: Coupon Collector’s Problem
- 2.5 Application: The Expected Run-Time of Quicksort
- 2.6 Exercises
- 3 Moments and Deviations
- 3.1 Markov’s Inequality
- 3.2 Variance and Moments of a Random Variable
- 3.2.1 Example: Variance of a Binomial Random Variable
- 3.3 Chebyshev’s Inequality
- 3.3.1 Example: Coupon Collector’s Problem
- 3.4 Median and Mean
- 3.5 Application: A Randomized Algorithm for Computing the Median
- 3.5.1 The Algorithm
- 3.5.2 Analysis of the Algorithm
- 3.6 Exercises
- 4 Chernoff and Hoeffding Bounds
- 4.1 Moment Generating Functions
- 4.2 Deriving and Applying Chernoff Bounds
- 4.2.1 Chernoff Bounds for the Sum of Poisson Trials
- 4.2.2 Example: Coin Flips
- 4.2.3 Application: Estimating a Parameter
- 4.3 Better Bounds for Some Special Cases
- 4.4 Application: Set Balancing
- 4.5 The Hoeffding Bound
- 4.6* Application: Packet Routing in Sparse Networks
- 4.6.1 Permutation Routing on the Hypercube
- 4.6.2 Permutation Routing on the Butterfly
- 4.7 Exercises
- 5 Balls, Bins, and Random Graphs
- 5.1 Example: The Birthday Paradox
- 5.2 Balls into Bins
- 5.2.1 The Balls-and-Bins Model
- 5.2.2 Application: Bucket Sort
- 5.3 The Poisson Distribution
- 5.3.1 Limit of the Binomial Distribution
- 5.4 The Poisson Approximation
- 5.4.1* Example: Coupon Collector’s Problem, Revisited
- 5.5 Application: Hashing
- 5.5.1 Chain Hashing
- 5.5.2 Hashing: Bit Strings
- 5.5.3 Bloom Filters
- 5.5.4 Breaking Symmetry
- 5.6 Random Graphs
- 5.6.1 Random Graph Models
- 5.6.2 Application: Hamiltonian Cycles in Random Graphs
- 5.7 Exercises
- 5.8 An Exploratory Assignment
- 6 The Probabilistic Method
- 6.1 The Basic Counting Argument
- 6.2 The Expectation Argument
- 6.2.1 Application: Finding a Large Cut
- 6.2.2 Application: Maximum Satisfiability
- 6.3 Derandomization Using Conditional Expectations
- 6.4 Sample and Modify
- 6.4.1 Application: Independent Sets
- 6.4.2 Application: Graphs with Large Girth
- 6.5 The Second Moment Method
- 6.5.1 Application: Threshold Behavior in Random Graphs
- 6.6 The Conditional Expectation Inequality
- 6.7 The Lovász Local Lemma
- 6.7.1 Application: Edge-Disjoint Paths
- 6.7.2 Application: Satisfiability
- 6.8* Explicit Constructions Using the Local Lemma
- 6.8.1 Application: A Satisfiability Algorithm
- 6.9 Lovász Local Lemma: The General Case
- 6.10* The Algorithmic Lovász Local Lemma
- 6.11 Exercises
- 7 Markov Chains and Random Walks
- 7.1 Markov Chains: Definitions and Representations
- 7.1.1 Application: A Randomized Algorithm for 2-Satisfiability
- 7.1.2 Application: A Randomized Algorithm for 3-Satisfiability
- 7.2 Classification of States
- 7.2.1 Example: The Gambler’s Ruin
- 7.3 Stationary Distributions
- 7.3.1 Example: A Simple Queue
- 7.4 Random Walks on Undirected Graphs
- 7.4.1 Application: An s–t Connectivity Algorithm
- 7.5 Parrondo’s Paradox
- 7.6 Exercises
- 8 Continuous Distributions and the Poisson Process
- 8.1 Continuous Random Variables
- 8.1.1 Probability Distributions in R
- 8.1.2 Joint Distributions and Conditional Probability
- 8.2 The Uniform Distribution
- 8.2.1 Additional Properties of the Uniform Distribution
- 8.3 The Exponential Distribution
- 8.3.1 Additional Properties of the Exponential Distribution
- 8.3.2* Example: Balls and Bins with Feedback
- 8.4 The Poisson Process
- 8.4.1 Interarrival Distribution
- 8.4.2 Combining and Splitting Poisson Processes
- 8.4.3 Conditional Arrival Time Distribution
- 8.5 Continuous Time Markov Processes
- 8.6 Example: Markovian Queues
- 8.6.1 M/M/1 Queue in Equilibrium
- 8.6.2 M/M/1/K Queue in Equilibrium
- 8.6.3 The Number of Customers in an M/M/∞ Queue
- 8.7 Exercises
- 9 The Normal Distribution
- 9.1 The Normal Distribution
- 9.1.1 The Standard Normal Distribution
- 9.1.2 The General Univariate Normal Distribution
- 9.1.3 The Moment Generating Function
- 9.2* Limit of the Binomial Distribution
- 9.3 The Central Limit Theorem
- 9.4* Multivariate Normal Distributions
- 9.4.1 Properties of the Multivariate Normal Distribution
- 9.5 Application: Generating Normally Distributed Random Values
- 9.6 Maximum Likelihood Point Estimates
- 9.7 Application: EM Algorithm For a Mixture of Gaussians
- 9.8 Exercises
- 10 Entropy, Randomness, and Information
- 10.1 The Entropy Function
- 10.2 Entropy and Binomial Coefficients
- 10.3 Entropy: A Measure of Randomness
- 10.4 Compression
- 10.5* Coding: Shannon’s Theorem
- 10.6 Exercises
- 11 The Monte Carlo Method
- 11.1 The Monte Carlo Method
- 11.2 Application: The DNF Counting Problem
- 11.2.1 The Naïve Approach
- 11.2.2 A Fully Polynomial Randomized Scheme for DNF Counting
- 11.3 From Approximate Sampling to Approximate Counting
- 11.4 The Markov Chain Monte Carlo Method
- 11.4.1 The Metropolis Algorithm
- 11.5 Exercises
- 11.6 An Exploratory Assignment on Minimum Spanning Trees
- 12 Coupling of Markov Chains
- 12.1 Variation Distance and Mixing Time
- 12.2 Coupling
- 12.2.1 Example: Shuffling Cards
- 12.2.2 Example: Random Walks on the Hypercube
- 12.2.3 Example: Independent Sets of Fixed Size
- 12.3 Application: Variation Distance Is Nonincreasing
- 12.4 Geometric Convergence
- 12.5 Application: Approximately Sampling Proper Colorings
- 12.6 Path Coupling
- 12.7 Exercises
- 13 Martingales
- 13.1 Martingales
- 13.2 Stopping Times
- 13.2.1 Example: A Ballot Theorem
- 13.3 Wald’s Equation
- 13.4 Tail Inequalities for Martingales
- 13.5 Applications of the Azuma–Hoeffding Inequality
- 13.5.1 General Formalization
- 13.5.2 Application: Pattern Matching
- 13.5.3 Application: Balls and Bins
- 13.5.4 Application: Chromatic Number
- 13.6 Exercises
- 14 Sample Complexity, VC Dimension, and Rademacher Complexity
- 14.1 The Learning Setting
- 14.2 VC Dimension
- 14.2.1 Additional Examples of VC Dimension
- 14.2.2 Growth Function
- 14.2.3 VC Dimension Component Bounds
- 14.2.4 ε-nets and ε-samples
- 14.3 The ε-net Theorem
- 14.4 Application: PAC Learning
- 14.5 The ε-sample Theorem
- 14.5.1 Application: Agnostic Learning
- 14.5.2 Application: Data Mining
- 14.6 Rademacher Complexity
- 14.6.1 Rademacher Complexity and Sample Error
- 14.6.2 Estimating the Rademacher Complexity
- 14.6.3 Application: Agnostic Learning of a Binary Classification
- 14.7 Exercises
- 15 Pairwise Independence and Universal Hash Functions
- 15.1 Pairwise Independence
- 15.1.1 Example: A Construction of Pairwise Independent Bits
- 15.1.2 Application: Derandomizing an Algorithm for Large Cuts
- 15.1.3 Example: Constructing Pairwise Independent Values Modulo a Prime
- 15.2 Chebyshev’s Inequality for Pairwise Independent Variables
- 15.2.1 Application: Sampling Using Fewer Random Bits
- 15.3 Universal Families of Hash Functions
- 15.3.1 Example: A 2-Universal Family of Hash Functions
- 15.3.2 Example: A Strongly 2-Universal Family of Hash Functions
- 15.3.3 Application: Perfect Hashing
- 15.4 Application: Finding Heavy Hitters in Data Streams
- 15.5 Exercises
- 16 Power Laws and Related Distributions
- 16.1 Power Law Distributions: Basic Definitions and Properties
- 16.2 Power Laws in Language
- 16.2.1 Zipf’s Law and Other Examples
- 16.2.2 Languages via Optimization
- 16.2.3 Monkeys Typing Randomly
- 16.3 Preferential Attachment
- 16.3.1 A Formal Version
- 16.4 Using the Power Law in Algorithm Analysis
- 16.5 Other Related Distributions
- 16.5.1 Lognormal Distributions
- 16.5.2 Power Law with Exponential Cutoff
- 16.6 Exercises
- 17 Balanced Allocations and Cuckoo Hashing
- 17.1 The Power of Two Choices
- 17.1.1 The Upper Bound
- 17.2 Two Choices: The Lower Bound
- 17.3 Applications of the Power of Two Choices
- 17.3.1 Hashing
- 17.3.2 Dynamic Resource Allocation
- 17.4 Cuckoo Hashing
- 17.5 Extending Cuckoo Hashing
- 17.5.1 Cuckoo Hashing with Deletions
- 17.5.2 Handling Failures
- 17.5.3 More Choices and Bigger Bins
- 17.6 Exercises
- Further Reading
- Index