Chapter 4: Monte Carlo Methods


(1)

Chapter 4: Monte Carlo Methods

Paisan Nakmahachalasint [email protected]

http://pioneer.chula.ac.th/~npaisan

(2)

Introduction

• Monte Carlo methods are a class of computational algorithms that rely on repeated random sampling to compute their results.

• Monte Carlo simulation methods are especially useful in studying systems with a large number of coupled degrees of freedom.

(3)

Introduction

• Monte Carlo methods describe a large and widely used class of approaches that tend to follow a particular pattern:
  • Define a domain of possible inputs.
  • Generate inputs randomly from the domain using a specified probability distribution.
  • Perform a deterministic computation using the inputs.
  • Aggregate the results of the individual computations into the final result.

(4)

Random Number Generators

• Monte Carlo simulations are inherently probabilistic, and thus make frequent use of programs that generate random numbers.

• On a computer these numbers are not random at all: they are strictly deterministic and reproducible, but they look like a stream of random numbers. For this reason such programs are more correctly called pseudo-random number generators.

(5)

Random Number Generators

Essential Properties

• Repeatability: the same sequence should be produced with the same seeds.
• Randomness: should produce independent, uniformly distributed random variables that pass all statistical tests for randomness.
• Long period: the period should be much longer than the number of random numbers needed.
• Insensitivity to seeds: the period and randomness properties should not depend on the initial seeds.

(6)

Random Number Generators

Additional Properties

• Portability: should give the same results on different computers.
• Efficiency: should be fast (a small number of floating-point operations) and should not use much memory.
• Disjoint subsequences: different seeds should produce long, independent (disjoint) subsequences, so that there are no correlations between simulations with different initial seeds.
• Homogeneity: sequences of all bits should be random.

(7)

Random Number Generators

Middle-Square Method

• Start with a four-digit number x0, called the seed.
• Square it to obtain an eight-digit number (add leading zeros if necessary).
• Take the middle four digits as the next random number.

    n    0     1     2     3     4     5     6     7     8     9
    xn   2041  1656  7423  1009  0180  0324  1049  1004  0080  0064
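A minimal Python sketch of the middle-square step described above; the four-digit/eight-digit conventions follow the slide, while the function name and interface are illustrative.

    def middle_square(seed, count):
        # Square the current four-digit value, pad the square to eight digits
        # with leading zeros, and keep the middle four digits.
        x = seed
        values = []
        for _ in range(count):
            square = "%08d" % (x * x)
            x = int(square[2:6])
            values.append(x)
        return values

    print(middle_square(2041, 9))   # [1656, 7423, 1009, 180, 324, 1049, 1004, 80, 64]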

(8)

Random Number Generators

Linear Congruential Generator (LCG)

• Introduced by D. H. Lehmer in 1951. A great majority of the pseudo-random numbers used today are based on this method:

    xn+1 = (a·xn + b) (mod m)

• Example: if x0 = 7, a = 1, b = 7, m = 10, we obtain the sequence

    7, 4, 1, 8, 5, 2, 9, 6, 3, 0, 7, 4, …
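A minimal Python sketch of an LCG, reproducing the example sequence above (function name and interface are illustrative).

    def lcg(x0, a, b, m, count):
        # Linear congruential generator: x_{n+1} = (a*x_n + b) mod m
        x = x0
        values = []
        for _ in range(count):
            x = (a * x + b) % m
            values.append(x)
        return values

    print(lcg(7, 1, 7, 10, 11))   # [4, 1, 8, 5, 2, 9, 6, 3, 0, 7, 4]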

(9)

Random Number Generators

• Choice of the modulus, m
  • m should be as large as possible, since the period can never be larger than m.
  • One usually chooses m to be near the largest integer that can be represented.

• Choice of the multiplier, a
  It was proven by M. Greenberger in 1961 that the sequence has period m if and only if
  • b is relatively prime to m,
  • a − 1 is a multiple of p for every prime p dividing m,
  • a − 1 is a multiple of 4 whenever m is (4 | m implies 4 | a − 1).

(10)

Random Number Generators

• With b = 0, one cannot get the full period, but in order to get the maximum possible period the following should be satisfied:
  • x0 is relatively prime to m,
  • a is a primitive root modulo m.

• It is then possible to obtain a period of length m − 1, but usually the period is around m/4.

(11)

Random Number Generators

A Disadvantage of LCG

• RANDU is a popular random number generator distributed by IBM in the 1960s, with the algorithm

    xn+1 = 65539·xn (mod 2^31).

• This generator was later found to have a serious problem: points built from consecutive outputs fall onto a small number of hyperplanes.
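The following sketch, not part of the original slides, illustrates why this happens: since 65539 = 2^16 + 3, one has 65539^2 = 6·65539 − 9 (mod 2^31), so every RANDU output satisfies a short linear recurrence and consecutive triples lie on a few parallel planes.

    def randu(seed, count):
        # RANDU: x_{n+1} = 65539 * x_n (mod 2^31); the seed should be odd.
        x, values = seed, []
        for _ in range(count):
            x = (65539 * x) % 2**31
            values.append(x)
        return values

    # x_{n+2} = 6*x_{n+1} - 9*x_n (mod 2^31) holds for every triple.
    xs = randu(1, 1000)
    print(all((9 * xs[i] - 6 * xs[i + 1] + xs[i + 2]) % 2**31 == 0
              for i in range(len(xs) - 2)))   # True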

(12)

Random Number Generators

• 5000 random points in 1 and 2 dimensions generated from RANDU.

  [Figure: histogram of 5000 random numbers; distribution of 5000 points in 2 dimensions.]

(13)

Random Number Generators

• 5000 random points in 3 dimensions generated from RANDU, viewed from two different viewpoints.

(14)

Random Number Generators

LCGs used in rand() in various compilers (in all of the compilers below, m = 2^32):

    Source            a            b         Output bits of rand()
    ANSI C            1103515245   12345     16-30
    Borland Delphi    134775813    1         32-63
    Microsoft C/C++   214013       2531011   16-30
    Glibc (GCC)       1103515245   12345     0-30
    Borland C/C++     22695477     1         16-30

(15)

Random Number Generators

Lagged-Fibonacci Generator (LFG)

• Lagged Fibonacci pseudo-random number generators have become increasingly popular in recent years. These generators are so named because of their similarity to the familiar Fibonacci sequence. An LFG uses the recurrence

    xn = xn−l + xn−k (mod m),  where l > k > 0,

  with l initial values {x0, x1, …, xl−1}.

(16)

Random Number Generators

• For most applications of interest, m is a power of two, i.e., m = 2^M.

• With a proper choice of l, k, and the initial values, the period P of the LFG is (2^l − 1) × 2^(M−1).

• Two commonly used LFGs:

    xn = xn−17 + xn−5 (mod 2^31),   period ≈ 2^47
    xn = xn−55 + xn−24 (mod 2^31),  period ≈ 2^85
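A minimal Python sketch of an additive lagged Fibonacci generator; the (17, 5) lags follow the slide, while the seed values are arbitrary illustrative choices.

    from collections import deque

    def lfg(initial, l, k, m, count):
        # Additive LFG: x_n = x_{n-l} + x_{n-k} (mod m); `initial` holds the l seeds.
        assert len(initial) == l and l > k > 0
        buf = deque(initial, maxlen=l)      # keeps only the last l values
        values = []
        for _ in range(count):
            x = (buf[-l] + buf[-k]) % m     # x_{n-l} + x_{n-k} (mod m)
            buf.append(x)
            values.append(x)
        return values

    print(lfg(list(range(1, 34, 2)), l=17, k=5, m=2**31, count=5))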

(17)

Random Number Generators

• Unlike an LCG, the value of the modulus, m, does not by itself limit the period of the generator.

• An LFG is computationally simple: an integer add, a logical AND (to accomplish the mod 2^M operation), and the advancing of two array pointers are the only operations required to produce a new random number.

(18)

Random Number Generators

Non-uniform Random Numbers

• Most random number generators give uniformly distributed numbers.

• Non-uniform distributions can be simulated by
  • the inverse CDF method,
  • the acceptance/rejection method.

(19)

Random Number Generators

Inverse CDF Method

• If X is a random variable having the probability density function f(x) and the cumulative distribution function (CDF)

    F(x) = ∫_{−∞}^{x} f(t) dt,

  then the random variable U = F(X) is uniformly distributed on the interval [0, 1].

• Thus X = F^(−1)(U) has the desired distribution.
(20)

Random Number Generators

An exponential distribution has the density

    f(x) = λ·e^(−λx)  for x > 0,   f(x) = 0  otherwise,

with CDF F(x) = 1 − e^(−λx) for x ≥ 0.

• If u is uniform on (0, 1), then

    x = −(ln u)/λ

  has the exponential distribution.

  [Figure: plots of the density f(x) and the CDF F(x) for 0 ≤ x ≤ 3.]
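A minimal Python sketch of this inverse-CDF recipe for the exponential distribution (function name and the mean check are illustrative).

    import math, random

    def sample_exponential(lam, count):
        # Inverse CDF method: if U ~ Uniform(0,1), then X = -ln(U)/lam
        # has CDF F(x) = 1 - exp(-lam*x).
        # Using 1 - random.random() avoids taking the log of exactly zero.
        return [-math.log(1.0 - random.random()) / lam for _ in range(count)]

    xs = sample_exponential(2.0, 100000)
    print(sum(xs) / len(xs))   # close to the theoretical mean 1/lam = 0.5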

(21)

Random Number Generators

Acceptance/Rejection Method

• Suppose we can generate x having the density g(x), and f(x) ≤ c·g(x) for some constant c. Then we can generate x with the density f(x) as follows:
  • Generate x with the density g(x).
  • Generate a uniform random number u in (0, 1).
  • If u ≤ f(x) / (c·g(x)), then x is accepted; otherwise repeat the steps, starting with the generation of x.
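A minimal Python sketch of the acceptance/rejection recipe; the target density f(x) = 2x on [0, 1], the uniform proposal g, and the constant c = 2 are illustrative choices, not from the slides.

    import random

    def accept_reject(f, g_sample, g_pdf, c, count):
        # Draw x ~ g and accept it with probability f(x) / (c * g(x)).
        values = []
        while len(values) < count:
            x = g_sample()
            u = random.random()
            if u <= f(x) / (c * g_pdf(x)):
                values.append(x)
        return values

    # Target f(x) = 2x on [0, 1], proposal g = Uniform(0, 1), so c = 2.
    xs = accept_reject(lambda x: 2 * x, random.random, lambda x: 1.0, 2.0, 100000)
    print(sum(xs) / len(xs))   # close to the theoretical mean 2/3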

(22)

Random Number Generators

Normal Distribution

• The standard normal distribution can be generated with the Box-Muller method.

• Let u1 and u2 be independently distributed as uniform on the interval (0, 1). Then

    x1 = √(−2·ln u1) · cos(2π·u2)
    x2 = √(−2·ln u1) · sin(2π·u2)

  are independently distributed as the standard normal.
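A minimal Python sketch of the Box-Muller transform (the sample-moment check at the end is illustrative).

    import math, random

    def box_muller():
        # Turn two independent uniforms into two independent standard normals.
        u1 = 1.0 - random.random()          # avoid log(0)
        u2 = random.random()
        r = math.sqrt(-2.0 * math.log(u1))
        return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

    samples = [z for _ in range(50000) for z in box_muller()]
    mean = sum(samples) / len(samples)
    var = sum(z * z for z in samples) / len(samples) - mean ** 2
    print(mean, var)   # close to 0 and 1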

(23)

Mean, Variance, and Central Limit Theorem

• A random variable can be thought of as an unknown value that may change every time it is inspected. Thus, a random variable can be thought of as a function mapping the sample space of a random process to the real numbers.

• A random variable can be discrete or continuous.

(24)

Mean, Variance, and Central Limit Theorem

• The mean of a random variable is the integral of the random variable with respect to its probability measure:

    Discrete RVs:    E[X] = Σ_i xi·P(xi)
    Continuous RVs:  E[X] = ∫_{−∞}^{∞} x·f(x) dx

(25)

Mean, Variance, and Central Limit Theorem

• The variance of a variable X with the mean μ = E[X] is given by

    Var[X] = E[(X − μ)²] = E[X²] − μ²

    Discrete RVs:    Var[X] = Σ_i (xi − μ)²·P(xi)
    Continuous RVs:  Var[X] = ∫_{−∞}^{∞} (x − μ)²·f(x) dx

(26)

Mean, Variance, and Central Limit Theorem

Example: Binomial Distribution, X ~ B(n, p)

    P(X = x) = C(n, x)·p^x·(1 − p)^(n−x),   x = 0, 1, …, n

    E(X) = Σ_{x=0}^{n} x·P(x) = np
    Var(X) = Σ_{x=0}^{n} x²·P(x) − (np)² = np(1 − p)

(27)

Mean, Variance, and Central Limit Theorem

Example: Continuous Uniform Distribution, X ~ U(a, b)

    f(x) = 1/(b − a)  for a ≤ x ≤ b,   f(x) = 0  otherwise

    E(X) = ∫ x·f(x) dx = (a + b)/2
    Var(X) = ∫ (x − (a + b)/2)²·f(x) dx = (b − a)²/12

(28)

Mean, Variance, and Central Limit Theorem

Central Limit Theorem (CLT)

The sum of a sufficiently large number of independent and identically distributed (i.i.d.) random variables, each with finite mean and variance, will be approximately normally distributed.

(29)

Mean, Variance, and Central Limit Theorem

• Let X1, X2, …, Xn be a sequence of n i.i.d. random variables, each having finite expectation μ and variance σ² > 0.

• Let Sn = X1 + X2 + … + Xn. If we define

    Zn = (Sn − nμ) / (σ√n),

  then Zn → N(0, 1).

(30)

Mean, Variance, and Central Limit Theorem

Example: If a fair coin is tossed 10^6 times, and we let X = number of heads, then

    E[X] = np = 500,000
    Var[X] = np(1 − p) = 250,000

so that

    (X − 500,000) / 500 ≈ N(0, 1).
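A small Python sketch that checks this normal approximation empirically; the 200 repetitions are an arbitrary illustrative choice.

    import random

    def standardized_heads():
        # 10**6 fair coin flips: count the set bits of a random 10**6-bit integer.
        heads = bin(random.getrandbits(10**6)).count("1")
        return (heads - 500000) / 500

    zs = [standardized_heads() for _ in range(200)]
    print(sum(abs(z) < 1 for z in zs) / len(zs))   # roughly 0.68, as for N(0, 1)
    print(sum(abs(z) < 2 for z in zs) / len(zs))   # roughly 0.95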

(31)

Monte Carlo Simulation

• Monte Carlo simulation was named for Monte Carlo, Monaco, where the primary attractions are casinos containing games of chance. Games of chance such as roulette wheels, dice, and slot machines exhibit random behavior.

• Monte Carlo simulation randomly generates values for uncertain variables over and over to simulate a model.

(32)

Monte Carlo Simulation

Why Simulation?

• In some circumstances, it may not be feasible to observe the behavior directly or to conduct an experiment, for example:
  • a system of elevators during rush hour;
  • controlling automobile traffic.

• In other situations, the system for which alternative procedures need to be tested may not even exist yet, for example:
  • communication networks in an office building;
  • the locations of machines in a new industrial plant.

(33)

Monte Carlo Simulation

• In such instances, the modeler might simulate the behavior and test the various alternatives being considered to estimate how each affects the behavior. Options include:
  • building a scaled model (e.g. to study the drag force on a proposed submarine, or designs of a jet airplane);
  • Monte Carlo simulation, which is typically accomplished with the aid of a computer.

(34)

Monte Carlo Simulation

• Processes with an element of chance involved are called probabilistic, as opposed to deterministic.

• Monte Carlo simulation is a probabilistic model. It can also be used to approximate a deterministic behavior.

    Behavior: probabilistic or deterministic
    Model:    probabilistic or deterministic

(35)

Monte Carlo Simulation

Probabilistic Behavior: an Unfair Die

• Consider a probability model where each event is not equally likely. Assume a die is biased according to the following distribution.

    Roll value   1    2    3    4    5    6
    P(roll)      0.1  0.1  0.2  0.3  0.2  0.1

• Assign each face a sub-interval of [0, 1] whose length equals its probability (a short simulation using these intervals is sketched after the table):

    Value of xi   Assignment
    [0, 0.1]      ONE
    (0.1, 0.2]    TWO
    (0.2, 0.4]    THREE
    (0.4, 0.7]    FOUR
    (0.7, 0.9]    FIVE
    (0.9, 1.0]    SIX
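A minimal Python sketch that rolls the biased die by mapping a uniform random number to the intervals above (function and variable names are illustrative).

    import random

    CUTOFFS = [(0.1, 1), (0.2, 2), (0.4, 3), (0.7, 4), (0.9, 5), (1.0, 6)]

    def roll_biased_die():
        u = random.random()
        for upper, face in CUTOFFS:
            if u <= upper:
                return face

    rolls = [roll_biased_die() for _ in range(40000)]
    for face in range(1, 7):
        # relative frequencies; compare with the table on the next slide
        print(face, rolls.count(face) / len(rolls))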

(36)

Monte Carlo Simulation

Relative frequency of each face after a given number of rolls:

    Die Value   100     1000    5000    10000    40000    Expected Results
    1           0.130   0.098   0.094   0.0948   0.0948   0.1
    2           0.080   0.099   0.099   0.0992   0.0992   0.1
    3           0.210   0.199   0.192   0.1962   0.1962   0.2
    4           0.350   0.320   0.308   0.3082   0.3081   0.3
    5           0.280   0.184   0.201   0.2012   0.2011   0.2
    6           0.150   0.120   0.104   0.1044   0.1045   0.1

(37)

Monte Carlo Integration

• Area under a curve:

    area under curve / area of rectangle ≈ number of points counted below curve / total number of random points
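A minimal Python sketch of this hit-or-miss estimate, applied to the example on the next slide (the area under y = cos x on −π/2 ≤ x ≤ π/2, whose exact value is 2).

    import math, random

    def mc_area_under_cos(n_points):
        # Throw points into the bounding rectangle [-pi/2, pi/2] x [0, 1]
        # and count those falling below y = cos x.
        below = 0
        for _ in range(n_points):
            x = random.uniform(-math.pi / 2, math.pi / 2)
            y = random.random()
            if y <= math.cos(x):
                below += 1
        rectangle_area = math.pi * 1.0
        return rectangle_area * below / n_points

    for n in (100, 1000, 10000, 30000):
        print(n, mc_area_under_cos(n))   # approaches 2 as n grows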

(38)

Monte Carlo Integration

• The area under the curve y = cos x over the interval −π/2 ≤ x ≤ π/2.

    Number of points   Approximation to area      Number of points   Approximation to area
    100                2.07345                    2000               1.94465
    200                2.13628                    3000               1.97711
    300                2.01064                    4000               1.99962
    400                2.12058                    5000               2.01439
    500                2.04832                    6000               2.02319
    600                2.09440                    8000               2.00669
    700                2.02857                    10000              2.00873
    800                1.99491                    15000              2.00978
    900                1.99666                    20000              2.01093
    1000               1.96640                    30000              2.01186

(39)

Monte Carlo Integration

• Let the area under the curve be I, while the area of the rectangle is R. Suppose that n out of N random points fall under the curve; then we estimate the integral to be

    Î = R·n/N

• There is a probability I/R that each random point will fall under the curve.

(40)

Monte Carlo Integration

• Thus, the number of random points that fall under the curve will be binomially distributed:

    E(n) = N·(I/R)
    Var(n) = N·(I/R)·(1 − I/R)

(41)

Monte Carlo Integration

• We calculate

    E(Î) = (R/N)·E(n) = I
    Var(Î) = (R/N)²·Var(n) = I(R − I)/N

• For ∫_{−π/2}^{π/2} cos x dx we have I = 2, R = π, so

    Var(Î) = 2(π − 2)/N ≈ (1.5)²/N

(42)

Monte Carlo Optimization

• Powerful applications of Monte Carlo simulation lie in numerical optimization.

• There are many Monte Carlo optimization methods, including genetic algorithms and simulated annealing.

• These methods are relatively simple, but can effectively solve many famous complex problems.

(43)

Monte Carlo Optimization

Simulated Annealing

• Simulated annealing draws an analogy with thermodynamics, specifically with the way liquids freeze and crystallize, or metals cool and anneal.

• At high temperatures, the molecules of a liquid move freely. If the liquid is cooled slowly, thermal mobility is lost: atoms line up to form a pure crystal, which is the state of minimum energy.

(44)

Monte Carlo Optimization

• If the liquid is cooled quickly, or quenched, it will not reach the ground state but ends up in some other state having somewhat higher energy.

• The essence of the process is slow cooling, allowing ample time for redistribution of the atoms as they lose mobility. This is the technical definition of annealing.

(45)

Monte Carlo Optimization

• A system in thermal equilibrium at temperature T has its energy probabilistically distributed among all the different energy states E according to the Boltzmann distribution:

    P(E) ∝ e^(−E/kT)

  where k is Boltzmann's constant.

(46)

Monte Carlo Optimization

• At low temperatures, the system can still be in a high-energy state, with a small probability. This allows the system to get out of a local minimum in favor of finding a better, more global, one.

• As the temperature is lowered, the system becomes less likely to be in a higher-energy state.

(47)

Monte Carlo Optimization

• In 1953, Metropolis et al. proposed that a system changes its configuration from energy E1 to energy E2 with probability

    P(E1 → E2) = e^(−(E2 − E1)/kT)   if E2 ≥ E1
    P(E1 → E2) = 1                   if E2 < E1

  This general scheme is known as the Metropolis algorithm.
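A minimal Python sketch of the Metropolis acceptance rule (function name and signature are illustrative).

    import math, random

    def metropolis_accept(e1, e2, kT):
        # Always accept a move that lowers the energy; otherwise accept
        # with probability exp(-(e2 - e1)/kT).
        if e2 < e1:
            return True
        return random.random() < math.exp(-(e2 - e1) / kT)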

(48)

Monte Carlo Optimization

• To make use of the Metropolis algorithm in general systems, we must provide:
  • the possible system configurations;
  • random changes in the configuration;
  • an objective function E to be minimized;
  • a control parameter T and an annealing schedule.

• These can be devised from physical insight or from trial-and-error experiments.

(49)

Monte Carlo Optimization

The Traveling Salesman Problem

• A salesman visits N cities with given positions (xi, yi), finally returning to the city of origin. Each city is to be visited only once, and the route is to be made as short as possible.

• The cities are numbered i = 1, 2, …, N.

(50)

Monte Carlo Optimization

• Configuration: a configuration is the order in which the cities are visited; thus, it is a permutation of 1, 2, …, N.

• Rearrangements: efficient moves are suggested by
  • Reversal: a section of the path is removed and replaced with the same cities in the opposite order.
  • Transport: a section of the path is removed and replaced in between two cities on another, randomly chosen, part of the path.

(51)

Monte Carlo Optimization

• Objective Function: E is just the total length of the journey,

    E = Σ_{i=1}^{N} √[(xi − xi+1)² + (yi − yi+1)²],

  where city N + 1 is just city 1.
(52)

Monte Carlo Optimization

• Annealing Schedule: requires some experimentation. First generate some random rearrangements to determine the range of ΔE, then proceed in multiplicative steps, each amounting to a 10% decrease in T. Hold each new T for 100N reconfigurations, or for 10N successful reconfigurations, whichever comes first. Continue until efforts to reduce E further become discouraging. A compact sketch combining these ingredients follows.
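A compact Python sketch of simulated annealing for the traveling salesman problem along the lines above; the reversal move, the 10% cooling steps, and the 100N/10N-per-temperature rule follow the slides, while the initial temperature, the number of temperature steps, and the random city coordinates are illustrative choices.

    import math, random

    def tour_length(cities, order):
        # Total length of the closed tour (city N+1 is city 1).
        return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
                   for i in range(len(order)))

    def anneal_tsp(cities, T0=1.0, cooling=0.9, n_temps=100):
        n = len(cities)
        order = list(range(n))
        e = tour_length(cities, order)
        T = T0
        for _ in range(n_temps):
            successes = 0
            for _ in range(100 * n):                 # at most 100N trials per temperature
                i, j = sorted(random.sample(range(n), 2))
                new = order[:i] + order[i:j + 1][::-1] + order[j + 1:]   # reversal move
                e_new = tour_length(cities, new)
                if e_new < e or random.random() < math.exp(-(e_new - e) / T):
                    order, e = new, e_new
                    successes += 1
                    if successes >= 10 * n:          # ...or 10N successful moves
                        break
            T *= cooling                             # 10% decrease in T
        return order, e

    cities = [(random.random(), random.random()) for _ in range(30)]
    best_order, best_length = anneal_tsp(cities)
    print(best_length)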
