
Chapter 4: Monte Carlo Methods

Paisan Nakmahachalasint [email protected] http://pioneer.chula.ac.th/~npaisan

• Monte Carlo methods are a class of computational algorithms that rely on repeated random sampling to compute their results.

• Monte Carlo simulation methods are especially useful in studying systems with a large number of coupled degrees of freedom.

Introduction

• Monte Carlo methods describe a large and widely used class of approaches which tend to follow a particular pattern:

1. Define a domain of possible inputs.

2. Generate inputs randomly from the domain using a specified probability distribution.

3. Perform a deterministic computation using the inputs.

4. Aggregate the results of the individual computations into the final result.
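
As a concrete illustration of this pattern (an example of ours, not from the slides), the classic estimate of π from points sampled in the unit square follows all four steps. A minimal Python sketch using only the standard library:

```python
import random

def estimate_pi(num_points=100_000, seed=0):
    """Estimate pi by the four-step Monte Carlo pattern."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(num_points):
        # Steps 1-2. Domain: the unit square; draw (x, y) uniformly from it.
        x, y = rng.random(), rng.random()
        # Step 3. Deterministic computation: is the point inside the quarter circle?
        if x * x + y * y <= 1.0:
            hits += 1
    # Step 4. Aggregate: the hit fraction estimates pi/4.
    return 4.0 * hits / num_points

print(estimate_pi())   # roughly 3.14
```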

Random Number Generators

• Monte Carlo simulations are inherently probabilistic, and thus make frequent use of programs to generate random numbers.

• On a computer these numbers are not random at all: they are strictly deterministic and reproducible, but they look like a stream of random numbers. For this reason such programs are more correctly called pseudo-random number generators.

Random Number Generators

Essential Properties

• Repeatability: the same sequence should be produced with the same seeds.

• Randomness: should produce independent, uniformly distributed random variables that pass all statistical tests for randomness.

• Long period: the period should be much longer than the amount of random numbers needed.

• Insensitive to seeds: period and randomness properties should not depend on the initial seeds.

Random Number Generators

Additional Properties

• Portability: should give the same results on different computers.

• Efficiency: should be fast (small number of floating point operations) and not use much memory.

• Disjoint subsequences: different seeds should produce long independent (disjoint) subsequences so that there are no correlations between simulations with different initial seeds.

• Homogeneity: sequences of all bits should be random.


Random Number Generators

Middle-Square Method

• Start with a four-digit number x_0, called the seed.

• Square it to obtain an eight-digit number (add leading zeros if necessary).

• Take the middle four digits as the next random number.

n     0     1     2     3     4     5     6     7     8     9
x_n   2041  1656  7423  1009  0180  0324  1049  1004  0080  0064
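
A minimal Python sketch of the middle-square rule that reproduces the table above (the function name and padding details are our own choices):

```python
def middle_square(seed, count):
    """Generate `count` four-digit middle-square numbers starting from `seed`."""
    values = [seed]
    x = seed
    for _ in range(count):
        sq = f"{x * x:08d}"   # square, padded to eight digits with leading zeros
        x = int(sq[2:6])      # keep the middle four digits
        values.append(x)
    return values

print(middle_square(2041, 9))
# [2041, 1656, 7423, 1009, 180, 324, 1049, 1004, 80, 64]
```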

Random Number Generators

Linear Congruential Generator (LCG)

• Introduced by D. H. Lehmer in 1951. A great majority of the pseudo-random numbers used today are based on this method:

x_{n+1} = (a·x_n + b) mod m

Example: if x_0 = 7, a = 1, b = 7, m = 10, we obtain the following sequence:

7, 4, 1, 8, 5, 2, 9, 6, 3, 0, 7, 4, …
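
A minimal sketch of an LCG as a Python generator, reproducing the example sequence above (the generator-style interface is our choice):

```python
from itertools import islice

def lcg(x0, a, b, m):
    """Yield the linear congruential sequence x_{n+1} = (a*x_n + b) mod m."""
    x = x0
    while True:
        yield x
        x = (a * x + b) % m

print(list(islice(lcg(x0=7, a=1, b=7, m=10), 12)))
# [7, 4, 1, 8, 5, 2, 9, 6, 3, 0, 7, 4]
```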

Random Number Generators

• Choice of the modulus, m:

• m should be as large as possible, since the period can never be larger than m.

• One usually chooses m to be near the largest integer that can be represented.

• Choice of the multiplier, a:

It was proven by M. Greenberger in 1961 that the sequence will have period m if and only if

• b is relatively prime to m,

• a − 1 is a multiple of p for every prime p dividing m,

• if 4 | m, then 4 | (a − 1).

Random Number Generators

• With b = 0 one cannot get the full period, but in order to get the maximum possible period, the following should be satisfied:

• x_0 is relatively prime to m,

• a is a primitive root modulo m.

• It is then possible to obtain a period of length m − 1, but usually the period is around m/4.

Random Number Generators

A Disadvantage of LCG

• RANDU is a popular random number generator distributed by IBM in the 1960s, based on the recurrence

x_{n+1} = 65539·x_n mod 2^31.

• This generator was later found to have a serious problem: successive points fall onto a small number of parallel hyperplanes (only 15 planes in three dimensions).

Random Number Generators

• 5000 random points in one and two dimensions generated from RANDU.

[Figures: histogram of 5000 random numbers; distribution of 5000 points in two dimensions.]


• 5000 random points in three dimensions generated from RANDU, viewed from two different viewpoints.
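
The defect is easy to reproduce without any plotting. The sketch below (ours, not from the slides) generates consecutive RANDU triples and shows that each triple lies on one of at most 15 parallel planes:

```python
def randu(seed):
    """IBM's RANDU: x_{n+1} = 65539 * x_n (mod 2^31), scaled to (0, 1)."""
    x = seed
    while True:
        x = (65539 * x) % 2**31
        yield x / 2**31

gen = randu(seed=1)
stream = [next(gen) for _ in range(15_000)]
# Consecutive triples (u_n, u_{n+1}, u_{n+2}); plotted in 3-D they fall on parallel planes.
triples = list(zip(stream[0::3], stream[1::3], stream[2::3]))
# Because x_{n+2} = 6*x_{n+1} - 9*x_n (mod 2^31), the combination below is always an integer,
# so every triple lies on one of at most 15 planes 9u - 6v + w = const inside the unit cube.
plane_indices = sorted({round(9 * u - 6 * v + w) for u, v, w in triples})
print(plane_indices)
```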

LCGs used by rand() in various compilers (m = 2^32 in all cases):

Source             a            b          Output bits in rand()
ANSI C             1103515245   12345      16 - 30
Borland Delphi     134775813    1          32 - 63
Microsoft C/C++    214013       2531011    16 - 30
Glibc (GCC)        1103515245   12345      0 - 30
Borland C/C++      22695477     1          16 - 30

Random Number Generators

Lagged-Fibonacci Generator (LFG)

• Lagged-Fibonacci pseudo-random number generators have become increasingly popular in recent years. These generators are so named because of their similarity to the familiar Fibonacci sequence. An LFG uses the recurrence

x_n = x_{n−l} + x_{n−k} (mod m), where l > k > 0,

with l initial values {x_0, x_1, …, x_{l−1}}.

Random Number Generators

• For most applications of interest, m is a power of two, i.e., m = 2^M.

• With a proper choice of l, k, and the initial values, the period P of the LFG is (2^l − 1)·2^(M−1).

• Two commonly used LFGs:

x_n = x_{n−17} + x_{n−5} (mod 2^31), period ≈ 2^47
x_n = x_{n−55} + x_{n−24} (mod 2^31), period ≈ 2^85

Random Number Generators

• Unlike the LCG, the value of the modulus m does not by itself limit the period of the generator.

• An LFG is computationally simple: an integer add, a logical AND (to accomplish the mod 2^M operation), and the advancing of two array pointers are the only operations required to produce a new random number.
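
A minimal sketch of an additive lagged-Fibonacci generator with the (l, k) = (17, 5) lags mentioned above; the seeding scheme is an arbitrary choice of ours, not part of the slides:

```python
from collections import deque

def lfg(lag_l=17, lag_k=5, m=2**31, seed_values=None):
    """Additive lagged-Fibonacci generator: x_n = x_{n-l} + x_{n-k} (mod m)."""
    if seed_values is None:
        # Any l initial values work for m = 2^M as long as at least one is odd;
        # here we simply force them all odd (an arbitrary seeding choice).
        seed_values = [((1103515245 * i + 12345) % m) | 1 for i in range(1, lag_l + 1)]
    state = deque(seed_values, maxlen=lag_l)
    while True:
        # state[-lag_l] is x_{n-l}; state[-lag_k] is x_{n-k}
        x = (state[-lag_l] + state[-lag_k]) % m   # one add and one mod (an AND when m = 2^M)
        state.append(x)
        yield x

gen = lfg()
print([next(gen) for _ in range(5)])
```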

Random Number Generators

Non-uniform Random Numbers

• Most random number generators give uniformly distributed numbers.

• Non-uniform distributions can be simulated by:

• the inverse CDF method,

• the acceptance/rejection method.


Random Number Generators

Inverse CDF Method

• If X is a random variable having the probability density function f(x) and the cumulative distribution function (CDF)

F(x) = ∫_{−∞}^{x} f(t) dt,

then the random variable U = F(X) is uniformly distributed on the interval [0, 1].

• Thus X = F^{−1}(U) has the desired distribution.

Random Number Generators

• An exponential distribution has density

f(x) = λ·e^{−λx} for x > 0, and f(x) = 0 for x ≤ 0,

so its CDF is F(x) = 1 − e^{−λx} for x ≥ 0.

• If u is uniform on (0, 1), then

x = −(ln u)/λ

has the exponential distribution.

[Figures: plots of f(x) and F(x) for the exponential distribution.]
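
A short sketch of inverse-CDF sampling for this exponential example (the sample-mean check at the end is ours):

```python
import math
import random

def exponential_by_inverse_cdf(lam, n, seed=0):
    """Draw n exponential(lambda) samples via x = -ln(u)/lambda with u ~ U(0,1)."""
    rng = random.Random(seed)
    # Use 1 - random() so the argument of log() lies in (0, 1] and never hits 0.
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

samples = exponential_by_inverse_cdf(lam=2.0, n=100_000)
print(sum(samples) / len(samples))   # should be close to 1/lambda = 0.5
```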

Random Number Generators

Acceptance/Rejection Method

• Suppose we can generate x having the density g(x), and f(x) ≤ c·g(x) for some constant c.

• We can then generate x with the density f(x) as follows:

• Generate x with the density g(x).

• Generate a uniform random number u in (0, 1).

• If u ≤ f(x) / (c·g(x)), then x is accepted; otherwise repeat the steps starting with the generation of x.
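
A sketch of the acceptance/rejection method for a concrete target chosen by us (f(x) = 2x on (0, 1), with the uniform proposal g(x) = 1 and c = 2):

```python
import random

def rejection_sample(f, sample_g, g, c, rng):
    """Draw one value with density f using proposals from g, where f(x) <= c*g(x)."""
    while True:
        x = sample_g(rng)                 # step 1: x with density g
        u = rng.random()                  # step 2: u uniform on (0, 1)
        if u <= f(x) / (c * g(x)):        # step 3: accept with probability f(x)/(c*g(x))
            return x

rng = random.Random(0)
f = lambda x: 2.0 * x        # target density on (0, 1)
g = lambda x: 1.0            # proposal density: uniform on (0, 1)
samples = [rejection_sample(f, lambda r: r.random(), g, c=2.0, rng=rng)
           for _ in range(50_000)]
print(sum(samples) / len(samples))   # E[X] = 2/3 for f(x) = 2x on (0, 1)
```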

Random Number Generators

Normal Distribution

• The standard normal distribution can be generated with the Box-Muller method.

• Let u_1 and u_2 be independently distributed as uniform on the interval (0, 1). Then

x_1 = sqrt(−2 ln u_1)·cos(2π u_2)
x_2 = sqrt(−2 ln u_1)·sin(2π u_2)

are independently distributed as the standard normal.
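
A minimal Box-Muller sketch with an empirical check of the mean and variance (the check is ours):

```python
import math
import random

def box_muller(rng):
    """Return two independent standard normal deviates from two uniforms."""
    u1, u2 = 1.0 - rng.random(), rng.random()   # 1 - u1 keeps log() away from zero
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

rng = random.Random(1)
zs = [z for _ in range(50_000) for z in box_muller(rng)]
mean = sum(zs) / len(zs)
var = sum(z * z for z in zs) / len(zs) - mean * mean
print(mean, var)   # close to 0 and 1
```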

Mean, Variance, and Central Limit Theorem

• A random variable can be thought of as an unknown value that may change every time it is inspected. Thus, a random variable can be thought of as a function mapping the sample space of a random process to the real numbers.

• A random variable can be discrete or continuous.

Mean, Variance, and Central Limit Theorem

• The mean of a random variable is the integral of the random variable with respect to its probability measure:

E[X] = Σ_i x_i·P(x_i)             for discrete random variables,
E[X] = ∫_{−∞}^{∞} x·f(x) dx       for continuous random variables.


• The variance of a random variable X with mean μ = E[X] is given by

Var[X] = E[(X − μ)^2] = E[X^2] − μ^2,

that is,

Var[X] = Σ_i (x_i − μ)^2·P(x_i)             for discrete random variables,
Var[X] = ∫_{−∞}^{∞} (x − μ)^2·f(x) dx       for continuous random variables.

Example: Binomial Distribution, X ~ B(n, p)

P(X = x) = C(n, x)·p^x·(1 − p)^{n−x},   x = 0, 1, …, n

E(X) = Σ_{x=0}^{n} x·P(x) = np
Var(X) = Σ_{x=0}^{n} x^2·P(x) − (np)^2 = np(1 − p)

Mean, Variance, and Central Limit Theorem

Example: Continuous Uniform Distribution, X ~ U(a, b)

f(x) = 1/(b − a)   for a ≤ x ≤ b,   and 0 otherwise

E(X) = ∫_{−∞}^{∞} x·f(x) dx = (a + b)/2
Var(X) = ∫_{−∞}^{∞} x^2·f(x) dx − ((a + b)/2)^2 = (b − a)^2/12
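
A quick simulation check of these mean and variance formulas; the particular parameters (n = 20, p = 0.3, a = 2, b = 5) and sample sizes are illustrative choices of ours:

```python
import random

rng = random.Random(42)

# Binomial(n=20, p=0.3): theory gives mean np = 6 and variance np(1-p) = 4.2.
binom = [sum(rng.random() < 0.3 for _ in range(20)) for _ in range(100_000)]

# Uniform(a=2, b=5): theory gives mean (a+b)/2 = 3.5 and variance (b-a)^2/12 = 0.75.
unif = [rng.uniform(2.0, 5.0) for _ in range(100_000)]

for name, xs in [("binomial", binom), ("uniform", unif)]:
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    print(name, round(mean, 3), round(var, 3))
```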

Mean, Variance, and Central Limit Theorem

Central Limit Theorem (CLT)

The sum of a sufficiently large number of independent and identically distributed (i.i.d.) random variables, each with finite mean and variance, will be approximately normally distributed.

Mean, Variance, and Central Limit Theorem

• Let X_1, X_2, …, X_n be a sequence of n i.i.d. random variables, each having finite expectation μ and variance σ^2 > 0.

• Let S_n = X_1 + X_2 + … + X_n. If we define

Z_n = (S_n − nμ) / (σ·√n),

then Z_n → N(0, 1) as n → ∞.

Mean, Variance, and Central Limit Theorem

Example: If a fair coin is tossed 10^6 times and we let X = number of heads, then

E[X] = np = 500,000
Var[X] = np(1 − p) = 250,000,

so (X − 500,000)/500 is approximately N(0, 1).
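
A sketch that checks this standardization against the standard normal by simulation (the use of getrandbits to toss 10^6 fair coins at once is our shortcut):

```python
import random

rng = random.Random(7)
N_TOSSES, TRIALS = 1_000_000, 1_000

zs = []
for _ in range(TRIALS):
    heads = rng.getrandbits(N_TOSSES).bit_count()   # 1-bits = heads (needs Python 3.10+)
    zs.append((heads - 500_000) / 500)              # standardize: (X - np) / sqrt(np(1-p))

print(sum(abs(z) <= 1 for z in zs) / TRIALS)   # close to the normal value 0.683
print(sum(abs(z) <= 2 for z in zs) / TRIALS)   # close to 0.954
```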


Monte Carlo Simulation

• Monte Carlo simulation was named for Monte Carlo, Monaco, where the primary attractions are casinos containing games of chance. Games of chance such as roulette wheels, dice, and slot machines exhibit random behavior.

• Monte Carlo simulation randomly generates values for uncertain variables over and over to simulate a model.

Monte Carlo Simulation

Why Simulation?

• In some circumstances, it may not be feasible to observe the behavior directly or to conduct experiments:

• a system of elevators during rush hour,

• automobile traffic control.

• In other situations, the system for which alternative procedures need to be tested may not even exist yet:

• communication networks in an office building,

• locations of machines in a new industrial plant.

Monte Carlo Simulation

• In such instances, the modeler might simulate the behavior and test the various alternatives being considered to estimate how each affects the behavior.

• One option is to build a scaled model, for example to study:

• the drag force on a proposed submarine,

• designs of a jet airplane.

• The other option is Monte Carlo simulation, which is typically accomplished with the aid of a computer.

Monte Carlo Simulation

• Processes with an element of chance involved are called probabilistic, as opposed to deterministic.

• Monte Carlo simulation is a probabilistic model. It can also be used to approximate a deterministic behavior.

Behavior:  probabilistic or deterministic
Model:     probabilistic or deterministic

Monte Carlo Simulation

Probabilistic Behavior: an Unfair Die

• Consider a probability model where each event is not equally likely. Assume a die is biased according to the following distribution.

Roll value   P(roll)
1            0.1
2            0.1
3            0.2
4            0.3
5            0.2
6            0.1

• A uniform random number x_i in [0, 1] is assigned to a roll value as follows.

Value of x_i   Assignment
[0, 0.1]       ONE
(0.1, 0.2]     TWO
(0.2, 0.4]     THREE
(0.4, 0.7]     FOUR
(0.7, 0.9]     FIVE
(0.9, 1.0]     SIX

Monte Carlo Simulation

Observed relative frequency of each roll value after a given number of simulated rolls:

Die value   100     1000    5000    10000    40000    Expected results
1           0.130   0.098   0.094   0.0948   0.0948   0.1
2           0.080   0.099   0.099   0.0992   0.0992   0.1
3           0.210   0.199   0.192   0.1962   0.1962   0.2
4           0.350   0.320   0.308   0.3082   0.3081   0.3
5           0.280   0.184   0.201   0.2012   0.2011   0.2
6           0.150   0.120   0.104   0.1044   0.1045   0.1
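
A sketch of the die simulation itself, mapping uniform random numbers through the cumulative assignment table above (the seed and roll count are ours):

```python
import random

def roll_biased_die(rng):
    """Map a uniform number in [0, 1) to a face using the cumulative assignment table."""
    x = rng.random()
    cumulative = [(0.1, 1), (0.2, 2), (0.4, 3), (0.7, 4), (0.9, 5), (1.0, 6)]
    for upper, face in cumulative:
        if x <= upper:
            return face

rng = random.Random(0)
n = 40_000
counts = {face: 0 for face in range(1, 7)}
for _ in range(n):
    counts[roll_biased_die(rng)] += 1
print({face: counts[face] / n for face in counts})   # compare with the expected column
```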


• Area under a curve:

(area under curve) / (area of rectangle) ≈ (number of points counted below curve) / (total number of random points)

• The area under the curve y = cos x over the interval −π/2 ≤ x ≤ π/2.

Number of points   Approximation to area      Number of points   Approximation to area
100                2.07345                    2000               1.94465
200                2.13628                    3000               1.97711
300                2.01064                    4000               1.99962
400                2.12058                    5000               2.01439
500                2.04832                    6000               2.02319
600                2.09440                    8000               2.00669
700                2.02857                    10000              2.00873
800                1.99491                    15000              2.00978
900                1.99666                    20000              2.01093
1000               1.9664                     30000              2.01186
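
A sketch of the hit-or-miss estimate that produces numbers like those in the table (the seed and the particular point counts are ours):

```python
import math
import random

def area_under_cos(num_points, seed=0):
    """Hit-or-miss estimate of the area under y = cos(x) on [-pi/2, pi/2]."""
    rng = random.Random(seed)
    rect_area = math.pi * 1.0                      # enclosing rectangle: width pi, height 1
    below = 0
    for _ in range(num_points):
        x = rng.uniform(-math.pi / 2, math.pi / 2)
        y = rng.uniform(0.0, 1.0)
        if y <= math.cos(x):
            below += 1
    return rect_area * below / num_points

for n in (100, 1000, 10_000, 30_000):
    print(n, round(area_under_cos(n), 5))          # converges toward the exact value 2
```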

Monte Carlo Integration

• Let the area under the curve be I, while the area of the rectangle is R. Suppose that n out of N random points fall under the curve; then we estimate the integral to be

Î = R·(n/N).

• There is a probability I/R that each random point will fall under the curve.

Monte Carlo Integration

• Thus, the number of random points that fall under the curve will be binomially distributed:

E(n) = N·(I/R)
Var(n) = N·(I/R)·(1 − I/R)

Monte Carlo Integration

• We calculate

E(Î) = (R/N)·E(n) = I
Var(Î) = (R/N)^2·Var(n) = I·(R − I)/N

• For ∫_{−π/2}^{π/2} cos x dx, we have I = 2 and R = π, so

Var(Î) = 2(π − 2)/N ≈ (1.5)^2/N,

i.e., the standard deviation of the estimate is about 1.5/√N.

Monte Carlo Optimization

• Powerful applications of Monte Carlo simulation lie in numerical optimization.

• There are many Monte Carlo optimization methods, including genetic algorithms and simulated annealing.

• The method is relatively simple, but it can effectively solve many famous complex problems.


Monte Carlo Optimization

Simulated Annealing

• Simulated annealing is an analogy with thermodynamics, in which liquids freeze and crystallize, or metals cool and anneal.

• At high temperatures, the molecules of a liquid move freely. If the liquid is cooled slowly, thermal mobility is lost: the atoms line up to form a pure crystal, which is the state of minimum energy.

Monte Carlo Optimization

• If the liquid is cooled quickly, or quenched, it will not reach the ground state but will end up in some other state having somewhat higher energy.

• The essence of the process is slow cooling, allowing ample time for redistribution of the atoms as they lose mobility. This is the technical definition of annealing.

Monte Carlo Optimization

• A system in thermal equilibrium at temperature T has its energy probabilistically distributed among all different energy states E according to the Boltzmann distribution

P(E) ∝ e^{−E/(kT)},

where k is Boltzmann's constant.

Monte Carlo Optimization

• At low temperatures, the system can still be in a high-energy state with a small probability. This allows the system to get out of a local minimum in favor of finding a better, more global one.

• As the temperature is lowered, the system will be less likely to be in a higher-energy state.

Monte Carlo Optimization

• In 1953, Metropolis et al. proposed that a system will change its configuration from energy E_1 to energy E_2 with probability

P(E_1 → E_2) = e^{−(E_2 − E_1)/(kT)}   if E_2 ≥ E_1,
P(E_1 → E_2) = 1                        if E_2 < E_1.

This general scheme is known as the Metropolis algorithm.
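
A sketch of a single Metropolis acceptance decision, with k absorbed into the temperature as is commonly done (the function and its arguments are ours):

```python
import math
import random

def metropolis_accept(e_current, e_proposed, temperature, rng):
    """Accept a move from energy e_current to e_proposed with Metropolis probability."""
    if e_proposed < e_current:
        return True                                   # downhill moves are always accepted
    # Uphill moves are accepted with probability exp(-(E2 - E1) / T).
    return rng.random() < math.exp(-(e_proposed - e_current) / temperature)

rng = random.Random(0)
print(metropolis_accept(1.0, 0.5, temperature=1.0, rng=rng))   # True
print(metropolis_accept(1.0, 2.0, temperature=0.1, rng=rng))   # almost always False
```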

Monte Carlo Optimization

• To make use of the Metropolis algorithm in general systems, we must provide:

• possible system configurations,

• random changes in the configuration,

• an objective function E to be minimized,

• a control parameter T and an annealing schedule.

• These could be accomplished with physical insight or with trial-and-error experiments.


The Traveling Salesman Problem

• A salesman visits N cities with given positions (x_i, y_i), finally returning to the city of origin. Each city is to be visited only once, and the route is to be made as short as possible.

• The cities are numbered i = 1, 2, …, N.

• Configuration: a configuration is the order in which the cities are visited; thus, it is a permutation of 1, 2, …, N.

• Rearrangements: efficient suggested moves are

• Reversal: a section of the path is removed and replaced with the same cities in the opposite order.

• Transport: a section of the path is removed and re-inserted between two cities on another, randomly chosen part of the path.

Monte Carlo Optimization

• Objective function: E is just the total length of the journey,

E = Σ_{i=1}^{N} sqrt((x_{i+1} − x_i)^2 + (y_{i+1} − y_i)^2),

where city N + 1 is just city 1.

Monte Carlo Optimization

• Annealing schedule: this requires some experimentation. First generate some random rearrangements to determine the range of ΔE, then proceed in multiplicative steps, each amounting to a 10% decrease in T. Hold each new T for 100N reconfigurations or 10N successful reconfigurations, whichever comes first, until further efforts to reduce E become discouraging.
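
A compact simulated-annealing sketch for the traveling salesman problem, using only the reversal move and a simplified version of this schedule; all parameters (starting temperature, cooling factor, stopping rule, number of cities) are illustrative choices of ours:

```python
import math
import random

def tour_length(cities, order):
    """Total length of the closed tour visiting cities in the given order."""
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def anneal_tsp(cities, seed=0):
    rng = random.Random(seed)
    n = len(cities)
    order = list(range(n))
    energy = tour_length(cities, order)
    temperature = 1.0                      # rough scale of a typical delta-E for unit-square cities
    while temperature > 1e-3:              # simplified stopping rule
        for _ in range(100 * n):           # hold T for ~100N attempted reconfigurations
            i, j = sorted(rng.sample(range(n), 2))
            candidate = order[:i] + order[i:j + 1][::-1] + order[j + 1:]   # reversal move
            delta = tour_length(cities, candidate) - energy
            # Metropolis acceptance: always take downhill moves, sometimes uphill ones.
            if delta < 0 or rng.random() < math.exp(-delta / temperature):
                order, energy = candidate, energy + delta
        temperature *= 0.9                 # 10% decrease in T per step
    return order, energy

city_rng = random.Random(3)
cities = [(city_rng.random(), city_rng.random()) for _ in range(20)]
best_order, best_length = anneal_tsp(cities)
print(round(best_length, 3))
```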
