Chapter 13: Introduction to Simulation Using R
A. Rakhshan and H. Pishro-Nik

13.1 Analysis versus Computer Simulation

A computer simulation is a computer program which attempts to represent the real world based on a model. The accuracy of the simulation depends on the precision of the model. Suppose that the probability of heads in a coin toss experiment is unknown. We can perform the experiment of tossing the coin n times repeatedly to approximate the probability of heads:

P(H) ≈ (Number of times heads observed) / (Number of times the experiment executed).

However, for many practical problems it is not possible to determine the probabilities by executing experiments a large number of times. With today's computers' processing capabilities, we only need a high-level language, such as R, which can generate random numbers, to deal with these problems.

In this chapter, we present basic methods of generating random variables and simulating probabilistic systems. The provided algorithms are general and can be implemented in any computer language. However, to have concrete examples, we provide the actual code in R. If you are unfamiliar with R, you should still be able to understand the algorithms.

13.2 Introduction: What is R?

R is a programming language that helps engineers and scientists solve statistical problems with fewer lines of code than traditional programming languages, such as C/C++ or Java, by using built-in statistical functions. There are many built-in statistical functions and add-on packages available in R. It also has high-quality, customizable graphics capabilities. R is available for Unix/Linux, Windows, and Mac. Besides all these features, R is free!

13.3 Discrete and Continuous Random Number Generators

Most programming languages can deliver samples from the uniform distribution to us (in reality, the given values are pseudo-random rather than completely random).
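Because the values are pseudo-random, the stream produced by R's generator is reproducible when the seed is fixed. A small illustration (the seed value 123 is an arbitrary choice for this sketch):

set.seed(123)   # fix the seed of the pseudo-random number generator
runif(3)        # three values in (0, 1)
set.seed(123)   # resetting the seed ...
runif(3)        # ... reproduces exactly the same three values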

The rest of this section shows how to convert uniform random variables to any other desired random variable. The R code for generating uniform random variables is:

U = runif(n, min = 0, max = 1)

which returns n pseudo-random values drawn from the standard uniform distribution on the open interval (0, 1).

13.3.1 Generating Discrete Probability Distributions from the Uniform Distribution

Let's see a few examples of generating certain simple distributions.

Example 1. (Bernoulli) Simulate tossing a coin with probability of heads p.

Solution: Let U be a Uniform(0,1) random variable. We can write a Bernoulli random variable X as

X = 1 if U < p,
X = 0 if U ≥ p.

Thus,

P(H) = P(X = 1) = P(U < p) = p.

Therefore, X has a Bernoulli(p) distribution. The R code for Bernoulli(0.5) is:

p = 0.5;
U = runif(1, min = 0, max = 1);
X = (U < p);

Since the "runif(1, min = 0, max = 1)" command returns a number between 0 and 1, we divided the interval [0, 1] into two parts, of lengths p and 1 - p. The value of X is then determined by the part into which the generated uniform number falls.
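A quick sanity check of this generator is to draw many such values and verify that the sample mean is close to p (the sample size 100000 below is an arbitrary choice):

p = 0.5;
U = runif(100000, min = 0, max = 1);
X = (U < p);
mean(X)   # proportion of ones; should be close to p by the law of large numbers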

Example 2. (Coin Toss Simulation) Write code to simulate tossing a fair coin and to see how the law of large numbers works.

Solution: You can write:

n = 1000;
U = runif(n, min = 0, max = 1);
toss = U < 0.5;
a = numeric(n + 1);
avg = numeric(n);
for (i in 2:(n + 1)) {
  a[i] = a[i - 1] + toss[i - 1];
  avg[i - 1] = a[i] / (i - 1);
}
plot(1:n, avg[1:n], type = "l", lwd = 5, col = "blue", ylab = "Proportion of Heads",
  xlab = "Coin Toss Number", cex.main = 1.25, cex.lab = 1.5, cex.axis = 1.75)

If you run the above code to compute the proportion of ones in the variable "toss," the result will look like Figure 13.1. You can also simulate a biased coin with probability of heads equal to 0.6 by replacing the third line of the previous code with:

toss = U < 0.6;

Figure 13.1: R - coin toss simulation
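The same running proportion can be computed without the explicit loop; one possible vectorized sketch using cumsum is:

n = 1000;
toss = runif(n) < 0.5;
avg = cumsum(toss) / (1:n);   # proportion of heads after each toss
plot(1:n, avg, type = "l", col = "blue",
  xlab = "Coin Toss Number", ylab = "Proportion of Heads")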

Example 3. (Binomial) Generate a Binomial(50, 0.2) random variable.

Solution: To solve this problem, we can use the following lemma:

Lemma 1. If X1, X2, ..., Xn are independent Bernoulli(p) random variables, then the random variable X defined by X = X1 + X2 + ... + Xn has a Binomial(n, p) distribution.

To generate a random variable X ~ Binomial(n, p), we can toss a coin n times and count the number of heads. Counting the number of heads is exactly the same as finding X1 + X2 + ... + Xn, where each Xi is equal to one if the corresponding coin toss results in heads and zero otherwise. Since we know how to generate Bernoulli random variables, we can generate a Binomial(n, p) random variable by adding n independent Bernoulli(p) random variables.

p = 0.2;
n = 50;
U = runif(n, min = 0, max = 1);
X = sum(U < p);

Generating Arbitrary Discrete Distributions

In general, we can generate any discrete random variable in a way similar to the above examples using the following algorithm. Suppose we would like to simulate a discrete random variable X with range RX = {x0, x1, x2, ...} and P(X = xj) = pj, where the pj sum to 1.

To achieve this, first we generate a random number U (i.e., U ~ Uniform(0,1)). Next, we divide the interval [0, 1] into subintervals such that the jth subinterval has length pj (Figure 13.2). We set

X = x0 if U < p0,
X = x1 if p0 ≤ U < p0 + p1,
...
X = xj if p0 + ... + p(j-1) ≤ U < p0 + ... + pj,
...

In other words,

X = xj if F(x(j-1)) ≤ U < F(xj),

where F(x) is the desired CDF. We then have

P(X = xj) = P( p0 + ... + p(j-1) ≤ U < p0 + ... + pj ) = pj.

Figure 13.2: Generating discrete random variables
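One compact way to carry out this algorithm in R is to compare U with the cumulative sums of the pj; R's built-in sample() function can also draw from an arbitrary finite distribution directly. A sketch with arbitrary example values:

x = c(0, 1, 2);            # possible values of X (example values)
p = c(0.2, 0.5, 0.3);      # their probabilities (example values)
U = runif(1);
X = x[findInterval(U, cumsum(p)) + 1];   # index of the subinterval containing U
X
X2 = sample(x, size = 1, prob = p);      # built-in alternative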

Example 4. Give an algorithm to simulate the value of a random variable X such that

P(X = 1) = 0.35
P(X = 2) = 0.15
P(X = 3) = 0.4
P(X = 4) = 0.1

Solution: We divide the interval [0, 1] into subintervals as follows:

A0 = [0, 0.35)
A1 = [0.35, 0.5)
A2 = [0.5, 0.9)
A3 = [0.9, 1)

Subinterval Ai has length pi. We obtain a uniform number U. If U belongs to Ai, then X = xi, and

P(X = xi) = P(U ∈ Ai) = pi.

P = c(0.35, 0.5, 0.9, 1);
X = c(1, 2, 3, 4);
counter = 1;
r = runif(1, min = 0, max = 1);
while (r > P[counter]) {
  counter = counter + 1;
}
X[counter]
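To check that this sampler matches the target distribution, one can wrap it in a function, draw many values, and compare the observed relative frequencies with (0.35, 0.15, 0.4, 0.1). A rough sketch (the function name sample_X and the 10000 repetitions are arbitrary choices):

sample_X = function() {
  P = c(0.35, 0.5, 0.9, 1);
  X = c(1, 2, 3, 4);
  counter = 1;
  r = runif(1);
  while (r > P[counter]) counter = counter + 1;
  X[counter]
}
draws = replicate(10000, sample_X());
table(draws) / 10000   # relative frequencies; should be close to 0.35, 0.15, 0.4, 0.1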

13.3.2 Generating Continuous Probability Distributions from the Uniform Distribution - Inverse Transformation Method

At least in principle, there is a way to convert a uniform distribution to any other distribution. Let's see how we can do this. Let U ~ Uniform(0,1) and F be a CDF. Also, assume F is continuous and strictly increasing as a function.

Theorem 1. Let U ~ Uniform(0,1) and F be a CDF which is strictly increasing. Also, consider a random variable X defined as

X = F^(-1)(U).

Then X ~ F (the CDF of X is F).

Proof:

P(X ≤ x) = P(F^(-1)(U) ≤ x)
         = P(U ≤ F(x))     (since F is an increasing function)
         = F(x).

Now, let's see some examples. Note that to generate any continuous random variable X with the continuous CDF F, F^(-1)(U) has to be computed.

Example 5. (Exponential) Generate an Exponential(1) random variable.

Solution: To generate an Exponential random variable with parameter λ = 1, we proceed as follows:

F(x) = 1 - e^(-x),  x > 0
U ~ Uniform(0,1)
X = F^(-1)(U) = -ln(1 - U)
X ~ F

This formula can be simplified since 1 - U is itself Uniform(0,1) (Figure 13.3).

Figure 13.3: Symmetry of Uniform

Hence we can simulate X using

X = -ln(U)

U = runif(1, min = 0, max = 1);
X = -log(U)
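A brief empirical check of this generator: the sample mean of many such draws should be close to 1/λ = 1, and the draws should behave like those from R's built-in rexp(). A sketch (the sample size 10000 is arbitrary):

U = runif(10000, min = 0, max = 1);
X = -log(U);
mean(X)                        # should be close to 1, the mean of Exponential(1)
mean(rexp(10000, rate = 1))    # built-in generator, for comparison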

Example 6. (Gamma) Generate a Gamma(20, 1) random variable.

Solution: For this example, F^(-1) is even more complicated than the complicated gamma CDF F itself. Instead of inverting the CDF, we generate a gamma random variable as a sum of n independent exponential random variables.

Theorem 2. Let X1, X2, ..., Xn be independent random variables with Xi ~ Exponential(λ). Define

Y = X1 + X2 + ... + Xn.

By the moment generating function method, you can show that Y has a gamma distribution with parameters n and λ, i.e., Y ~ Gamma(n, λ).

Having this theorem in mind, we can write:

n = 20;
lambda = 1;
X = (-1 / lambda) * sum(log(runif(n, min = 0, max = 1)));
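Repeating this construction many times gives draws whose sample mean should be close to n/λ = 20, matching R's built-in rgamma(). A sketch (the function name gamma_draw and the 10000 repetitions are arbitrary choices):

gamma_draw = function(n, lambda) (-1 / lambda) * sum(log(runif(n)));
Y = replicate(10000, gamma_draw(20, 1));
mean(Y)                                    # should be close to n / lambda = 20
mean(rgamma(10000, shape = 20, rate = 1))  # built-in generator, for comparison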

Example 7. (Poisson) Generate a Poisson random variable. Hint: In this example, use the fact that the number of events in the interval [0, t] has a Poisson distribution when the elapsed times between the events are Exponential.

Solution: We want to employ the definition of Poisson processes. Assume N represents the number of events (arrivals) in [0, t]. If the interarrival times are distributed exponentially (with parameter λ) and independently, then the number of arrivals in [0, t], N, has a Poisson distribution with parameter λt (Figure 13.4). Therefore, to solve this problem, we can keep generating Exponential(λ) random variables while their sum is not larger than 1 (choosing t = 1). More specifically, we generate Exponential(λ) random variables

Ti = -(1/λ) ln(Ui)

by first generating uniform random variables Ui. Then we define

X = max{ j : T1 + ... + Tj ≤ 1 }.

The algorithm can be simplified:

X = max{ j : -(1/λ) ln(U1 U2 ... Uj) ≤ 1 }.

Lambda = 2;
i = 0;
U = runif(1, min = 0, max = 1);
Y = -(1 / Lambda) * log(U);
sum = Y;
while (sum < 1) {
  U = runif(1, min = 0, max = 1);
  Y = -(1 / Lambda) * log(U);
  sum = sum + Y;
  i = i + 1;
}
X = i

Figure 13.4: Poisson random variable (X is the maximum number of exponential interarrival times whose sum fits in [0, 1])

To finish this section, let's see how to convert uniform numbers to normal random variables. The normal distribution is extremely important in science because it occurs very commonly.

Theorem 3. (Box-Muller transformation) We can generate a pair of independent normal variables (Z1, Z2) by transforming a pair of independent Uniform(0,1) random variables (U1, U2) [1]:

Z1 = sqrt(-2 ln U1) cos(2π U2)
Z2 = sqrt(-2 ln U1) sin(2π U2)

Example 8. (Box-Muller) Generate 5000 pairs of normal random variables and plot both histograms.

Solution: We display the pairs in matrix form.

n = 5000;
U1 = runif(n, min = 0, max = 1)
U2 = runif(n, min = 0, max = 1)
Z1 = sqrt(-2 * log(U1)) * cos(2 * pi * U2)
Z2 = sqrt(-2 * log(U1)) * sin(2 * pi * U2)
hist(Z1, col = "wheat", labels = TRUE)

Figure 13.5: Histogram of Z1, a normal random variable generated by the Box-Muller transformation

13.4 R Commands for Special Distributions

In this section, we will see some useful commands for commonly employed distributions. Functions whose names start with "p", "q", "d", and "r" give the cumulative distribution function (CDF), the inverse CDF (quantile function), the density function (PDF), and random draws from the specified distribution, respectively.
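As a short illustration of this naming convention for the normal distribution (the particular argument values below are arbitrary):

pnorm(1.96)    # CDF: P(Z <= 1.96), about 0.975
qnorm(0.975)   # inverse CDF (quantile): about 1.96
dnorm(0)       # PDF at 0: about 0.399
rnorm(5)       # five random draws from N(0, 1)

The table below collects these commands for several common distributions.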

Distributions              Commands
Binomial(n, p)             pbinom    qbinom    dbinom    rbinom
Geometric(p)               pgeom     qgeom     dgeom     rgeom
Negative Binomial(r, p)    pnbinom   qnbinom   dnbinom   rnbinom
Poisson(λ)                 ppois     qpois     dpois     rpois
Uniform(a, b)              punif     qunif     dunif     runif
Exponential(λ)             pexp      qexp      dexp      rexp
Normal(µ, σ)               pnorm     qnorm     dnorm     rnorm
Gamma(n, λ)                pgamma    qgamma    dgamma    rgamma
Student t                  pt        qt        dt        rt

13.5 Exercises

1. Write R programs to generate Geometric(p) and Negative Binomial(i, p) random variables.

Solution: To generate a Geometric random variable, we run a loop of Bernoulli trials until the first success occurs. K counts the number of failures plus the one success, which is equal to the total number of trials.

K = 1;
p = 0.2;
while (runif(1) > p) {
  K = K + 1;
}
K
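Note that R's built-in rgeom() counts only the failures before the first success, so it is smaller by one than K above, which counts total trials. A quick check of the generator's mean, which should be 1/p = 5 (the 10000 repetitions are an arbitrary choice):

geom_draw = function(p) {
  K = 1;
  while (runif(1) > p) K = K + 1;
  K
}
draws = replicate(10000, geom_draw(0.2));
mean(draws)                     # should be close to 1 / 0.2 = 5
mean(rgeom(10000, 0.2)) + 1     # rgeom counts failures only, hence the + 1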

Now, we can run the Geometric generator repeatedly to obtain a Negative Binomial(i, p) random variable as a sum of i independent Geometric(p) random variables. For example, with r = 2 required successes:

K = 1;
p = 0.2;
r = 2;
success = 0;
while (success < r) {
  if (runif(1) > p) {
    K = K + 1;
    print(0)    # failure
  } else {
    success = success + 1;
    print(1)    # success
  }
}
K + r - 1       # number of trials needed to obtain r successes

2. (Poisson) Use the algorithm for generating discrete random variables to obtain a Poisson random variable with parameter λ = 2.

Solution: We know a Poisson random variable takes all nonnegative integer values with probabilities

pi = P(X = xi) = e^(-λ) λ^i / i!    for i = 0, 1, 2, ...

To generate a Poisson(λ), first we generate a random number U. Next, we divide the interval [0, 1] into subintervals such that the jth subinterval has length pj (Figure 13.2), and set X = xj if p0 + ... + p(j-1) ≤ U < p0 + ... + pj, exactly as in Section 13.3.1. Here xi = i, so

X = i  if  p0 + ... + p(i-1) ≤ U < p0 + ... + p(i-1) + pi,

that is,

X = i  if  F(i - 1) ≤ U < F(i),

where F is the CDF of X.

lambda = 2;
i = 0;
U = runif(1);
cdf = exp(-lambda);
while (U >= cdf) {
  i = i + 1;
  cdf = cdf + exp(-lambda) * lambda^i / gamma(i + 1);
}
X = i;

3. Explain how to generate a random variable with the density f(x) = 2.5 x √x for 0 < x < 1 if your random number generator produces a Standard Uniform random variable U. Hint: use the inverse transformation method.

Solution: The CDF is F_X(x) = x^(5/2) for 0 < x < 1. Setting X^(5/2) = U gives

X = U^(2/5).

U = runif(1);
X = U^(2/5);

We have the desired distribution.

4. Use the inverse transformation method to generate a random variable having distribution function

F(x) = (x^2 + x) / 2,    0 ≤ x ≤ 1.

Solution: Setting F(X) = U and solving for X:

(X^2 + X) / 2 = U
(X + 1/2)^2 = 2U + 1/4
X = sqrt(2U + 1/4) - 1/2     (taking the positive root, since X, U ∈ [0, 1])

By generating a random number U, we have the desired distribution:

U = runif(1);
X = sqrt(2 * U + 1/4) - 1/2;
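An optional numerical check of exercise 4: simulate many such draws and compare their empirical CDF with F(x) = (x^2 + x)/2 at a few points (the sample size and evaluation points below are arbitrary choices):

U = runif(10000);
X = sqrt(2 * U + 1/4) - 1/2;
Fhat = ecdf(X);                  # empirical CDF of the simulated draws
x = c(0.25, 0.5, 0.75);
rbind(empirical = Fhat(x), theoretical = (x^2 + x) / 2)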

5. Let X have a standard Cauchy distribution,

F_X(x) = (1/π) arctan(x) + 1/2.

Assuming you have U ~ Uniform(0,1), explain how to generate X. Then, use this result to produce 1000 samples of X and compute the sample mean. Repeat the experiment 100 times. What do you observe and why?

Solution: Using the Inverse Transformation Method:

U = (1/π) arctan(X) + 1/2
arctan(X) = π (U - 1/2)
X = tan(π (U - 1/2))

Next, here is the R code:

U = numeric(1000);
n = 100;
average = numeric(n);
for (i in 1:n) {
  U = runif(1000);
  X = tan(pi * (U - 0.5));
  average[i] = mean(X);
}
plot(1:n, average[1:n], type = "l", lwd = 2, col = "blue")

The sample means do not settle down as the experiment is repeated, because the Cauchy distribution has no mean (Figure 13.6), nor any higher moments.

6. (The Rejection Method) When we use the Inverse Transformation Method, we need a simple form of the c