Markov Chains (273023), Exercise session 1, Tue 15 Jan 2013.

Exercise 1.1. Let us study a two-state Markov chain (X0, X1, . . . ) with state space Ω = {W, E}, where the probability of jumping from state W to state E is p (0 ≤ p ≤ 1) and the probability of jumping from state E to state W is q (0 ≤ q ≤ 1). Assuming that the system starts at state W, i.e. µ0 = (1, 0), so that P(X0 = W) = 1, show that if p + q = 1, then µk = µ1 for all k = 1, 2, . . . . Does µ1 depend on µ0?
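The claim can be checked numerically before proving it. A minimal sketch in Python (the course uses Mathematica; this translation, and the particular values p = 1/3, q = 2/3 with p + q = 1, are assumptions for illustration):

```python
from fractions import Fraction as F

def step(mu, P):
    """One step of the chain: mu_{k+1} = mu_k P (row vector times matrix)."""
    return tuple(sum(mu[i] * P[i][j] for i in range(len(mu)))
                 for j in range(len(P[0])))

# Illustrative values with p + q = 1; states ordered (W, E)
p, q = F(1, 3), F(2, 3)
P = [[1 - p, p],
     [q, 1 - q]]

mu = (F(1), F(0))           # mu_0 = (1, 0): start in state W
mu1 = step(mu, P)           # mu_1 = (q, p) when p + q = 1
for k in range(2, 12):
    mu = step(mu, P)
    assert mu == mu1        # mu_k = mu_1 for every k >= 1
```

When p + q = 1 both rows of P equal (q, p), so µ1 = µ0 P = (q, p) no matter what µ0 is; the loop above just confirms this for one choice of parameters.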

Exercise 1.2 (Adapted from Ross: Ex. 1 p. 263). Four white and four black balls are distributed in two urns in such a way that each contains four balls. We say that the system is in state i, i = 0, 1, 2, 3, 4, if the first urn contains i white balls. At each step, we draw one ball from each urn and place the ball drawn from the first urn into the second, and conversely with the ball from the second urn. Let Xn denote the state of the system after the nth step. Explain why (X0, X1, . . . ) is a Markov chain and find the transition probability matrix.
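The transition probabilities can be derived by conditioning on the colours drawn: from state i, the first urn holds i white and 4 − i black balls, and the second urn the complement. A hedged sketch in Python building the resulting matrix (the helper name `transition` is mine, not from the course material):

```python
from fractions import Fraction as F

N = 4  # balls per urn; state i = number of white balls in urn 1

def transition(i, j):
    """P(i, j) for one swap step, derived by conditioning on the draws."""
    if j == i - 1:                      # white leaves urn 1, black leaves urn 2
        return F(i, N) * F(i, N)
    if j == i + 1:                      # black leaves urn 1, white leaves urn 2
        return F(N - i, N) * F(N - i, N)
    if j == i:                          # the two drawn balls have matching colours
        return 2 * F(i, N) * F(N - i, N)
    return F(0)                         # |i - j| > 1 is impossible in one step

P = [[transition(i, j) for j in range(N + 1)] for i in range(N + 1)]
assert all(sum(row) == 1 for row in P)  # each row is a probability distribution
```

The chain is Markov because the next state depends only on the current urn contents, not on how they arose.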

Exercise 1.3. In the previous exercise, assume that all the white balls are in the first urn in the beginning. With the help of a computer program (e.g. Mathematica), find the probabilities P(X20 = 0) and P(X20 = 4). Find also the probability that there are both black and white balls in the first urn after 20 steps.

Hint: Modify the frog.nb file that is available on the course web page.
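In place of the Mathematica notebook, the same computation can be sketched in Python with exact rational arithmetic (an assumption on my part; the transition function is the one derived in Exercise 1.2):

```python
from fractions import Fraction as F

N = 4
def T(i, j):  # transition probabilities of the urn chain from Exercise 1.2
    if j == i - 1: return F(i * i, N * N)
    if j == i + 1: return F((N - i) ** 2, N * N)
    if j == i:     return F(2 * i * (N - i), N * N)
    return F(0)

P = [[T(i, j) for j in range(N + 1)] for i in range(N + 1)]

mu = [F(0)] * N + [F(1)]           # mu_0: all four white balls in urn 1 (state 4)
for _ in range(20):                 # iterate mu_{k+1} = mu_k P twenty times
    mu = [sum(mu[i] * P[i][j] for i in range(N + 1)) for j in range(N + 1)]

p0, p4 = mu[0], mu[4]               # P(X20 = 0) and P(X20 = 4)
p_mixed = 1 - p0 - p4               # both colours present in urn 1 after 20 steps
print(float(p0), float(p4), float(p_mixed))
```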

Exercise 1.4 (Ross: Ex. 5 p. 264). A Markov chain (Xn)n≥0 with state space Ω = {0, 1, 2} has the transition probability matrix

P =
    1/2  1/3  1/6
     0   1/3  2/3
    1/2   0   1/2

If µ0 = (1/4, 1/4, 1/2), find E[X3].
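E[X3] follows directly from µ3 = µ0 P^3 via E[X3] = Σ_j j · P(X3 = j). A short exact-arithmetic check in Python:

```python
from fractions import Fraction as F

P = [[F(1, 2), F(1, 3), F(1, 6)],
     [F(0),    F(1, 3), F(2, 3)],
     [F(1, 2), F(0),    F(1, 2)]]
mu = [F(1, 4), F(1, 4), F(1, 2)]       # mu_0

for _ in range(3):                      # mu_3 = mu_0 P^3
    mu = [sum(mu[i] * P[i][j] for i in range(3)) for j in range(3)]

E = sum(j * mu[j] for j in range(3))    # E[X3] = sum_j j * P(X3 = j)
print(E)                                # prints 53/54
```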

Exercise 1.5 (Levin, Peres, Wilmer: Ex. 1.7 p. 18). A transition matrix P is symmetric if P(x, y) = P(y, x) for all x, y ∈ Ω. Let n = |Ω| < ∞. Show that the uniform distribution π = (1/n, . . . , 1/n) is stationary, i.e. π = πP.
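The key observation is that symmetry makes the column sums equal the row sums, so every column of P also sums to 1. A quick numeric sanity check on one hypothetical symmetric matrix (my example, not from the exercise):

```python
from fractions import Fraction as F

# A hypothetical symmetric transition matrix on three states
P = [[F(1, 2), F(1, 4), F(1, 4)],
     [F(1, 4), F(1, 2), F(1, 4)],
     [F(1, 4), F(1, 4), F(1, 2)]]
assert all(P[x][y] == P[y][x] for x in range(3) for y in range(3))

n = len(P)
pi = [F(1, n)] * n                       # uniform distribution
piP = [sum(pi[x] * P[x][y] for x in range(n)) for y in range(n)]
assert piP == pi                         # pi = pi P: the uniform law is stationary
```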

Exercise 1.6 (Levin, Peres, Wilmer: Ex. 1.1 p. 18). Let P be the transition matrix of random walk on the n-cycle, where n is odd. Find the smallest value of t such that P^t(x, y) > 0 for all states x, y.
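For small odd n the answer can be found by brute force, which is a useful check on the general argument. A sketch in Python (the function name is mine); the printed values suggest the pattern t = n − 1:

```python
from fractions import Fraction as F

def min_positive_time(n):
    """Smallest t with P^t(x, y) > 0 for all x, y, for the walk on the n-cycle."""
    # P(i, j) = 1/2 when j is a neighbour of i on the cycle, else 0
    P = [[F(1, 2) if (j - i) % n in (1, n - 1) else F(0) for j in range(n)]
         for i in range(n)]
    Q, t = P, 1
    while not all(Q[i][j] > 0 for i in range(n) for j in range(n)):
        Q = [[sum(Q[i][k] * P[k][j] for k in range(n)) for j in range(n)]
             for i in range(n)]          # Q = P^(t+1)
        t += 1
    return t

for n in (3, 5, 7):
    print(n, min_positive_time(n))       # n=3 -> 2, n=5 -> 4, n=7 -> 6
```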
