Off-topic Talk: Where overpaid, underworked S2000 owners waste the worst part of their days before the drive home. This forum is for general chit-chat and discussions not covered by the other off-topic forums.

Can anyone help me work with Markov Chains (Stats)

#1 | brent_strong (Registered User, Thread Starter) | 01-25-2006, 06:07 PM

I've noticed there are some brilliant mathematicians on here, so hopefully someone who knows a bit of statistics can help me out.

First off, I'm not just trying to get answers. I understand the basics of what a Markov chain is, what the initial distribution and the transition probabilities are, and what both mean. My question is how to manipulate these matrices to determine probabilities.

I have looked online and am continuing to do so to try to find a tutorial or an example that will help. My professor for the class is more or less useless and unfortunately I can't make it to any office hours this week.

First, I need to find conditional probabilities: P(X2 = a | X1 = c) and things like that, including two- and three-step probabilities like P(X23 = a | X20 = c). I think for these, I just need to use the transition matrix for one-step transitions, like the first problem I mentioned, and then raise the matrix to the Nth power for a transition of N steps. Right?

I may need to take into account the starting distribution here though...

I'll keep on going if anyone here has an idea of what I'm talking about.

Thanks!

#2 | magician (Moderator) | 01-25-2006, 07:14 PM

Originally Posted by brent_strong, Jan 25 2006, 07:07 PM
First, I need to find conditional probabilities: P(X2 = a | X1 = c) and things like that, including two- and three-step probabilities like P(X23 = a | X20 = c). I think for these, I just need to use the transition matrix for one-step transitions, like the first problem I mentioned, and then raise the matrix to the Nth power for a transition of N steps. Right?
Right.

Seriously. That is how you compute n-step probabilities. If you think about the definition of matrix multiplication, you can see this easily for the 2-step case: x1 can get to x2 in two steps via any of k routes, one for each possible intermediate state:
x1 -> x1 -> x2
x1 -> x2 -> x2
x1 -> x3 -> x2
.
.
.
x1 -> xk -> x2

The corresponding probabilities are:

p11 * p12
p12 * p22
p13 * p32
.
.
.
p1k * pk2

These routes are mutually exclusive, so the total probability is the sum:

p11 * p12 + p12 * p22 + p13 * p32 + . . . + p1k * pk2

which is the (1,2)-entry of M^2.
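
If you want to sanity-check that numerically, here's a quick sketch in Python/numpy. The 3-state transition matrix below is made up purely for illustration; it's not from your problem.

```python
import numpy as np

# Hypothetical 3-state transition matrix; each row sums to 1.
# M[i, j] = P(next state is j | current state is i), with 0-based indices.
M = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# Sum of the route probabilities listed above for x1 -> (anything) -> x2
route_sum = sum(M[0, k] * M[k, 1] for k in range(3))

# The (1,2)-entry of M^2 -- indices (0, 1) here -- matches that sum
M2 = np.linalg.matrix_power(M, 2)
print(route_sum, M2[0, 1])

# For a 3-step probability like P(X23 = a | X20 = c), raise M to the 3rd power
M3 = np.linalg.matrix_power(M, 3)
```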

#3 | brent_strong (Registered User, Thread Starter) | 01-25-2006, 07:21 PM

Thanks! That is what was explained in a chapter I have on Markov chains, but they were also talking about taking the initial distribution into account. I was a bit confused as to whether it affected the result or not.

The matrix multiplication explanation does make sense, though; it's actually quite elegant how simply raising the matrix to a power accounts for all those possibilities.

#4 | brent_strong (Registered User, Thread Starter) | 01-25-2006, 07:27 PM

Actually, with the initial distribution not being uniform, shouldn't that be taken into account? Since at time 0 you would have, say, a (.2, .3, .3, .2) chance of being in states 1, 2, 3, 4, you would need to multiply that by the transition probabilities to get the probability of each state at time 1.

I think I'm getting this more and more just by typing it out and having to think through it to present it semi-coherently.
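
Something like this, I think (the 4x4 transition matrix here is just made up so I can see the arithmetic; the initial distribution is treated as a row vector multiplied on the left):

```python
import numpy as np

# The (.2, .3, .3, .2) initial distribution over states 1-4
pi0 = np.array([0.2, 0.3, 0.3, 0.2])

# Hypothetical 4-state transition matrix (each row sums to 1)
P = np.array([
    [0.7, 0.1, 0.1, 0.1],
    [0.2, 0.5, 0.2, 0.1],
    [0.1, 0.3, 0.4, 0.2],
    [0.3, 0.2, 0.2, 0.3],
])

# Probability of each state at time 1: initial distribution times the matrix
pi1 = pi0 @ P
print(pi1, pi1.sum())  # the result still sums to 1

# More generally, the distribution at time n is pi0 @ P^n
pi3 = pi0 @ np.linalg.matrix_power(P, 3)
```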

#5 | magician (Moderator) | 01-25-2006, 07:40 PM

If you want to know the probability of being in a particular state at a particular step, the initial distribution is relevant. If you want to know the probability of getting from one state to another in n steps, the initial distribution is irrelevant.
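
To make the distinction concrete, a small sketch (again with a made-up chain):

```python
import numpy as np

# Made-up 2-state chain and initial distribution
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
pi0 = np.array([0.3, 0.7])

n = 3
Pn = np.linalg.matrix_power(P, n)

# Getting from state 0 to state 1 in n steps: an entry of P^n,
# and the initial distribution never enters into it.
print(Pn[0, 1])

# Being in state 1 at step n: here the initial distribution does matter.
print((pi0 @ Pn)[1])
```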

#6 | brent_strong (Registered User, Thread Starter) | 01-25-2006, 07:59 PM

Ok, cool. That should hold me for the first couple problems! Thanks again.

#7 | magician (Moderator) | 01-25-2006, 08:09 PM

My pleasure.

#8 | JonBoy | 01-26-2006, 08:36 AM

Bill, I think you should run for President.

#9 | ImportSport (Registered User) | 01-26-2006, 09:28 AM

I would suggest approaching the graduate students in your department for help if you are unable to meet with the professor. Although they may lack teaching experience, grad students are typically close to the material and have a good sense of the level at which it needs to be presented for students to wrap their minds around it.

Don't be afraid to approach them. They are usually hiding in a dark corner in the basement of the building, but if you are polite to them, they should bend over backward to help you out.

#10 | magician (Moderator) | 01-26-2006, 09:35 AM

Originally Posted by JonBoy, Jan 26 2006, 09:36 AM
Bill, I think you should run for President.
Aww, you made me blush.

(He he he.)

