Hailo's blog

By Hailo, 8 years ago, In English

Hi, I'm having trouble solving this problem. I was thinking of applying a Markov chain property (the expected length of the cycle path returning to a node is the inverse of the stationary probability of being at that node), so the problem reduces to a system of equations. However, the matrix is 10^5 x 10^5 and sparse (at most 10^5 non-zero entries). I don't know whether there is a Gaussian elimination method for this kind of matrix, or whether the approach itself is incorrect. Thanks.
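
In standard Markov-chain notation, the property I mean is the following (a sketch in my own notation, not a quote from any source):

    \pi P = \pi, \qquad \sum_v \pi_v = 1, \qquad \mathbb{E}_v[\tau_v^+] = \frac{1}{\pi_v}

where P is the transition matrix, \pi is the stationary distribution, and \tau_v^+ is the first return time to v. Solving \pi P = \pi together with the normalization constraint is exactly the sparse linear system I was referring to.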

  • Vote: I like it
  • +14
  • Vote: I do not like it

»
8 years ago, # |
  Vote: I like it 0 Vote: I do not like it

Omg, I just got accepted and I proved my solution, but I still don't believe it can be that simple. The code is extremely short, which may be a hint.
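
To illustrate how short it can be: if the problem is the classic simple random walk on a connected undirected graph (my guess based on the rest of the thread; the actual statement isn't quoted here), the stationary probability of a vertex v is deg(v)/(2m), so the expected return time is just 2m/deg(v). A minimal sketch under that assumption, with a made-up input format of n, m followed by m edges:

    import sys

    def main():
        # Hypothetical input: n m, then m undirected edges (1-indexed vertices).
        data = sys.stdin.read().split()
        n, m = int(data[0]), int(data[1])
        deg = [0] * (n + 1)
        idx = 2
        for _ in range(m):
            u, v = int(data[idx]), int(data[idx + 1])
            idx += 2
            deg[u] += 1
            deg[v] += 1
        # Random walk on an undirected graph: pi_v = deg(v) / (2m),
        # so the expected return time to v is 2m / deg(v).
        for v in range(1, n + 1):
            print(2 * m / deg[v])

    main()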

  • »
    »
    8 years ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    Still, why does that work? What are your starting ideas on proving the correctness of the solution?

    • »
      »
      »
      8 years ago, # ^ |
      Rev. 2   Vote: I like it +13 Vote: I do not like it

Spoilers. Revert to the previous revision of this comment to see them.

      • »
        »
        »
        »
        8 years ago, # ^ |
        Rev. 2   Vote: I like it +10 Vote: I do not like it

        1. Nice spoiler trick ;)
        2. This was basically the same path I went through while solving the problem, which is rather funny.
        3. The statement I was actually interested in was the first step. Is that a well-known fact? I have never seen it before (well, I don't have much experience with Markov chains either, which might explain it). Did you come up with a proof? If so, I'd like to know some insights or starting points. I'll think about it more thoroughly nonetheless.
        • »
          »
          »
          »
          »
          8 years ago, # ^ |
            Vote: I like it +10 Vote: I do not like it

          Yes, it is a rather well-known fact (I remember it being mentioned in my university probability course, but I have probably seen it somewhere else as well), and the OP also mentioned it. Imagine an infinite random walk. What can you say about the occurrences of v in that walk? How do they relate to the average time of coming back from v to v?
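
          To sketch the argument: in an infinite walk of length N, vertex v is visited about \pi_v N times (by the ergodic theorem for Markov chains), so the average gap between consecutive visits to v is about

              N / (\pi_v N) = 1 / \pi_v,

          and each such gap is exactly one return time from v to v. Hence the expected return time to v equals 1/\pi_v (this is Kac's formula).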

          • »
            »
            »
            »
            »
            »
            7 years ago, # ^ |
            Rev. 2   Vote: I like it +3 Vote: I do not like it

            Swistakk, can you please tell me why the answer for any vertex i is not 1/(w_i), where w is a vector such that w = wP and P is the transition matrix?

            Update: problem done.

»
8 years ago, # |
  Vote: I like it +6 Vote: I do not like it

It's a classic property of random walks on graphs. A stronger fact for any irreducible Markov chain is explained here: http://research.microsoft.com/en-us/um/people/peres/markovmixing.pdf, sections 1.4 and 1.5.
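
For anyone who wants to convince themselves numerically, here is a small self-contained sketch (my own illustration, not tied to the problem's input): it builds a small random irreducible chain, computes the stationary distribution by power iteration, and compares 1/\pi_v with the simulated mean return time to v.

    import random

    def stationary(P, iters=5000):
        # Power iteration: repeatedly push a distribution through the chain.
        n = len(P)
        pi = [1.0 / n] * n
        for _ in range(iters):
            nxt = [0.0] * n
            for i in range(n):
                for j in range(n):
                    nxt[j] += pi[i] * P[i][j]
            pi = nxt
        return pi

    def mean_return_time(P, v, trials=20000):
        # Simulate the chain starting at v and average the time to return to v.
        n = len(P)
        total = 0
        for _ in range(trials):
            cur, steps = v, 0
            while True:
                r, acc, nxt = random.random(), 0.0, n - 1
                for j in range(n):
                    acc += P[cur][j]
                    if r <= acc:
                        nxt = j
                        break
                cur = nxt
                steps += 1
                if cur == v:
                    break
            total += steps
        return total / trials

    random.seed(1)
    n = 4
    # A small random chain with all transition probabilities positive (irreducible).
    P = [[random.random() + 0.1 for _ in range(n)] for _ in range(n)]
    P = [[x / sum(row) for x in row] for row in P]

    pi = stationary(P)
    for v in range(n):
        print(v, 1.0 / pi[v], mean_return_time(P, v))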