
By rasen__shuriken, 5 years ago, In English

Problem: UVa 12585

Let's say Alice has $a and Bob has $b, and let c = min(a, b). In every round they both bet $c, and whoever wins the round takes the $c the other player bet. The game stops when Alice or Bob has no money left to play. We have to find the expected number of rounds that will be played, for given a and b.
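
For concreteness, here is a small Monte Carlo sketch of the game exactly as described above, useful as ground truth for small inputs (function and variable names are my own, not from the problem):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Monte Carlo estimate of the expected number of rounds.
// Each round, c = min(a, b) changes hands with probability 1/2 each way,
// and the game ends when one player reaches $0.
double simulateExpectedRounds(long long a0, long long b0,
                              int trials = 1'000'000) {
    mt19937_64 rng(12345);
    long long totalRounds = 0;
    for (int t = 0; t < trials; ++t) {
        long long a = a0, b = b0;
        while (a > 0 && b > 0) {
            long long c = min(a, b);
            if (rng() & 1) { a += c; b -= c; }  // Alice wins this round
            else           { a -= c; b += c; }  // Bob wins this round
            ++totalRounds;
        }
    }
    return double(totalRounds) / trials;
}

int main() {
    printf("a=1, b=1: ~%.4f rounds\n", simulateExpectedRounds(1, 1)); // exactly 1
    printf("a=1, b=2: ~%.4f rounds\n", simulateExpectedRounds(1, 2)); // finite despite the cycle
}
```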

My idea was to compute DP[a][b] = 0.5*(DP[a+c][b-c] + 1) + 0.5*(DP[a-c][b+c] + 1), with DP[x][0] = DP[0][y] = 0 as the base case. But as we can see, if a != b there will always be a cycle: for example, (1, 2) can go to (2, 1), which can go back to (1, 2), so plain memoized recursion never terminates. I need help eliminating this cycle.
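
For what it's worth, one standard way to handle cycles in an expectation DP is fixed-point (value) iteration: start from E = 0 everywhere and repeatedly re-apply the recurrence. A minimal sketch of that idea, using the fact that the total s = a + b never changes so the state can just be Alice's amount (all names are mine; this O(s)-state approach is only viable for small a + b, so I'm assuming the real judge limits require something smarter, but it can at least verify a closed form):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Value iteration on the same recurrence, written over one dimension:
// the total s = a + b never changes, so the state is just Alice's money x,
// with E[0] = E[s] = 0 (terminal) and
//   E[x] = 1 + 0.5 * E[x + c] + 0.5 * E[x - c],  where c = min(x, s - x).
// Each sweep halves the influence of the old values (a contraction with
// factor 1/2), so the iteration converges geometrically even though the
// state graph contains cycles such as 1 -> 2 -> 1 for s = 3.
double expectedRounds(long long a, long long b) {
    long long s = a + b;
    vector<double> E(s + 1, 0.0);
    for (int sweep = 0; sweep < 200; ++sweep) {  // error shrinks like 2^-sweeps
        vector<double> nxt(s + 1, 0.0);          // x = 0 and x = s stay 0
        for (long long x = 1; x < s; ++x) {
            long long c = min(x, s - x);
            nxt[x] = 1.0 + 0.5 * E[x + c] + 0.5 * E[x - c];
        }
        E = nxt;
    }
    return E[a];
}

int main() {
    printf("%.6f\n", expectedRounds(1, 1)); // 1.000000
    printf("%.6f\n", expectedRounds(1, 2)); // 2.000000
}
```

Note also why the expectation is finite at all: whenever a != b, one of the two outcomes bankrupts the poorer player, so every round ends the game with probability at least 1/2. Unrolling the recurrence around a cycle therefore gives a convergent geometric series, which is one way to eliminate the cycle by hand.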

TIA
