
By rasen__shuriken, history, 5 years ago, In English

Problem: UVa 12585

Let's say Alice has $a and Bob has $b, and let c = min(a, b). In every round they bet $c, and whoever wins the round takes $c from the other. The game stops when Alice or Bob has no money left to play. We have to find the expected number of rounds that will be played for given a and b.

My idea was to compute the answer with the recurrence DP[a][b] = 0.5 * (DP[a+c][b-c] + 1) + 0.5 * (DP[a-c][b+c] + 1). But, as we can see, when a != b the states can form a cycle (for example, (1, 2) -> (2, 1) -> (1, 2)), so the recursion never terminates. I need help eliminating this cycle.
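This is not a verified solution to UVa 12585, just a sketch of one way the cycle could be handled, following directly from the recurrence above. Since the bet is c = min(a, b), the poorer player drops to 0 whenever they lose, so from any state with a != b only one successor keeps the game alive, which collapses the recurrence to E(a, b) = 1 + 0.5 * E(next(a, b)), with E(a, a) = 1 and E(a, 0) = E(0, b) = 0. Following that single-successor chain either reaches a state with a == b or revisits an earlier state; in the second case, going around the closed cycle once gives one linear equation in one unknown, which can be solved directly instead of recursing forever. The function name expectedRounds is made up for illustration.

```cpp
#include <algorithm>
#include <cstdio>
#include <map>
#include <utility>
#include <vector>

// Sketch only: E(a, b) is the expected number of rounds from state (a, b).
// Base cases: E(a, 0) = E(0, b) = 0, and E(a, a) = 1 (the loser drops to 0).
// For a != b the losing branch of the poorer player is always terminal, so
//     E(a, b) = 1 + 0.5 * E(next(a, b)),
// where next(a, b) is the single state reached when the poorer player wins.
double expectedRounds(long long a, long long b) {
    if (a == 0 || b == 0) return 0.0;

    std::map<std::pair<long long, long long>, int> seenAt;  // state -> position in chain
    std::vector<std::pair<long long, long long>> chain;     // states visited so far
    double tail;  // E of the state where the chain stops

    while (true) {
        if (a == b) { tail = 1.0; break; }  // the next round certainly ends the game
        auto it = seenAt.find({a, b});
        if (it != seenAt.end()) {
            // We closed a cycle of length L starting at position it->second.
            // Going around it once: E = (1 + 0.5 + ... + 0.5^(L-1)) + 0.5^L * E,
            // a single linear equation in E that we solve directly.
            int start = it->second;
            int L = (int)chain.size() - start;
            double S = 0.0, p = 1.0;
            for (int i = 0; i < L; ++i) { S += p; p *= 0.5; }  // p ends up as 0.5^L
            tail = S / (1.0 - p);
            chain.resize(start);  // keep only the states before the cycle
            break;
        }
        seenAt[{a, b}] = (int)chain.size();
        chain.push_back({a, b});
        long long c = std::min(a, b);
        // follow the only branch in which the game continues
        if (a < b) { a += c; b -= c; } else { a -= c; b += c; }
    }

    // Unwind the remaining chain with E(s_i) = 1 + 0.5 * E(s_{i+1}).
    double E = tail;
    for (int i = (int)chain.size() - 1; i >= 0; --i) E = 1.0 + 0.5 * E;
    return E;
}

int main() {
    std::printf("%.6f\n", expectedRounds(2, 2));  // 1.000000
    std::printf("%.6f\n", expectedRounds(1, 2));  // 2.000000, the (1,2) -> (2,1) cycle
    std::printf("%.6f\n", expectedRounds(1, 3));  // 1.500000, reaches (2,2) and stops
    return 0;
}
```

Since a + b never changes, the chain visits only a bounded number of states, so this terminates quickly. A more general alternative for the original two-branch recurrence would be to write the linear equations over all reachable states and solve them with Gaussian elimination, but that seems unnecessary here because only one branch ever continues the game.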

TIA

