duckladydinh's blog

By duckladydinh, history, 3 years ago, In English

Hi,

Recently I came across a problem like this:

Given h[1..n], where h[i] is the height of column i, find the largest x such that there is some i with h[j] >= x for all j in [i, i + x - 1] (in other words, the side of the largest square that fits under the histogram).

The best I could come up with is O(n log n): binary search on x and do a linear check for each candidate. However, this problem reminds me of finding the largest rectangle in a histogram, which can be done in O(n). So does a linear solution exist for this problem too?
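For concreteness, here is a minimal sketch of the O(n log n) approach described above; it is my own illustration, not code from the post. It assumes 0-indexed heights, and the linear check uses a sliding-window minimum (monotonic deque). The names `feasible` and `largestSquare` are mine.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Check: does some window of x consecutive columns have min height >= x?
// Sliding-window minimum with a monotonic deque, O(n).
bool feasible(const vector<long long>& h, int x) {
    deque<int> dq;  // indices of h, heights increasing from front to back
    for (int i = 0; i < (int)h.size(); i++) {
        while (!dq.empty() && h[dq.back()] >= h[i]) dq.pop_back();
        dq.push_back(i);
        if (dq.front() <= i - x) dq.pop_front();  // index left the window
        if (i >= x - 1 && h[dq.front()] >= x) return true;
    }
    return false;
}

int largestSquare(const vector<long long>& h) {
    // Feasibility is monotone in x: any sub-window of a feasible window
    // of length x still has min >= x > x - 1, so binary search applies.
    int lo = 0, hi = (int)h.size();
    while (lo < hi) {
        int mid = (lo + hi + 1) / 2;
        if (feasible(h, mid)) lo = mid;
        else hi = mid - 1;
    }
    return lo;  // 0 if no square fits at all
}
```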

Thanks



»
3 years ago, # |

It's a lot like the largest rectangle. Use a stack of minimums to find, for every position i, the closest position to the left and to the right whose value is less than h[i]. That gives you a maximal rectangle between those two positions with height h[i], and the side of the best square inside it is the minimum of the rectangle's width and height.
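A minimal sketch of this O(n) idea, again assuming 0-indexed input (the function and variable names are mine):

```cpp
#include <bits/stdc++.h>
using namespace std;

// For each i, l[i] = nearest index to the left with a smaller height (or -1)
// and r[i] = nearest index to the right with a smaller height (or n), found
// with a monotonic stack. Then h[i] is a minimum on (l[i], r[i]), and the
// best square inside that maximal rectangle has side min(width, h[i]).
long long largestSquare(const vector<long long>& h) {
    int n = h.size();
    vector<int> l(n), r(n);
    stack<int> st;
    for (int i = 0; i < n; i++) {       // nearest smaller to the left
        while (!st.empty() && h[st.top()] >= h[i]) st.pop();
        l[i] = st.empty() ? -1 : st.top();
        st.push(i);
    }
    while (!st.empty()) st.pop();
    for (int i = n - 1; i >= 0; i--) {  // nearest smaller to the right
        while (!st.empty() && h[st.top()] >= h[i]) st.pop();
        r[i] = st.empty() ? n : st.top();
        st.push(i);
    }
    long long best = 0;
    for (int i = 0; i < n; i++)         // width of the maximal rectangle at i
        best = max(best, min<long long>(r[i] - l[i] - 1, h[i]));
    return best;  // side of the largest square under the histogram
}
```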

  • »
    »
    3 years ago, # ^ |

    Oh, thanks for your reply. I honestly didn't expect it to work, because the rectangle solution relies on the assumption that the height is exactly that of some column, which isn't the case here. It does seem to pass all the tests, so it really is correct, but I still don't understand why. Can you explain a bit more?

    • »
      »
      »
      3 years ago, # ^ |

      Let's look at the biggest square and the segment of columns it stands on. Take the column with the minimum height on that segment: its maximal rectangle contains the whole segment, so its width is at least the square's side, and its height (that minimum) is also at least the side. Hence min(width, height) of that rectangle is at least the answer, and it is enough to take all these rectangles and fit the largest square inside each of them.
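      To make the argument concrete, here is a hypothetical O(n^2) brute force that checks every segment directly; by the argument above it must agree with the O(n) stack solution sketched earlier, which is an easy way to stress-test it on random inputs.

      ```cpp
      #include <bits/stdc++.h>
      using namespace std;

      // Reference answer: for every segment [i..j], the best square on it
      // has side min(j - i + 1, min(h[i..j])); take the maximum overall.
      long long bruteForce(const vector<long long>& h) {
          int n = h.size();
          long long best = 0;
          for (int i = 0; i < n; i++) {
              long long mn = LLONG_MAX;
              for (int j = i; j < n; j++) {
                  mn = min(mn, h[j]);  // min height on [i..j]
                  best = max(best, min<long long>(j - i + 1, mn));
              }
          }
          return best;
      }
      ```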