### brdy's blog

By brdy, history, 14 months ago

In the recent SnackDown Round A qualifier, there was a problem I could not figure out.

From skimming the solutions and from making my own observations I have realized the following:

• Most solutions use some type of DP with two passes, storing DP1[i][j] and DP2[i][j], where i and j presumably mean row and column.
• For any given distance D, there are O(D) other nodes exactly D away in Manhattan distance.
• We can construct ans from dp1 and dp2, where ans[i] stores the number of pairs of nodes that are i away from each other in Manhattan distance.

The editorial is not out yet, as CodeChef editorials can be quite slow (some from 2016 are still pending).

Can anybody explain the solution here?

EDIT:

Possible solutions:

• Prefix sums over diagonals (the previously mentioned observation): dp1[][] and dp2[][] represent these sums, each dealing with diagonals in one direction.

• Rotate the matrix by 45 degrees. Now you can solve it with normal prefix sums (over columns and rows) or rectangle sums, because the distance becomes a "square away" (Chebyshev distance). One such conversion is mat[x][y] --> mat[x-y][x+y]. Note that this conversion is space inefficient: an NxN matrix needs to be defined as mat[-n...n][0...2n].

• FFT and mathy approaches .-.


 » 14 months ago, # |   +1 But the editorial is actually out? I think the best solution is to rotate the coordinates 45 degrees, so (X, Y) becomes (X+Y, X-Y). Now (old Manhattan distance) = (new Chebyshev distance); see the Wikipedia article on Chebyshev distance. So we can imagine the Chebyshev-distance range from a house as a square, keep expanding the square, and count the new houses in each expansion. We have O(N*M) houses, and we do O(N) (or O(M)) square expansions for each house. This gives O(N^3) or so.
•  » » 14 months ago, # ^ |   0 Sorry, I don't use CodeChef often (actually I never used it before this contest), so I'm not sure how their website is organized. Although I am quite shocked that a Google search for "snackdown a19 editorial" brought up nothing. Where do you find these? It's also not linked from the contest itself...
•  » » » 14 months ago, # ^ |   0 Just google the contest code + "editorial". "SNCK1A19 editorial" brings up the AVGMAT tutorial as the fourth result for me.
 » 14 months ago, # |   0 My solution: split the rectangle into 4 subrectangles, like this:

1 2
3 4

Now, for each pair of houses, there are 10 variants of which subrectangles they are in:

• pairs 1-1, 2-2, 3-3, 4-4: solve recursively;
• pairs 1-4 and 2-3: any path that begins in 1 and ends in 4 goes through some center cell, so calculate for each distance how many houses in 1 and in 4 have this distance to the center cell, then multiply these two vectors by FFT and add to the answer; same for 2-3;
• pair 1-3: split it like this:

1.1 1.2
3.1 3.2

then multiply by FFT the vectors of 1.1 and 3.2 and add to the answer, same for 3.1-1.2, and recursively go to 1.1-3.1 and 1.2-3.2; same for 2-4, and same for 1-2 and 3-4, but split horizontally.

BTW, when the sizes of the rectangles in the recursion became < 100, I stopped the recursion and calculated distances manually.
•  » » 14 months ago, # ^ |   -9 I think it's better to stop at (.53 * sqrt(log n)) iterations because it allows optimal bitset converging, useful for FFT. But the main trick is the maintenance of the bitset.
 » 14 months ago, # |   0 Bitset works quite well. O(N^4 / 32) ~ 2.5 * 10^8 operations per test case.
•  » » 14 months ago, # ^ |   0 Can you explain your solution?
•  » » » 14 months ago, # ^ |   +1 To find the distances between the ones of each pair of rows, we shift one row's bitset and calculate the bitwise AND. Click for the code.
 » 14 months ago, # |   0 Let dp[row][col][dist] = number of vertices (_row, _col) such that _row ≤ row and the Manhattan distance from (row, col) to (_row, _col) equals dist. Then, to calculate dp[row][col][dist], we can split it into two cases:

• the case _row = row, which we calculate in O(n^2);
• the case _row < row, whose answer equals dp[row - 1][col][dist - 1], since to go from (row, col) to any vertex above (row, col) we must go upwards at least once, thus landing on (row - 1, col).

PS: dp[row][col][dist] might exceed the memory limit, I'm not sure (I'm back to CP after almost a year). But since we only use dp[row - 1] to calculate dp[row], we can keep a single dp[col][dist] and update it for each new row.
•  » » 14 months ago, # ^ |   0 Actually, I too used the same approach and my solution passed for the given array size.
 » 14 months ago, # |   0 Can someone explain the idea described by @gorre_morre in the comments of the editorial (https://discuss.codechef.com/questions/137695/avgmat-editorial)?
 » 14 months ago, # | ← Rev. 2 →   +1 I knew that I was doing something wrong when I had to write this monster of a 200-line solution: Link. Glad to know that more elegant solutions exist. BTW, you can go to discuss.codechef.com to see the editorials and click on the appropriate tag for the contest.
 » 14 months ago, # |   0 An O(n^3 log n) solution worked, by applying binary search on each diagonal for every house.
•  » » 14 months ago, # ^ |   0 I had TLE with O(n^3 log n), but you can easily change it to O(n^3) by keeping two pointers instead, and that passed.
 » 14 months ago, # |   0 Can somebody tell if the complexity of this is O(T*N*M)?
•  » » 14 months ago, # ^ | ← Rev. 2 →   0 I think you are right. As you have used unordered_map, which has O(1) complexity, the whole code should run in O(T*N*M), so why does it time out?
•  » » » 14 months ago, # ^ |   0 That's what I was thinking. Can somebody help me here?
•  » » 14 months ago, # ^ |   0 Your 'v' is an O(n*m) vector.
•  » » » 14 months ago, # ^ |   0 Got it, thanks bro.