Solution of G with a maxflow idea:
http://codeforces.com/contest/978/submission/38193150
Nice!
I also solved it with maxflow, like an assignment problem. 38199894
Very fast! Thank you
Thanks for the quick explanation. Is there any solution that works faster than the "brute force" solution for problem D?
Faster than O(n)? I think not.
We need to iterate over the array for each combination of changes to the first and second elements. Since the first element can be changed in 3 ways and the second element in 3 ways, we get O(n*m), where m is the number of combinations of changes to the first and second elements (3*3 in our case). Maybe my question was not clear. Is there any way to generalize the algorithm so that it works not only with changes in [-1, 1] but also with the general case [-i, i], in better than O(n*m) complexity?
Even just 5 ways: a1-a2, (a1-a2) ± 1, (a1-a2) ± 2.
Hello! I've solved it by calculating the average value of the common difference. But there are tricky cases when n equals 4 or 5. 38261362
For G you don't need to iterate all the exams every day.
38195700 — complexity is O(n*logm) instead of O(n*m)
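For reference, here is a sketch of that greedy in Python — my own reconstruction, not the linked submission — assuming the usual Petya's Exams setup: exam i becomes available on day s_i, is held on day d_i, and needs c_i preparation days, and each day Petya can sit an exam, prepare for one exam, or rest. A priority queue keyed by exam day avoids scanning all exams every day:

```python
import heapq

def can_schedule(n, exams):
    """exams: list of (s, d, c) with distinct exam days d <= n.
    Greedy: always prepare for the available exam with the earliest exam day."""
    remaining = [c for _, _, c in exams]          # prep days still needed per exam
    exam_on_day = {d: i for i, (_, d, _) in enumerate(exams)}
    by_publish = sorted(range(len(exams)), key=lambda i: exams[i][0])
    heap, ptr = [], 0
    for day in range(1, n + 1):
        # make newly published exams available
        while ptr < len(by_publish) and exams[by_publish[ptr]][0] <= day:
            i = by_publish[ptr]
            heapq.heappush(heap, (exams[i][1], i))  # keyed by exam day
            ptr += 1
        if day in exam_on_day:
            if remaining[exam_on_day[day]] > 0:
                return False                        # not prepared in time
            continue                                # exam day: nothing else today
        while heap and remaining[heap[0][1]] == 0:  # lazily drop finished exams
            heapq.heappop(heap)
        if heap:
            remaining[heap[0][1]] -= 1              # spend today preparing
    return all(r == 0 for r in remaining)
```

Each exam is pushed and popped at most once, so the total work is roughly O(n + m log m) instead of O(n*m).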
Excellent thanks for posting
A simple way of understanding 978E - Bus Video System:
Define pre[i] = a1 + a2 + … + ai, the prefix sum of the recorded changes.
Let initial be an arbitrary variable holding a possible number of passengers before the first stop.
It is easy to see that for every i this inequality must hold:
0 ≤ initial + pre[i] ≤ w,
Which implies:
- pre[i] ≤ initial ≤ w - pre[i]
Since this must hold for all i simultaneously, we can reduce it to:
max(0, -min(pre[i])) ≤ initial ≤ min(w, w - max(pre[i]))
where the min and max are taken over all i. The number of integers in this range (0 if it is empty) is the final answer.
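Putting the inequality above into code — a minimal sketch, with the function name my own:

```python
def count_initial(w, a):
    """Number of valid initial passenger counts for capacity w and recorded changes a."""
    pre = 0
    lo, hi = 0, w            # 0 <= initial <= w before any stop
    for x in a:
        pre += x
        lo = max(lo, -pre)   # initial >= -pre[i]
        hi = min(hi, w - pre)  # initial <= w - pre[i]
    return max(0, hi - lo + 1)
```

This is a single pass over the changes, so it runs in O(n).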
For F, instead of using vectors with the quarrels, another nice way is to process the quarrels by incrementing, in an array, each programmer's count of "quarrels with less skilled programmers". Then we can just directly subtract the number in that array at the end.
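A sketch of that idea in Python (function name and 0-based quarrel pairs are my own choices): each programmer's answer is the number of strictly weaker programmers minus the number of strictly weaker quarrel partners:

```python
from bisect import bisect_left

def mentors(r, quarrels):
    """r: skills; quarrels: 0-based index pairs. Returns mentees per programmer."""
    sorted_r = sorted(r)
    weaker_quarrels = [0] * len(r)
    for a, b in quarrels:
        if r[a] < r[b]:
            weaker_quarrels[b] += 1   # b quarrels with the weaker a
        elif r[b] < r[a]:
            weaker_quarrels[a] += 1   # a quarrels with the weaker b
    # bisect_left counts programmers with strictly smaller skill
    return [bisect_left(sorted_r, r[i]) - weaker_quarrels[i] for i in range(len(r))]
```

Sorting dominates, so this runs in O((n + k) log n) for n programmers and k quarrels.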
My solution 38178463 to problem 978E - Bus Video System is O(n).
My solution 38166452 to problem 978B - File Name was similar to the one in the tutorial, but simpler: it just counts how many occurrences of the string "xxx" there are in the string s.
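That counting idea as a short sketch (my own code): overlapping occurrences of "xxx" equal the deletions needed, since a run of k ≥ 3 x's contains exactly k − 2 of them:

```python
def min_removals(s):
    """Minimum deletions so that s contains no 'xxx' substring."""
    # count *overlapping* occurrences of "xxx"
    return sum(1 for i in range(len(s) - 2) if s[i:i + 3] == "xxx")
```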
fcspartakm I think there is a typo in the tutorial of problem F: "we can user array of pairs" should be "we can use an array of pairs".
Regexp solution is very easy to write for B :P
Problem E can also be solved using binary search on the answer. :) Two binary searches: one for the bound where the passenger count would go negative, and one for where it would exceed the capacity.
Another "different" solution for problem F is to use a BIT with coordinate compression.
In problem 978D: "Almost Arithmetic Progression" I'm failing on test case 17: n = 3, array 34 70 52. But this is already an AP. The answer should be 0, but the jury's answer is -1. Please help me find where I am mistaken.
My solution: https://codeforces.com/contest/978/submission/38594533
I got my mistake.
Can anyone please explain problem D in detail?
You are given an array, and you want to make all differences between consecutive elements equal (i.e., turn the array into an arithmetic progression). Each element may be decreased by 1, increased by 1, or left unchanged. Let the 1st element be A and the 2nd element be B. You can turn A into A-1, A, or A+1, and likewise B into B-1, B, or B+1, so there are 9 combinations for the pair (A, B); enumerate them with two loops adding -1..1 to A and B. Each combination fixes the common difference B-A, so for the remaining elements you just check whether each can reach its expected value with a change of at most 1, counting the moves. The answer is the minimum over all 9 combinations (a brute-force check does this).
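The brute force described above can be sketched like this (a minimal version; the function name is mine):

```python
def min_changes(b):
    """Minimum number of +-1 changes making b an AP, or -1 if impossible."""
    n = len(b)
    if n <= 2:
        return 0  # one or two elements always form an AP
    best = None
    for da in (-1, 0, 1):          # change applied to the 1st element
        for db in (-1, 0, 1):      # change applied to the 2nd element
            first = b[0] + da
            expected = b[1] + db
            d = expected - first   # common difference fixed by this combination
            cost = (da != 0) + (db != 0)
            ok = True
            for i in range(2, n):
                expected += d
                delta = expected - b[i]
                if delta == 0:
                    continue
                if delta in (-1, 1):
                    cost += 1      # one +-1 change fixes this element
                else:
                    ok = False
                    break
            if ok and (best is None or cost < best):
                best = cost
    return -1 if best is None else best
```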
To remove duplicates in Remove Duplicates, if a set is used, the overall time complexity is O(n log n), since the find operation of a set takes O(log n). Can we do better? Absolutely, yes! Iterate the original array from left to right while recording, in an auxiliary array indexed by value, the last occurrence of each element; then traverse the array again and keep only the elements standing at their last occurrence. This reduces the overall complexity to O(n), with additional space.
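A slightly compressed variant of that idea in Python (values assumed bounded by max_val, as in the problem; names are mine):

```python
def remove_duplicates(a, max_val=1000):
    """Keep only the last occurrence of each value, preserving order. O(n + max_val)."""
    last = [-1] * (max_val + 1)   # last index at which each value occurs
    for i, x in enumerate(a):
        last[x] = i
    # keep an element only if it stands at its value's last occurrence
    return [x for i, x in enumerate(a) if last[x] == i]
```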