Help with problem
You are given two arrays of integers a and b, both of length n <= 10^5. For each 0 <= x <= n, print the sum of a[i]*b[x-i] over all i with 0 <= i <= x (terms where x-i >= n fall outside b and are skipped).
This can obviously be done in quadratic time, but can we do any better?
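This is a convolution of a and b, so it can be computed in O(n log n) with FFT-based polynomial multiplication. A minimal sketch using numpy (this assumes floating-point rounding is safe for the value ranges involved; if the values or n are large enough that doubles lose precision, an NTT over a suitable prime modulus is the exact alternative):

```python
import numpy as np

def convolve_fft(a, b):
    # Full convolution has len(a) + len(b) - 1 terms;
    # pad the transforms up to a power of two of at least that size.
    n = len(a) + len(b) - 1
    size = 1 << (n - 1).bit_length()
    fa = np.fft.rfft(a, size)
    fb = np.fft.rfft(b, size)
    # Pointwise product in the frequency domain = convolution in
    # the time domain; invert and round back to integers.
    c = np.fft.irfft(fa * fb, size)[:n]
    return np.rint(c).astype(np.int64)

a = [1, 2, 3]
b = [4, 5, 6]
# c[x] = sum over i of a[i] * b[x-i]
print(list(convolve_fft(a, b)))  # [4, 13, 28, 27, 18]
```

The problem only asks for x up to n, which is just the first n+1 entries of the full convolution, so the same routine answers it directly.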