weily's blog

By weily, 2 hours ago, In English
unordered_map<uint64_t, uint64_t> ump; // memoization table

uint64_t solve(uint64_t n) {
  if (ump.count(n))
    return ump[n];
  uint64_t p = cube(n - 1); // cube = floor of the cube root (see below)
  uint64_t c = p * p * p;   // largest perfect cube strictly less than n
  uint64_t res = solve(c) + solve(n - c) + (n - c);
  ump[n] = res;
  return res;
}

I'm curious about the time complexity of this recursive function.

Here, the function cube computes the floor of the cube root (not the cube). You can assume it runs in $$$O(1)$$$ or $$$O(\log n)$$$ time.
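For concreteness, here is one possible sketch of such a cube function (mine, not necessarily the one I benchmarked; it uses std::cbrt plus an integer fix-up, since the floating-point result can be off by one near perfect cubes):

```cpp
#include <cstdint>
#include <cmath>

// Floor of the real cube root; near-O(1) apart from the fix-up loops.
// Valid for n up to about 1e18, where (r + 1)^3 still fits in uint64_t.
uint64_t cube(uint64_t n) {
    uint64_t r = (uint64_t)std::llround(std::cbrt((double)n));
    while (r > 0 && r * r * r > n) --r;              // double rounded up
    while ((r + 1) * (r + 1) * (r + 1) <= n) ++r;    // double rounded down
    return r;
}
```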

I believe it is $$$\Omega(\sqrt[3]{n})$$$, but I'd like to know a more precise result and analysis.

You can consider that I've preprocessed $$$[0,m]\cap\mathbb{Z}$$$ as a base case for the recursion (which doesn't seem to offer significant optimization).

Can anyone help me?

English is not my native language; please excuse typing errors.



»
27 minutes ago, # |
Rev. 3

Edit for completeness: In the original post, I mistakenly assumed that the calls are $$$O(\log n)$$$ instead of $$$O(\log \log n)$$$.


Assume that we traverse the branch $$$\mathrm{solve}(c)$$$ first when given an input $$$n$$$. Let $$$k$$$ be the largest integer such that $$$k^3 \le n$$$. The main branch will walk down through all perfect cubes, resulting in the calls $$$\mathrm{solve}(n - k^3)$$$ and $$$\mathrm{solve}((i+1)^3 - 1 - i^3) = \mathrm{solve}(3i^2 + 3i)$$$ for $$$i < k$$$ (the lower bound depends on your preprocessing).

This is a very rough sketch, but notice that the calls to $$$\mathrm{solve}(3i^2 + 3i)$$$ will always be present in any call of $$$\mathrm{solve}$$$. Call these important calls. There are $$$O(\sqrt[3]{n})$$$ of them.

Assume that we have precomputed all of them. How many calls will $$$\mathrm{solve}(n - k^3)$$$ perform? Well, note that $$$n - k^3 \le (k+1)^3 - k^3 = O(k^2)$$$. Therefore, ignoring the important calls, we see that $$$n$$$ becomes $$$O(n^{2/3})$$$ in one call, thus we will have $$$O(\log \log n)$$$ non-important calls for a value of $$$n$$$.

Now, what about the important calls? As an upper bound, we can assume that precomputing each important call also produces this many non-important calls.

Thus, the number of states visited by $$$\mathrm{solve}$$$ is at most $$$O(\sqrt[3]{n} \log \log n)$$$.

  • »
    »
    18 minutes ago, # ^ |
Rev. 3

    The complexity is lower; there's no log. You didn't consider that most of those states are repeated. If you precompute the states up to $$$O(N^{1/3})$$$, then it takes $$$3$$$ calls to reduce $$$n$$$ below that bound, so $$$O(1)$$$ calls until you reach the precomputed part.

    • »
      »
      »
      13 minutes ago, # ^ |

      I wasn't sure how to count the repeated states, so I didn't make that claim.

      But using the point of view of "precomputing until $$$\sqrt[3]{n}$$$" makes sense to say that there are $$$O(\sqrt[3]{n})$$$ visited states. Thanks!

  • »
    »
    16 minutes ago, # ^ |

    Unlike repeatedly dividing by a constant, which takes $$$O(\log n)$$$ steps, repeatedly taking a root should take $$$O(\log \log n)$$$ steps.

    So it's $$$O(\sqrt[3]{n}\log\log n)$$$?

    • »
      »
      »
      12 minutes ago, # ^ |
      Rev. 3

      Oh, right, you're correct.

      It seems the $$$3$$$ calls in tfg's comment above actually come from $$$O(\log \log n)$$$, then, making the complexity $$$O(\sqrt[3]{n} \log \log n)$$$.

      Edit: Now that I think about it, it really should be multiplied, because you need to reduce many calls of $$$n - c$$$, not just the one from the original call.

»
25 minutes ago, # |
Rev. 2

Let's call the original value of $$$n$$$ $$$S$$$.

At each step, we can clearly see that $$$(n - c)$$$ will always be less than $$$\sqrt[3]{S}$$$.

Furthermore, it's also pretty obvious that every time $$$\mathrm{solve}(n)$$$ is called with $$$n$$$ larger than $$$\sqrt[3]{S}$$$, $$$n$$$ will be a cube (except for the first call).

Since $$$n$$$ is always a cube, we can write it as $$$d^3$$$. Since $$$c$$$ is also a cube, we can write it precisely as $$$(d-1)^3$$$.

Note that the difference between $$$n$$$ and $$$c$$$ is $$$O(d^2) = O(n^{2/3})$$$. So each step $$$\mathrm{solve}(n)$$$ calls a number that is $$$O(n^{2/3})$$$ less than it. The number of calls with $$$n > \sqrt[3]{S}$$$ very roughly works out to $$$O(n / n^{2/3}) = O(n^{1/3})$$$.

For all steps where $$$n < \sqrt[3]{S}$$$, we only need to compute each value once, at most $$$O(\sqrt[3]{S})$$$ of them in total. The remaining steps don't really matter, since there are only $$$O(\sqrt[3]{S})$$$ of them.

So the total complexity is indeed $$$O(\sqrt[3]{S})$$$. Of course I did some algebraic gymnastics, but I guess the intuition is enough.

  • »
    »
    20 minutes ago, # ^ |
    Rev. 3

    $$$n - c$$$ is actually $$$O(S^{2/3})$$$, not $$$O(S^{1/3})$$$. (Consider the case where $$$S = 1000$$$ and $$$c = 9^3 = 729$$$.)

»
22 minutes ago, # |
Rev. 2

EDIT:

I made a mistake when calculating cnt_ops.

I did cnt_ops[i] = cnt_ops[c] + cnt_ops[i - c];

when I should've done cnt_ops[i] = cnt_ops[c] + (c != i - c ? cnt_ops[i - c] : 0);

ORIGINAL COMMENT:

I am not completely sure about the complexity of the function, but I would think that it runs in $$$O(n)$$$ time (assuming that the cube root function runs in $$$O(1)$$$ time). This is because when solving for $$$n$$$ you make a call to $$$c = \lfloor \sqrt[3]{n - 1} \rfloor^3$$$. It is easy to prove that $$$c < n$$$ always, but it is also true that $$$c \sim n$$$. I wrote some code to count the number of operations done when solve(i) is called. Printing the results for small values of $$$i$$$, I found that the number of operations is exactly equal to $$$i$$$ except when $$$i = 0$$$ (base case, one operation). To check larger values of $$$i$$$ I added an assert to the code.

I ran the code below for $$$n = 10^5$$$ and it did not terminate with the error message, implying that for $$$n \le 10^5$$$ the runtime of the recursive function is $$$O(n)$$$.

#include "bits/stdc++.h"
using namespace std;

int64_t cube_root(int64_t n) { // rounded down
    if(n == 0) return 0;
    int64_t lo = 1, hi = n;
    while(lo + 1 < hi) {
        int64_t mid = (lo + hi) / 2;
        (mid * mid * mid <= n ? lo : hi) = mid;
    }
    return lo;
}

void CHECK(bool ch, int64_t x, int64_t y) {
    if(ch) return;
    cout << "Fails for " << x << " which takes " << y << " operations" << endl;
    abort(); // terminates program
}

signed main() {
    ios::sync_with_stdio(0);
    cin.tie(0);

    int64_t n; cin >> n;
    vector<int64_t> ump(n + 1), cnt_ops(n + 1);
    for(int64_t i = 0; i <= n; ++i) {
        // base case
        if(i < 2) {
            ump[i] = i;
            cnt_ops[i] = 1;
        }
        else {
            int64_t p = cube_root(i - 1);
            int64_t c = p * p * p;
            ump[i] = ump[c] + ump[i - c] + (i - c);
            cnt_ops[i] = cnt_ops[c] + cnt_ops[i - c];
        }
        CHECK(i == 0 || i == cnt_ops[i], i, cnt_ops[i]);
    }
}
  • »
    »
    13 minutes ago, # ^ |

    It can't be $$$O(n)$$$, because solve(1e17) can be computed in one second.

    • »
      »
      »
      12 minutes ago, # ^ |

      I made a bad mistake; I've added an edit to my comment. Sorry.