```cpp
unordered_map<uint64_t, uint64_t> ump;  // memo table; base cases assumed pre-seeded

uint64_t solve(uint64_t n) {
    if (ump.count(n))
        return ump[n];
    uint64_t p = cube(n - 1);  // cube() = integer cube root (see below)
    uint64_t c = p * p * p;    // c is the largest perfect cube strictly below n
    uint64_t res = solve(c) + solve(n - c) + (n - c);
    ump[n] = res;
    return res;
}
```

I'm curious about the time complexity of this recursive function.

Here, the function `cube` is used to calculate the **cube root** (not the cube). You can assume that it's $$$O(1)$$$ or $$$O(\log n)$$$.
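The post doesn't show `cube` itself; a minimal sketch of an $$$O(\log n)$$$ integer cube root via binary search might look like this (the name `cube` and the full 64-bit range are assumptions):

```cpp
#include <cassert>
#include <cstdint>

// Integer cube root: largest r such that r*r*r <= n.
// Binary search gives O(log n) and avoids the floating-point rounding
// issues std::cbrt can have near perfect cubes.
uint64_t cube(uint64_t n) {
    uint64_t lo = 0, hi = 2642245;  // 2642245^3 <= 2^64 - 1 < 2642246^3
    while (lo < hi) {
        uint64_t mid = lo + (hi - lo + 1) / 2;
        if (mid * mid * mid <= n)   // mid^3 fits in 64 bits for mid <= 2642245
            lo = mid;
        else
            hi = mid - 1;
    }
    return lo;
}
```

Note that `cube(n - 1)` in `solve` then yields the root of the largest perfect cube strictly below `n`.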

I believe it is $$$\Omega(\sqrt[3]{n})$$$, but I'd like to know a more precise result and analysis.

You can consider that I've preprocessed $$$[0,m]\cap\mathbb{Z}$$$ as a base case for the recursion (which doesn't seem to offer significant optimization).
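For concreteness, a sketch of such a preprocessing step, filling $$$[0, m]$$$ bottom-up so the recursion never descends below $$$m$$$. The base values for $$$0$$$ and $$$1$$$ are assumptions, since the post doesn't state them:

```cpp
#include <cassert>
#include <cstdint>
#include <unordered_map>
using namespace std;

unordered_map<uint64_t, uint64_t> ump;

// Naive integer cube root (linear scan); fine for small preprocessed values.
static uint64_t icbrt(uint64_t n) {
    uint64_t r = 0;
    while ((r + 1) * (r + 1) * (r + 1) <= n) ++r;
    return r;
}

// Hypothetical preprocessing: seed ump for all n in [0, m] using the same
// recurrence as solve(), so these values become base cases of the recursion.
// ump[0] = 0 and ump[1] = 1 are assumed; the post doesn't specify them.
void precompute(uint64_t m) {
    ump[0] = 0;
    ump[1] = 1;
    for (uint64_t i = 2; i <= m; ++i) {
        uint64_t p = icbrt(i - 1);  // root of the largest cube strictly below i
        uint64_t c = p * p * p;
        ump[i] = ump[c] + ump[i - c] + (i - c);  // both c and i-c are < i
    }
}
```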

Can anyone help me?

*English is not my native language; please excuse typing errors.*

Auto comment: topic has been updated by weily (previous revision, new revision, compare).

**Edit for completeness:** In the original post, I mistakenly assumed that the calls are $$$O(\log n)$$$ instead of $$$O(\log \log n)$$$.

Assume that we traverse the branch $$$\mathrm{solve}(c)$$$ first when given an input $$$n$$$. Let $$$k$$$ be the maximum integer such that $$$k^3 \le n$$$. The main branch will go through all perfect cubes, resulting in calls of $$$\mathrm{solve}(n - k^3)$$$ and $$$\mathrm{solve}((i+1)^3 - 1 - i^3) = \mathrm{solve}(3i^2 + 3i)$$$ for $$$i < k$$$ (the lower bound depends on your preprocessing).

This is a very rough sketch, but notice that the calls to $$$\mathrm{solve}(3i^2 + 3i)$$$ will always be present in any call of $$$\mathrm{solve}$$$. Call these *important calls*; there are $$$O(\sqrt[3]{n})$$$ of them.

Assume that we have precomputed all of them. How many calls will $$$\mathrm{solve}(n - k^3)$$$ perform? Note that $$$n - k^3 \le (k+1)^3 - k^3 = O(k^2)$$$. Therefore, ignoring the important calls, $$$n$$$ becomes $$$O(n^{2/3})$$$ in one call, so we will have $$$O(\log \log n)$$$ non-important calls for a value of $$$n$$$.
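To see where the $$$O(\log \log n)$$$ comes from, one can iterate the reduction $$$n \mapsto \lfloor\sqrt[3]{n}\rfloor^2$$$ (roughly what one non-important step does to the argument) and count steps. A small sketch under that simplification:

```cpp
#include <cassert>
#include <cstdint>

// Naive integer cube root (linear scan); adequate for this experiment
// as long as n stays well below 2^64.
static uint64_t icbrt(uint64_t n) {
    uint64_t r = 0;
    while ((r + 1) * (r + 1) * (r + 1) <= n) ++r;
    return r;
}

// Count how many times n -> floor(cbrt(n))^2 can be applied before n <= 1.
// Each step maps n to roughly n^(2/3), so the count grows like log log n.
int reduce_steps(uint64_t n) {
    int steps = 0;
    while (n > 1) {
        uint64_t r = icbrt(n);
        n = r * r;
        ++steps;
    }
    return steps;
}
```

Even for $$$n = 10^{18}$$$ the chain collapses in a handful of steps, which is the doubly-logarithmic behaviour claimed above.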

Now, what do we do with the important calls? As an upper bound, we can assume that precomputing each important call also produces this many non-important calls.

Thus, the number of states visited by $$$\mathrm{solve}$$$ is at most $$$O(\sqrt[3]{n} \log \log n)$$$.
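As an empirical check, one can run the memoized `solve` and count the distinct states it stores. This is a sketch: `ump[0]` and `ump[1]` are seeded with assumed base-case values so the recursion terminates (the stored values don't affect the state count):

```cpp
#include <cassert>
#include <cstdint>
#include <unordered_map>
using namespace std;

unordered_map<uint64_t, uint64_t> ump;

// Naive integer cube root (linear scan).
static uint64_t icbrt(uint64_t n) {
    uint64_t r = 0;
    while ((r + 1) * (r + 1) * (r + 1) <= n) ++r;
    return r;
}

uint64_t solve(uint64_t n) {
    if (ump.count(n)) return ump[n];
    uint64_t p = icbrt(n - 1);   // root of the largest cube strictly below n
    uint64_t c = p * p * p;
    uint64_t res = solve(c) + solve(n - c) + (n - c);
    ump[n] = res;
    return res;
}

// Number of distinct memoized states for a given starting value.
size_t count_states(uint64_t n) {
    ump.clear();
    ump[0] = 0;   // assumed base cases
    ump[1] = 1;
    solve(n);
    return ump.size();
}
```

For $$$n = 10^6$$$ the state count lands in the low hundreds-to-thousands, far below $$$n$$$ and consistent with a roughly $$$\sqrt[3]{n}$$$-sized state space.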

The complexity is lower; there's no log. You didn't consider that most of those states are repeated. If you precompute the states up to $$$O(n^{1/3})$$$, then it takes $$$3$$$ calls to reduce $$$n$$$ to that range, so $$$O(1)$$$ calls until the part that you've precomputed.

I wasn't sure on how to count the repeated states, so I didn't make that claim.

But from the point of view of "precomputing up to $$$\sqrt[3]{n}$$$", it makes sense to say that there are $$$O(\sqrt[3]{n})$$$ visited states. Thanks!

Unlike repeatedly dividing by a constant value, repeatedly taking the root should be $$$O(\log \log n)$$$.

So it's $$$O(\sqrt[3]{n}\log\log n)$$$?

Oh, right, you're correct.

It seems that the $$$3$$$ calls in tfg's comment above actually come from the $$$O(\log \log n)$$$ bound, then, making the complexity $$$O(\sqrt[3]{n} \log \log n)$$$.

**Edit:** Now that I think about it, it really should be multiplied, because you need to reduce many calls of $$$n - c$$$, not just the one from the original call.

Let's call the original value of $$$n$$$ $$$S$$$.

At each step, we can see clearly that $$$n - c$$$ will always be less than $$$\sqrt[3]{S}$$$.

Furthermore, it's also pretty obvious that every time $$$\mathrm{solve}(n)$$$ is called where $$$n$$$ is larger than $$$\sqrt[3]{S}$$$, $$$n$$$ will be a cube (except for the first call).

Since $$$n$$$ is always a cube, we can write it as $$$d^3$$$. Since $$$c$$$ is also a cube, we can write it precisely as $$$(d-1)^3$$$.

Note that the difference between $$$n$$$ and $$$c$$$ is $$$O(d^2) = O(n^{2/3})$$$. So, at each step, $$$\mathrm{solve}(n)$$$ will call a value that is $$$O(n^{2/3})$$$ less than $$$n$$$. The number of calls where $$$n > \sqrt[3]{S}$$$ very roughly works out to be $$$O(n / n^{2/3}) = O(n^{1/3})$$$.

For all steps where $$$n < \sqrt[3]{S}$$$, each value needs to be computed at most once, so $$$O(\sqrt[3]{S})$$$ computations in total. All other steps don't really matter, since there are only $$$O(\sqrt[3]{S})$$$ of them.

So the total complexity is indeed $$$O(\sqrt[3]{S})$$$. Of course, I took some algebraic liberties, but I think the intuition is enough.

$$$n - c$$$ is actually $$$O(S^{2/3})$$$, not $$$O(S^{1/3})$$$. (Consider the case where $$$S = 1000$$$ and $$$c = 9^3 = 729$$$.)

Yes, I fixed the misunderstanding.

I also just realized there is a much easier (and more correct) way to find the number of calls with $$$n > \sqrt[3]{S}$$$.

Since all calls with $$$n > \sqrt[3]{S}$$$ require that $$$n$$$ be a cube, and there are at most $$$O(\sqrt[3]{S})$$$ cubes less than $$$S$$$, the total number of such calls must be $$$O(\sqrt[3]{S})$$$.

In practice it's even a little bit faster, since there is also a $$$-S^{1/9}$$$ term that I've ignored. I ran some tests on my own, and this does check out.

EDIT:

I made a mistake when calculating `cnt_ops`. I did

`cnt_ops[i] = cnt_ops[c] + cnt_ops[i - c];`

when I should've done

`cnt_ops[i] = cnt_ops[c] + (c != i - c ? cnt_ops[i - c] : 0);`

ORIGINAL COMMENT:

I am not completely sure about the complexity of the function, but I would think that it runs in $$$O(n)$$$ time (assuming that the cube-root function runs in $$$O(1)$$$ time). This is because when solving for $$$n$$$, you make a call to $$$c = \lfloor \sqrt[3]{n - 1} \rfloor^3$$$. It is easy to prove that $$$c < n$$$ always, but it is also true that $$$c \sim n$$$. I wrote some code to count the number of operations done when `solve(i)` is called. On printing the results for small values of $$$i$$$, I found that the number of operations is exactly equal to $$$i$$$, except when $$$i = 0$$$ (base case, one operation). To check larger values of $$$i$$$, I added an assert to the code. I ran the code below for $$$n = 10^5$$$, and it did not terminate with the error message, implying that for $$$n \le 10^5$$$ the runtime of the recursive function is $$$O(n)$$$.
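The counting code isn't shown in the thread, but the observation can be reproduced with a bottom-up version of the original (pre-edit) recurrence, which effectively counts the calls `solve` would make *without* memoization. The base values for $$$0$$$ and $$$1$$$ are assumptions (one operation each):

```cpp
#include <cassert>
#include <cstdint>
#include <vector>
using namespace std;

// Naive integer cube root (linear scan).
static uint64_t icbrt(uint64_t n) {
    uint64_t r = 0;
    while ((r + 1) * (r + 1) * (r + 1) <= n) ++r;
    return r;
}

// cnt_ops[i] = number of calls solve(i) would make with NO memoization,
// using the original recurrence cnt_ops[i] = cnt_ops[c] + cnt_ops[i - c].
// cnt_ops[0] = cnt_ops[1] = 1 are assumed base values.
vector<uint64_t> count_unmemoized(uint64_t n) {
    vector<uint64_t> cnt_ops(n + 1);
    cnt_ops[0] = 1;
    if (n >= 1) cnt_ops[1] = 1;
    for (uint64_t i = 2; i <= n; ++i) {
        uint64_t p = icbrt(i - 1), c = p * p * p;  // largest cube below i
        cnt_ops[i] = cnt_ops[c] + cnt_ops[i - c];
    }
    return cnt_ops;
}
```

With these base values, induction gives `cnt_ops[i] = c + (i - c) = i` for $$$i \ge 1$$$, matching the "exactly $$$i$$$ operations" observation; this is why the unmemoized view suggests $$$O(n)$$$ even though the memoized function is far faster.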

It can't be $$$O(n)$$$, because `solve(1e17)` can be computed in one second.

I made a bad mistake; I've added an edit to my comment. Sorry.