Runtime of finding the GCD of an array?

Revision en4, by SecondThread, 2021-07-11 03:06:47

I started thinking about this during my livestream of the solution to round 731, and discussed it in a comment a bit earlier, but it seems novel enough, and might require enough debate, that I thought it deserved its own blog post:

Suppose I have an array $$$a$$$ of $$$n$$$ integers, each up to a million. What is the worst-case runtime of the following code?

int gcd(int a, int b) {
    return b == 0 ? a : gcd(b, a % b);
}

// in main (the variable is named g so it doesn't shadow the gcd function)
int g = a[0];
for (int i : a) g = gcd(i, g);

Conventionally, I have always thought it was $$$O(n \cdot log(MAXVAL))$$$. However, each recursive call to gcd shrinks its arguments by a factor of about 1.6 (the golden ratio) in the worst case. But we only do all $$$log(MAXVAL)$$$ recursive calls if the gcd itself gets quite small, and once the gcd is small, future gcd calls are $$$O(1)$$$, not $$$O(log)$$$. So is there a clean potential function, like the number of prime factors in the running gcd, that bounds the total GCD recursion? I know that this is some very hand-wavy logic, but it seems almost legit enough for me to believe it. Or is this totally not true? I'm secretly wishing it's true because it would be cool, but I also feel like I would have seen this before if it were both commonly known and true.

TLDR: is the above code really $$$O(n + log(MAXVAL))$$$, instead of $$$O(n * log(MAXVAL))$$$?
