August 8th, 2011, 03:07 AM

Integer square root algorithm
Hi,
I'm searching for an efficient algorithm that computes the integer square root of a 64-bit (say, signed) integer (long long).
That is: ISQ(n) = max k (integer) such that k*k <= n.
My approach is naive: I first compute k as the integer part of sqrt(n) (from <math.h>) and then increase or decrease k in a loop until it fits.
Thanks in advance for any suggestions.
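For reference, the approach described above can be sketched like this (Python here, function name mine). Note that double-precision sqrt carries only about 53 bits of mantissa, so for n near 2^63 the float estimate can be off by a few units, which is why the correction loop is genuinely needed:

```python
import math

def isqrt_naive(n):
    """Floor square root via float sqrt plus integer correction.

    A sketch of the method described above: math.sqrt is only
    accurate to ~53 bits, so for large n the first estimate k
    can be off by a few and must be nudged into place.
    """
    if n < 0:
        raise ValueError("negative argument")
    k = int(math.sqrt(n))            # float-based first estimate
    while k * k > n:                 # decrease while too large
        k -= 1
    while (k + 1) * (k + 1) <= n:    # increase while too small
        k += 1
    return k
```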
Last edited by leszek31417; August 8th, 2011 at 03:15 AM.
August 8th, 2011, 04:46 AM

August 8th, 2011, 07:19 AM

Thank you.
The second citation especially is something I should read.
August 9th, 2011, 11:10 AM

This little Python function is not optimal, but I think it is nonetheless pretty fast:
Code:
def isqrt(n):
    # construct an initial guess value
    d = len(str(n))        # count of decimal digits
    x = '3'
    d = d // 2
    while d > 0:
        d = d - 1
        x = x + '1'
    x = int(x)
    # perform Newton-Raphson iteration
    prev = 0
    while x != prev:
        prev = x
        q = n // x         # integer division
        x = (x + q) // 2
    return x
Python doesn't have an integer square root, or at least didn't back when I wrote this function.
The first block makes a crude approximation to the square root; the second repeats Newton-Raphson until it converges. For extra credit, prove that the while loop will always terminate (or, prove that it can fail to terminate).
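In fact a concrete failure case is easy to exhibit: with floor division, the Newton-Raphson step for n = 3 bounces between 1 and 2 forever, so a test for "same value as the previous one" never fires. A bounded sketch:

```python
def newton_step(n, x):
    """One floor-division Newton-Raphson step toward sqrt(n)."""
    return (x + n // x) // 2

# For n = 3, starting from x = 3, the iteration enters a two-value
# cycle 2 -> 1 -> 2 -> ... and never settles on a single value.
n, x = 3, 3
seen = []
for _ in range(8):    # bound the loop; unbounded it would never stop
    x = newton_step(n, x)
    seen.append(x)
# seen == [2, 1, 2, 1, 2, 1, 2, 1]
```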
August 10th, 2011, 03:57 AM

Thank you again !
I'll play with this algorithm.
How many steps does the second loop (the Newton-Raphson iteration) need?
August 11th, 2011, 02:22 AM

How many steps does the second loop (the Newton-Raphson iteration) need?
Newton-Raphson for square root is a wonderful algorithm, ranking in my mind with Euclid's. It is both very simple and extremely efficient. When my crude technique is used for the initial "guess" value, the maximum number of iterations required to reach the nearest integer square root of n is well approximated by log2(log2(n)). [log2 means "log base 2", or roughly the number of bits.]
Because the program I posted uses one more iteration to detect convergence, a good estimate is 1 + log2(log2(n)). Note well that in some cases the number of iterations will be much lower than this worst-case figure.
Numbers up to 10,000 bits require no more than about 13 iterations; up to 100,000 bits, no more than about 16 iterations.
As I hinted in the previous post, the Python script I gave doesn't always terminate: in integer math the iteration can stably alternate between two values, in which case the test for the same value as before never evaluates true. Here's a better version:
Code:
def isqrt(n):
    # restrict to safe arguments
    if n < 1: return 0
    # construct an initial guess value
    d = len(str(n))        # count of decimal digits
    x = '1'
    d = d // 2
    while d > 0:
        d = d - 1
        x = x + '0'
    x = int(x)
    # perform Newton-Raphson iteration
    prev = 0
    prv2 = 0
    while x != prev and x != prv2:
        prv2 = prev
        prev = x
        q = n // x         # integer division
        x = (x + q) // 2
    # on a two-value cycle, the smaller of the pair is the integer root
    return min(x, prev)
It can be improved by computing a more refined initial value for Newton-Raphson, which would probably cut the worst case by one or two iterations.
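A quick instrumented check of the same scheme (initial guess 10**(d//2) built from the decimal length, floor-division Newton-Raphson steps until a value repeats; names and instrumentation mine, compared against math.isqrt from Python 3.8+) bears out both the correctness and the iteration estimate:

```python
import math

def isqrt_counted(n):
    """Floor square root by Newton-Raphson, also returning the step count.

    Same scheme as above: initial guess 10**(d//2) from the decimal
    length of n, then floor-division Newton-Raphson steps until a value
    repeats; on a two-value cycle the smaller value is the integer root.
    """
    if n < 1:
        return 0, 0
    x = 10 ** (len(str(n)) // 2)   # crude guess, within a factor ~sqrt(10)
    prev = prv2 = 0
    steps = 0
    while x != prev and x != prv2:
        prv2, prev = prev, x
        x = (x + n // x) // 2      # one Newton-Raphson step
        steps += 1
    return min(x, prev), steps

n = (1 << 10000) + 12345           # a number of about 10,000 bits
root, steps = isqrt_counted(n)     # converges in roughly 1 + log2(log2(n)) steps
```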
August 12th, 2011, 01:53 AM

Thank you very much, Mr. Mah$us!
I played with this algorithm a little.
Originally Posted by mah$us
... is well approximated by log2(log2(n)) ...
Where can I find the proofs of the above?
Here is my C version (without good initial guess).
PHP Code:
int isqrt ( int n )
{
    int p, r;
    if ( n < 1 ) return 0;    /* guard: r = n = 0 would divide by zero */
    r = n;                    /* initial guess */
    do {
        p = r;
        r = ( r + n/r )/2;    /* integer division! */
    } while ( r < p );
    return p;
}
August 12th, 2011, 12:46 PM

1. Though the original post specified 64 bits, and integers this size are easily handled in modern C implementations, computation related to cryptography sometimes requires taking square roots of much bigger numbers (more than 1000 bits).
Anyone who wants to write a square-root program in C for large integers can use a large-integer library (there are several available). Python natively handles large integers, and so is very convenient for this kind of computation.
2. leszek, the loop termination in the C program is valid if and only if the successive approximations decrease monotonically. I suggest that you prove (or disprove) this monotonicity.
3. The proof for the number of iterations is not difficult, and rather fun, so I'll leave it as an exercise. Hint: if y is the true square root, define the most recent approximation as y(1 + epsilon), where epsilon = 2^-n. Speaking a little loosely, we can say that this approximation is "good to n bits" (that is, the first n bits are correct). Using the iteration formula to compute the next approximation: that approximation is good to how many bits? Once you know this, it is not difficult to compute the number of iterations required.
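The doubling in that hint can also be watched numerically: tracking -log2 of the relative error of each iterate (helper names mine) shows the count of correct bits roughly doubling per step until it plateaus at the integer root:

```python
from fractions import Fraction
from math import log2

def correct_bits(x, n):
    """Roughly how many leading bits of x agree with sqrt(n).

    For x = sqrt(n)*(1 + eps) with small eps, (x*x - n)/(2*n) ~ eps,
    so -log2 of that ratio estimates the number of correct bits.
    """
    err = Fraction(abs(x * x - n), 2 * n)
    return float('inf') if err == 0 else -log2(float(err))

n = (1 << 256) + 987654321    # an arbitrary 256-bit test value
x = 1 << 132                  # deliberately poor initial guess (16x too big)
bits = []
for _ in range(14):
    x = (x + n // x) // 2     # one floor-division Newton-Raphson step
    bits.append(correct_bits(x, n))
# once the error is small, the correct-bit count roughly doubles per
# step, then plateaus near 128 bits when x reaches the integer root
```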
August 13th, 2011, 02:43 AM

second version
PHP Code:
// second version
int isqrt ( int n )
{
    int p, r;
    if ( n < 1 ) return 0;    /* guard against division by zero below */
    p = 1;
    while ( p <= n/p )        /* overflow-safe: doubles p past isqrt(n) */
        p = 2*p;
    r = p;                    /* initial guess: isqrt(n) < r */
    do {
        p = r;
        r = ( r + n/r )/2;    /* integer division */
    } while ( r < p );
    return p;
}
It works very fast! Now I believe in the log2(log2(n)) convergence!
I am now working on a proof of correctness.
August 16th, 2011, 02:31 AM

Let k be the integer root of n (i.e., k is the number we are looking for).
Let r' = ( r + n/r )/2 (integer divisions and r>0).
Lemma. r'<r if and only if r>k.
Proof. Easy.
So the sequence of iterations is strictly decreasing, provided it stays ABOVE k.
This proves the termination of the algorithm: if we start with an initial value > k,
then there must be a step such that r' <= k < r .
The next iteration value (say r'') is greater than or equal to r' (from the "only if" part),
so the loop terminates.
Claim. r' = k.
Proof. A little more involved. As the main tool, use the well-known fact
that the geometric mean <= the arithmetic mean.
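Before writing the proofs out, both statements can be brute-forced over a small range (a sketch, using math.isqrt from Python 3.8+ as the reference):

```python
import math

def step(n, r):
    """One floor-division Newton-Raphson step: r' = (r + n//r) // 2."""
    return (r + n // r) // 2

# Brute-force the Lemma and a key part of the Claim:
#   Lemma: r' < r  <=>  r > k
#   Claim: every step result stays >= k, so combined with the Lemma the
#   first value that is <= k must be exactly k
for n in range(1, 2000):
    k = math.isqrt(n)                # the integer root we are looking for
    for r in range(1, 3 * k + 4):
        rp = step(n, r)
        assert (rp < r) == (r > k)   # the Lemma
        assert rp >= k               # AM-GM with floors: r' never drops below k
```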
Mr. Mah$us, thank you very much for bringing to my attention this beautiful algorithm !
Last edited by leszek31417; August 17th, 2011 at 01:32 AM.
Reason: english language ...