December 18, 2009
Before computers, arithmetic was done by hand, and one of the primary tools of the practicing arithmetician was a table of logarithms, which was printed in a large book and used to perform multiplication, division and exponentiation. But how to calculate the logarithms?
Leonhard Euler, the Swiss mathematician, explained the algorithm in his 1748 book Introductio in analysin infinitorum. The basic idea is that the logarithm of the geometric mean of two numbers a and b is equal to the arithmetic mean of their logarithms: log √(ab) = (log a + log b) / 2. Thus, to calculate the logarithm of a given number n, find two numbers a < n < b with known logarithms (for instance, consecutive integer powers of the base), calculate the geometric mean (square root of the product) of the two numbers and the arithmetic mean (half of the sum) of their logarithms, and recur on whichever of the two smaller intervals contains n, continuing until the desired accuracy is reached.
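Euler's scheme can be sketched in a few lines of Python. This is only one way to realize it: the function name, the tolerance, and the step that brackets n by repeated multiplication or division are my own choices, and it leans on the library square root (any square-root routine would do). Note that the interval of logarithms is halved exactly on every step, so iterating until that interval is narrower than eps is guaranteed to terminate.

```python
import math

def euler_log(n, base=10.0, eps=1e-12):
    """Approximate log_base(n) by Euler's interval-halving scheme.

    Maintain a bracket [lo, hi] around n together with the logarithms
    of the endpoints; the geometric mean of the endpoints has a
    logarithm equal to the arithmetic mean of the endpoint logarithms.
    """
    if n <= 0.0:
        raise ValueError("logarithm is defined only for positive n")
    # Bracket n between consecutive integer powers of the base, found
    # by repeated multiplication/division (no library log required).
    lo, lo_log = 1.0, 0.0
    if n >= 1.0:
        while lo * base <= n:
            lo, lo_log = lo * base, lo_log + 1.0
        hi, hi_log = lo * base, lo_log + 1.0
    else:
        hi, hi_log = 1.0, 0.0
        while hi / base >= n:
            hi, hi_log = hi / base, hi_log - 1.0
        lo, lo_log = hi / base, hi_log - 1.0
    # Halve the logarithmic interval until it is narrower than eps.
    while hi_log - lo_log > eps:
        mid = math.sqrt(lo * hi)            # geometric mean of the values
        mid_log = (lo_log + hi_log) / 2.0   # arithmetic mean of the logs
        if mid <= n:
            lo, lo_log = mid, mid_log       # n lies in the upper half
        else:
            hi, hi_log = mid, mid_log       # n lies in the lower half
    return (lo_log + hi_log) / 2.0
```

For example, euler_log(2.0) converges to log₁₀ 2 ≈ 0.30103, and a different base can be passed for natural or binary logarithms.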
But how did Euler calculate the square roots? Sir Isaac Newton, the English physicist, described a method based on derivatives. Given an initial estimate x of the square root of n, a better estimate is given by x′ = (x + n/x) / 2. Again, this calculation is iterated until the desired accuracy is reached.
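The iteration above fits in a short Python sketch; the function name, the starting estimate, and the relative-tolerance stopping rule are illustrative choices, not part of Newton's description.

```python
def newton_sqrt(n, eps=1e-15):
    """Approximate the square root of n by Newton's iteration
    x' = (x + n/x) / 2, stopping when successive estimates agree
    to within a relative tolerance eps."""
    if n < 0.0:
        raise ValueError("square root of a negative number")
    if n == 0.0:
        return 0.0
    x = n if n >= 1.0 else 1.0  # any positive starting estimate converges
    while True:
        x_next = (x + n / x) / 2.0
        if abs(x_next - x) <= eps * x_next:
            return x_next
        x = x_next
```

Newton's method converges quadratically, roughly doubling the number of correct digits at each step, which is why only a handful of iterations are needed.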
Your task is to write a function to compute square roots using Newton’s method, then use that function to compute logarithms using Euler’s method. When you are finished, you are welcome to read or run a suggested solution, or to post your solution or discuss the exercise in the comments below.