## Fenderbender’s Square Reckoner

### November 29, 2016

In many of our programs involving prime numbers and number theory, we need to determine whether a number *n* is a perfect square. One way to do that is to compute the integer square root of *n*, using Newton’s method, then square it to see if we get *n* back. But that’s slow. In a previous exercise, we used a method devised by Henri Cohen that checks the quadratic residues of *n* to various moduli, which can quickly determine that most *n* cannot be perfect squares.

Over at Mersenne Forum, `fenderbender` extends Cohen’s idea to make a ridiculously fast square predicate: he precalculates multiple moduli to reduce the operation from big integers to 32-bit integers, chooses the moduli after extensive testing, and tests the quadratic residues using a 64-bit bloom filter. The result is impressive. Where Cohen eliminates the expensive square root calculation in 99% of cases, `fenderbender` eliminates it in 99.92% of cases, and does so faster than Cohen. Go read `fenderbender`’s explanation to see a beautiful combination of number theory, wonky programming, and sheer artistry.
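
The shape of the residue-filtering idea can be sketched in a few lines of Python. Note that the moduli 64 and 63 below are illustrative choices for this sketch, not `fenderbender`’s tuned set, and a plain set lookup stands in for his 64-bit bloom filter:

```python
import math

# Quadratic residues mod 64: n can only be a square if n % 64 is one of these.
_SQUARES_MOD_64 = {(i * i) % 64 for i in range(64)}
# A second, independent filter mod 63 (= 7 * 9) catches most survivors.
_SQUARES_MOD_63 = {(i * i) % 63 for i in range(63)}

def is_square(n):
    """Return True if n is a perfect square."""
    if n < 0:
        return False
    if n % 64 not in _SQUARES_MOD_64:
        return False
    if n % 63 not in _SQUARES_MOD_63:
        return False
    r = math.isqrt(n)          # exact integer square root, only for survivors
    return r * r == n
```

The point of the filters is that the cheap modular tests reject almost all non-squares before the expensive integer square root is ever computed.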

Your task is to implement `fenderbender`’s square predicate. When you are finished, you are welcome to read or run a suggested solution, or to post your own solution or discuss the exercise in the comments below.

## Three Amazon Interview Questions

### November 25, 2016

These three questions come from Career Cup:

First: A kidnapper wants to write a ransom note by cutting characters from the text of a magazine. Given two strings containing the characters of the ransom note and the characters of the magazine, write a program to determine if the ransom note can be formed from the magazine.
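
One possible sketch: count the characters in each string and check that the magazine covers the note’s demand, which runs in time linear in the combined lengths:

```python
from collections import Counter

def can_form(note, magazine):
    """True if every character of note appears in magazine often enough."""
    need = Counter(note)           # character counts demanded by the note
    have = Counter(magazine)       # character counts available
    return all(have[c] >= k for c, k in need.items())
```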

Second: Write a program that finds, in linear time, the item in a list that appears the most times consecutively.
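
A single pass that tracks the current run and the best run so far suffices; one way to write it:

```python
def most_consecutive(xs):
    """Return (item, run length) for the longest consecutive run in xs."""
    best_item, best_len = None, 0
    run_item, run_len = None, 0
    for x in xs:
        if x == run_item:
            run_len += 1           # extend the current run
        else:
            run_item, run_len = x, 1   # start a new run
        if run_len > best_len:
            best_item, best_len = run_item, run_len
    return best_item, best_len
```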

Third: Given two finite streams of integers, each too large to fit in memory, write a program that finds the integers that appear in both streams; it must operate in time linear in the length of the longer stream.
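
For the third question, a common sketch hashes one stream and scans the other. This assumes the *distinct values* of one stream fit in memory even though the streams themselves do not; for truly enormous streams the same idea would be applied to hash-partitioned chunks on disk:

```python
def common_integers(stream_a, stream_b):
    """Yield integers that appear in both streams, in stream_b order."""
    seen = set(stream_a)          # one linear pass over stream_a
    reported = set()              # avoid reporting duplicates
    for x in stream_b:            # one linear pass over stream_b
        if x in seen and x not in reported:
            reported.add(x)
            yield x
```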

Your task is to write the three programs described above. When you are finished, you are welcome to read or run a suggested solution, or to post your own solution or discuss the exercise in the comments below.

## Missing Items

### November 22, 2016

We have today a simple exercise; we’ve seen variants of it previously.

Given two lists, find all the items in the first list that are not present in the second list. For instance, if (5 15 2 20 30 40 8 1) is the first list and (2 20 15 30 1 40 0 8) is the second list, the item 5 is present in the first list but not in the second list.
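
One simple sketch puts the second list in a set, then scans the first list, preserving its order:

```python
def missing_items(first, second):
    """Return the items of first that are not present in second."""
    present = set(second)                       # O(1) membership tests
    return [x for x in first if x not in present]
```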

Your task is to write a program to find missing items. When you are finished, you are welcome to read or run a suggested solution, or to post your own solution or discuss the exercise in the comments below.

## RIP Leibniz

### November 18, 2016

Gottfried Wilhelm Leibniz was a German mathematician and philosopher, and a developer, independently of Isaac Newton, of calculus; it was he who invented the dy/dx and ∫ notation we still use today. He died three hundred years ago, on November 14, 1716, so today (a few days late, sorry) we have an exercise in his memory:

Write a program that computes the average number of comparisons required to determine if a random sequence is sorted. For instance, in the sequence 1 2 3 5 4 6, the first inversion appears between 5 and 4, so it takes four comparisons (1<2, 2<3, 3<5, 5<4) to determine that the sequence is not sorted.
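
A brute-force sketch, feasible only for small *n*, stops scanning at the first inversion and averages over all permutations; the averages should approach e − 1 ≈ 1.718 as *n* grows:

```python
import math
from fractions import Fraction
from itertools import permutations

def comparisons_to_decide(seq):
    """Count adjacent comparisons made before sortedness is decided:
    stop at the first inversion, or after n-1 comparisons if sorted."""
    count = 0
    for a, b in zip(seq, seq[1:]):
        count += 1
        if a > b:          # first inversion: the sequence is not sorted
            break
    return count

def average_comparisons(n):
    """Exact average over all n! permutations (brute force)."""
    total = sum(comparisons_to_decide(p) for p in permutations(range(n)))
    return Fraction(total, math.factorial(n))
```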

Your task is to write a program as described above. When you are finished, you are welcome to read or run a suggested solution, or to post your own solution or discuss the exercise in the comments below.

## Marzullo’s Algorithm

### November 15, 2016

This one is tricky:

Given a list of events with arrival and departure times, write a program that determines the time at which the greatest number of events occurred.

For instance, you may have ten employees who arrived at work and departed at the times shown below (for instance, employee 9 arrived at 12:00noon and departed at 5:00pm):

```
employee    1  2  3  4  5  6  7  8  9 10
           -- -- -- -- -- -- -- -- -- --
arrival    10 12 11 13 14 12  9 14 12 10
departure  15 14 17 15 15 16 13 15 17 18
```

Then the maximum employee count was at 2:00pm:

```
 9 | 7                   | 1
10 | 1 7 10              | 3
11 | 1 3 7 10            | 4
12 | 1 2 3 6 7 9 10      | 7
13 | 1 2 3 4 6 7 9 10    | 8
14 | 1 2 3 4 5 6 8 9 10  | 9
15 | 1 3 4 5 6 8 9 10    | 8
16 | 3 6 9 10            | 4
17 | 3 9 10              | 3
18 | 10                  | 1
```

There were 9 employees at work at time 14.
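
One way to compute this is a sweep line over the interval endpoints, treating each interval as inclusive of both its arrival and departure times. (This is a sketch of the counting idea, not necessarily Marzullo’s exact algorithm, which handles interval intersection more generally.)

```python
def busiest_interval(events):
    """Given inclusive integer (arrival, departure) pairs, return
    (start, end, count) of the block with the most simultaneous events."""
    deltas = {}
    for arr, dep in events:
        deltas[arr] = deltas.get(arr, 0) + 1          # present from arr ...
        deltas[dep + 1] = deltas.get(dep + 1, 0) - 1  # ... through dep
    times = sorted(t for t in deltas if deltas[t] != 0)
    count, best_count, best_start, best_end = 0, 0, None, None
    for i, t in enumerate(times):
        count += deltas[t]
        if count > best_count:
            # the new maximum block lasts until the count next changes
            best_count, best_start = count, t
            best_end = times[i + 1] - 1 if i + 1 < len(times) else t
    return best_start, best_end, best_count
```

Run against the employee table above, this reports a maximum of 9 employees during the block from time 14 to time 14.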

Your task is to write a program that determines the start and end times of the time block where the greatest number of events occurred. When you are finished, you are welcome to read or run a suggested solution, or to post your own solution or discuss the exercise in the comments below.

## Introspective Sort

### November 11, 2016

The sorting algorithm that we have been working up to in three previous exercises is introspective sort, or introsort, invented by David Musser in 1997 for the C++ Standard Library. Introsort is basically quicksort, with median-of-three partitioning and a switch to insertion sort when the partitions get small, but with a twist. The problem with quicksort is that some sequences have the property that most of the recursive calls don’t significantly reduce the size of the data to be sorted, causing a quadratic worst case. Introsort fixes that by switching to heapsort if the depth of recursion gets too large; since heapsort has guaranteed O(*n* log *n*) behavior, so does introsort. The changeover from quicksort to heapsort occurs after *k* × floor(log2(length(*A*))) recursive calls to quicksort, where *k* is a tuning parameter, frequently set to 2, that can be used to adjust the performance of the sorting algorithm.
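
A possible Python sketch follows; the cutoff of 16 and the depth limit 2 × floor(log2 n) are conventional choices rather than anything Musser mandates, and small partitions are left for one final insertion-sort pass (a common variant) rather than sorted immediately:

```python
import math

CUTOFF = 16   # partitions at or below this size are left for insertion sort

def introsort(a):
    """In-place introsort sketch: quicksort with median-of-three pivots,
    heapsort when recursion gets too deep, one final insertion sort pass."""
    if len(a) > 1:
        _introsort(a, 0, len(a) - 1, 2 * math.floor(math.log2(len(a))))
        _insertion_sort(a, 0, len(a) - 1)

def _introsort(a, lo, hi, depth):
    if hi - lo + 1 <= CUTOFF:
        return                      # small partition: handled at the end
    if depth == 0:
        _heapsort(a, lo, hi)        # too deep: guaranteed O(n log n)
        return
    p = _partition(a, lo, hi)
    _introsort(a, lo, p, depth - 1)
    _introsort(a, p + 1, hi, depth - 1)

def _partition(a, lo, hi):
    # Hoare-style partition around the median of first, middle, last
    pivot = sorted((a[lo], a[(lo + hi) // 2], a[hi]))[1]
    i, j = lo - 1, hi + 1
    while True:
        i += 1
        while a[i] < pivot:
            i += 1
        j -= 1
        while a[j] > pivot:
            j -= 1
        if i >= j:
            return j
        a[i], a[j] = a[j], a[i]

def _insertion_sort(a, lo, hi):
    for i in range(lo + 1, hi + 1):
        v, j = a[i], i - 1
        while j >= lo and a[j] > v:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = v

def _heapsort(a, lo, hi):
    # heapsort on the slice a[lo..hi] with a max-heap rooted at lo
    n = hi - lo + 1
    def sift(root, end):
        while 2 * root + 1 < end:
            child = 2 * root + 1
            if child + 1 < end and a[lo + child] < a[lo + child + 1]:
                child += 1
            if a[lo + root] < a[lo + child]:
                a[lo + root], a[lo + child] = a[lo + child], a[lo + root]
                root = child
            else:
                return
    for start in range(n // 2 - 1, -1, -1):
        sift(start, n)
    for end in range(n - 1, 0, -1):
        a[lo], a[lo + end] = a[lo + end], a[lo]
        sift(0, end)
```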

Your task is to implement introsort. When you are finished, you are welcome to read or run a suggested solution, or to post your own solution or discuss the exercise in the comments below.

## A Median-Of-Three Killer Sequence

### November 8, 2016

In two previous exercises we’ve been working toward a variant of quicksort that has guaranteed O(n log n) performance; there is no quadratic worst case. Before we do that, however, it is instructive to look at the case where our optimized median-of-three version of quicksort fails. Consider this sequence, due to David Musser:

1 11 3 13 5 15 7 17 9 19 2 4 6 8 10 12 14 16 18 20

At the first partitioning, the pivot element will be the median of 1, 2 and 20, which is 2, and the only two elements that change will be 2 and 11, with the partition point after the 2, indicated by the vertical bar:

1 2 | 3 13 5 15 7 17 9 19 11 4 6 8 10 12 14 16 18 20

At the next step, the pivot element will be the median of 3, 4 and 20, which is 4, and again the partition will advance only by two:

1 2 3 4 | 5 15 7 17 9 19 11 13 6 8 10 12 14 16 18 20

And so on. Each partition contributes the least possible amount toward the solution, and the time complexity becomes quadratic.
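
Guessing the general rule from the example above (it reproduces the sequence for n = 20, but the generalization is an inference, not taken from Musser’s paper): interleave the odd numbers 1, 3, …, k−1 with k+1, k+3, …, n−1, where k = n/2, then append the even numbers.

```python
def killer_sequence(n):
    """Median-of-three killer sequence for even n, generalizing the
    n = 20 example: pair each odd i < n/2 with i + n/2, then append
    the even numbers.  (Pattern inferred from the example.)"""
    assert n % 2 == 0
    k = n // 2
    seq = []
    for i in range(1, k, 2):         # i = 1, 3, ..., k-1
        seq.extend([i, i + k])
    seq.extend(range(2, n + 1, 2))   # 2, 4, ..., n
    return seq
```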

Your task is to write a program that creates a “killer sequence” for the median-of-three partition, then compare the time required to sort it to the time required to sort a random sequence. When you are finished, you are welcome to read or run a suggested solution, or to post your own solution or discuss the exercise in the comments below.

## Two Stacks

### November 4, 2016

Different programs manage memory in different ways. One common pattern uses two stacks of variable sizes; memory is arranged so that one stack starts at the bottom of memory and grows up, the other stack starts at the top of memory and grows down.
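
A minimal sketch of the idea: the two stacks share one array and can only overflow when their tops meet, so neither stack needs a fixed share of the memory.

```python
class TwoStacks:
    """Two stacks in one fixed-size array: stack 0 grows up from index 0,
    stack 1 grows down from the top; overflow only when the tops meet."""

    def __init__(self, size):
        self.data = [None] * size
        self.top0 = -1            # last used slot of the bottom stack
        self.top1 = size          # last used slot of the top stack

    def push(self, which, value):
        if self.top0 + 1 == self.top1:
            raise OverflowError("out of memory")
        if which == 0:
            self.top0 += 1
            self.data[self.top0] = value
        else:
            self.top1 -= 1
            self.data[self.top1] = value

    def pop(self, which):
        if which == 0:
            if self.top0 < 0:
                raise IndexError("stack 0 empty")
            value = self.data[self.top0]
            self.top0 -= 1
        else:
            if self.top1 >= len(self.data):
                raise IndexError("stack 1 empty")
            value = self.data[self.top1]
            self.top1 += 1
        return value
```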

Your task is to write a program that simulates memory management in an array, using two stacks. When you are finished, you are welcome to read or run a suggested solution, or to post your own solution or discuss the exercise in the comments below.

## Hoare’s Partition, Improved

### November 1, 2016

In today’s exercise we will make several improvements to the quicksort program of the previous exercise, working in small steps and measuring our progress throughout. We make the following improvements:

- Move the swap and comparison inline.
- Early cutoff and switch to insertion sort.
- Improved pivot with median-of-three.

All three improvements are well-known. The first improvement eliminates unneeded function-calling overhead. The second improvement reduces the number of recursive calls on very small sub-arrays, replacing them with a sorting algorithm that is well-adapted to nearly-sorted arrays. The third improvement improves the likelihood of a good partition and eliminates some cases where the algorithm performs poorly.

Your task is to write an improved quicksort and measure your improvement. When you are finished, you are welcome to read or run a suggested solution, or to post your own solution or discuss the exercise in the comments below.

## Hoare’s Partition

### October 28, 2016

I’ve recently rekindled my interest in sorting algorithms (it’s a fascination that never really goes away), and I’ve been looking at quick sort. The most common academic version of quick sort uses a partition due to Nico Lomuto, which uses a single pointer that scans through the array and concludes with the partitioning element in the middle, all the less-than items to its left, and all the greater-than items to its right; quick sort then recurs on the two halves, and the partitioning element is never part of any subsequent recursive call.

The original partitioning algorithm by C. A. R. Hoare works differently, with two pointers instead of one. One pointer starts at the left end of the array and advances until it finds an item out of place; the other starts at the right end of the array and marches leftward until it finds an item out of place. Then the two items are swapped, each pointer takes one step toward the other, and the process continues until the two pointers cross. Hoare’s partition concludes by returning the right-side pointer; all items to the left of the pointer, plus the item at the pointer itself, are no larger than all items to the right of the pointer. The partitioning element is somewhere in the left-hand partition, but not necessarily at its end, so the recursive calls in the quick sort must change: the left-hand call includes the partitioning element.
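
A sketch of this in Python, following the common textbook formulation with the first element as the partitioning value (details of Hoare’s original 1961 presentation differ slightly):

```python
def hoare_partition(a, lo, hi):
    """Partition a[lo..hi] around pivot value a[lo]; return j such that
    every item in a[lo..j] is <= every item in a[j+1..hi]."""
    pivot = a[lo]
    i, j = lo - 1, hi + 1
    while True:
        i += 1
        while a[i] < pivot:       # scan right for an out-of-place item
            i += 1
        j -= 1
        while a[j] > pivot:       # scan left for an out-of-place item
            j -= 1
        if i >= j:                # pointers crossed: partition complete
            return j
        a[i], a[j] = a[j], a[i]

def quicksort(a, lo=0, hi=None):
    """Quick sort with Hoare's partition; note that the left recursive
    call includes index j, since the pivot may lie anywhere in a[lo..j]."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        j = hoare_partition(a, lo, hi)
        quicksort(a, lo, j)
        quicksort(a, j + 1, hi)
```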

Your task is to implement Hoare’s partition and use it to write a quick sort program. When you are finished, you are welcome to read or run a suggested solution, or to post your own solution or discuss the exercise in the comments below.