the corner office

a blog, by Colin Pretorius

Binary search busticated for big-ass arrays

My RSS reader went bing! bing! this weekend as the news broke that there's a decades-old bug in most binary search implementations. Joshua Bloch's blog post about it sparked what looked like near-hysteria in some quarters: wailing, gnashing of teeth and rending of clothes as the IT industry was forced to stop and take stock of the fact that integer operations are susceptible to overflow.

Shock horror.

Well, technically, it is a bug for large values of n, and yes, as time passes, it's more likely that this 'ceiling' will be encountered. How often, though? As this TSS article points out, the Java API's bug is only triggered by an array of over a billion elements (1,073,741,825, or 2^30 + 1, to be exact), and even then only in limited circumstances. When last did you need to search a billion-element array?
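For the record, the culprit is the midpoint calculation inside the search loop. Here's a minimal sketch (class and variable names are mine, not the JDK's) of how the classic line goes wrong once low + high exceeds Integer.MAX_VALUE, next to the two overflow-proof alternatives Bloch gives:

    // Two perfectly legal array indices whose sum exceeds
    // Integer.MAX_VALUE (2,147,483,647).
    public class MidpointOverflow {
        public static void main(String[] args) {
            int low = 1073741824;                 // 2^30
            int high = 1073741825;                // 2^30 + 1

            int broken = (low + high) / 2;        // classic midpoint: the sum wraps negative
            int fixed = low + (high - low) / 2;   // Bloch's fix: the difference can't overflow
            int shifted = (low + high) >>> 1;     // or: unsigned shift of the wrapped sum

            System.out.println(broken);           // -1073741823, a negative "index"
            System.out.println(fixed);            // 1073741824
            System.out.println(shifted);          // 1073741824
        }
    }

The unsigned shift works because low + high can never exceed 2^32 - 2, so the wrapped sum, read back as an unsigned 32-bit value, is still the true sum.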

As another poster points out, an int array in Java (a fixed 4 bytes per element, whatever the platform) big enough to trigger the bug chomps up over 4 gigabytes of memory. Statically initialise that puppy in C or C++ and your compiler would probably spit out sparks and die.
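To put numbers to it: 1,073,741,825 ints at four bytes apiece comes to 4,294,967,300 bytes, a shade over 4 GB before the JVM adds its own array overhead.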

In other words, it's all a bit of a storm in a teacup. No doubt the overflow-proof implementations should be used, but it's not like the world would fall apart if people didn't bother. I think we can safely predict, though, that no algorithms textbook will ever print the 'overflowing' implementation again.

{2006.06.07 17:48}
