In August 1977, Rivest, Shamir and Adleman issued
a ciphertext challenge worth one hundred dollars
to Scientific American readers when Martin
Gardner described their revolutionary RSA cryptographic
system in his monthly “Mathematical
Games” column. This sounded very safe because it
was estimated at the time that the fastest existing
computer using the most efficient known algorithm
could not earn the award until it had run without
interruption for millions of times the age of the Universe.
This particular challenge involved the factorization
of a 129-digit number r, which appeared
to be so out of reach at the time that Martin
Gardner reported: “Rivest and his associates have
no proof that at some future time no one will discover
a fast algorithm for factoring composites as
large as r [...]. They consider [the] possibility extremely
remote.” Nevertheless, the $100 reward was
cashed last year—and donated to the Free Software
Foundation—after a mere eight months of intensive
computation led by Derek Atkins and Arjen
Lenstra. What happened?
The increase in raw computing power during those
years cannot be discounted, nor can the fact that
hundreds of workstations around the world spent
all their otherwise idle cycles on the task for the
better part of one year. Indeed, the several thousand
MIPS-years that were spent on the calculation
might not have been available to poor academics
back in 1977. But in the Preface of my new textbook
Fundamentals of Algorithmics, soon to be released
by Prentice-Hall with Paul Bratley as coauthor, I
am quick to point out that far more significant was
the discovery of more sophisticated factorization
algorithms. When I put on my hat as teacher of
algorithmics, I like to use this example to illustrate
that more efficient algorithms can produce much
more dramatic results than better hardware. To
make life more interesting, however, I am wearing
quite a different hat as I write this essay!
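To put rough numbers on that claim, here is a minimal back-of-the-envelope sketch. It assumes the textbook heuristic running time exp(sqrt(ln n · ln ln n)) for the quadratic sieve, the method used to factor RSA-129, compares it with naive trial division, and ignores constant factors and the o(1) term in the exponent; it is only an illustration, not an analysis of the actual computation.

import math

# Back-of-the-envelope comparison for a 129-digit number n:
# trial division needs on the order of sqrt(n) operations, while the
# quadratic sieve runs heuristically in about exp(sqrt(ln n * ln ln n))
# steps.  Constant factors and the o(1) term are deliberately ignored.

ln_n = 129 * math.log(10)  # natural logarithm of a 129-digit number

# Express both estimates as powers of ten to avoid overflow.
log10_trial_division = (ln_n / 2) / math.log(10)
log10_quadratic_sieve = math.sqrt(ln_n * math.log(ln_n)) / math.log(10)

print(f"trial division : roughly 10^{log10_trial_division:.0f} operations")
print(f"quadratic sieve: roughly 10^{log10_quadratic_sieve:.0f} operations")

The gap between the two estimates is dozens of orders of magnitude, far more than any plausible improvement in hardware could ever bridge.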