Alan Turing is considered the father of computing (at least by those who don’t believe in Mayan computers, secret alien infiltrations, or Atlantis). He would have turned 100 this year.
Computers are everywhere nowadays, and pretty much anyone can learn to use one very quickly. But you have to remember that up until the fifties, people were paid to do calculations by hand. In the case of all the complicated operations for astronomical charts and the like, the post of calculator was held in high regard, and the fastest (and most accurate) one could name his price.
Machines have been around for a long time, but there was no adaptability to them: the intelligence was in the hands of the user. Complicated clockwork machinery could perform very delicate tasks, but not by itself. And repurposing one of these machines to do something it wasn’t built for was close to impossible.
Basically that’s what Turing pioneered: a machine that could be repurposed (reprogrammed) all the time, and could modify its own behavior (program) based on its inputs.
Before Turing, what you had was an input -> tool -> output model for pretty much everything.
After him (and we can’t help but smile when seeing how pervasive these things are today — even my TV!), the model switched to input + memory -> tool -> output + modified memory (+ modified tool).
That means two consecutive uses of the same tool, with the exact same inputs, might yield different results. Of course, it’s better if the modification is intentional and not the result of a bug, but he opened up a whole new field of possibilities.
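To make the contrast concrete, here is a minimal sketch in Python (the names `pure_tool` and `StatefulTool` are mine, purely for illustration): the first function is the old input -> tool -> output model, while the class carries memory that both shapes its output and gets modified by each use.

```python
# Old model: input -> tool -> output.
# The same input always yields the same result.
def pure_tool(x):
    return x * 2

# Turing's model: input + memory -> tool -> output + modified memory.
class StatefulTool:
    def __init__(self):
        self.memory = 0  # state persisted between uses

    def run(self, x):
        self.memory += x             # the input modifies the memory...
        return x * 2 + self.memory   # ...and the memory shapes the output

tool = StatefulTool()
print(pure_tool(3), pure_tool(3))  # 6 6  -- identical every time
print(tool.run(3), tool.run(3))    # 9 12 -- same input, different results
```

The second `print` is the whole point: calling the tool changed the tool, so the exact same input no longer guarantees the same output.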
So happy birthday Alan, and thanks for giving an outlet to my skills. If you hadn’t been around, there would have been precious few other ways for me to whore out my faulty brain!