I had cause today at lunch to unleash random bletherings upon a colleague of mine regarding some subjects I had not considered since university. I realised soon after that I could probably write up my rants in a series of articles.
This is the first of the two, and it concerns my thoughts on the true ‘heroes’ of computing.
My colleague and I were talking about how computing and programming are often overlooked as scientific disciplines. A career physicist will be well aware of the key contributory figures in their field; names such as Einstein, Newton, and Hawking are practically ubiquitous, and are easily recognisable far outside the community. In chemistry, the names Curie, Nobel, and Rutherford stand out, whilst mathematicians can look towards Fermat, Archimedes, Newton (again), and, more recently, Andrew Wiles, famed for proving Fermat’s “Last Theorem”.
In computing, however, we appear to have few such “household” names recognisable outside the profession. Charles Babbage and Alan Turing are probably amongst the most mentioned, but it is likely they only remain in the public consciousness due to the ignominious circumstances of their ends.
When Babbage died in 1871 he was virtually unknown; The Times newspaper ridiculed him, and the Royal Society did not publish an obituary. In 1878, a committee of the British Association for the Advancement of Science recommended against constructing the Analytical Engine. It was not until 1991 that the London Science Museum built a fully working Difference Engine No. 2 from Babbage’s original designs.
It is impossible to sum up Alan Turing in a single paragraph. Regarded by many as the father of computing, he formalised the concepts of ‘algorithm’ and ‘computation’. During the Second World War he played a pivotal role in the breaking of German cyphers, most famously Enigma. He also devised the concept of the ‘Turing Test’, a test of a machine’s ability to exhibit intelligent behaviour. This idea is now prominent in internet CAPTCHAs – the funny obscured text you are often asked to type into a form in order to confirm you are human!
In 1952, Turing was arrested and charged with gross indecency after admitting to a homosexual relationship. Offered a choice between imprisonment and hormonal treatment – in effect, a punishment for being homosexual – he accepted chemical castration via oestrogen injections. He was found dead two years later, having ingested a lethal dose of cyanide. An inquest determined that he had committed suicide, though this remains the subject of much speculation.
So what of the practitioners of our scientific discipline? If even our best-known figures are remembered as much for their tragic ends as for their work, it is perhaps no surprise that a definitive list of recognised industry pioneers is not mainstream – not even within the industry itself.
I conducted a short poll around the office on two other names that I felt should be instantly recognisable: Donald Knuth and Ada Lovelace. The selection was based simply on whatever I was reading at the time, and implies no judgement of seniority or significance over any other names I could have chosen.
For those who are unfamiliar, Donald Knuth (pronounced Ka-NOOTH, according to his website) is the author of a multi-volume series of books entitled The Art of Computer Programming. American Scientist included this title in its list of the top 12 physical-science monographs of the century, in the company of works by Russell and Whitehead, Einstein, Dirac, Feynman, and von Neumann.
Ada Lovelace is credited by many with writing the very first computer algorithm between 1842 and 1843, designed as a program to run on Charles Babbage’s Analytical Engine. For this, she is often portrayed as the world’s first programmer! She also met a tragic end, dying at the age of 36 from uterine cancer and the bloodletting prescribed by her physicians. Her name lives on today in the Ada programming language.
The result of my totally unscientific straw poll was that, while everyone was familiar with Babbage and Turing, no one I asked had heard of either Knuth or Lovelace (apart from Malcolm and Ron, who I fully expected would have).
There are others I could have mentioned, from John von Neumann, who devised the von Neumann architecture that underpins most modern computers, to Edsger Dijkstra, who won the Turing Award (the ‘Nobel Prize of Computing’) in 1972 for his contributions to developing programming languages. Even Noam Chomsky, one of the fathers of modern linguistics, can be cited: his Chomsky hierarchy of formal languages describes a fundamental part of any modern language compiler.
I think we’re tragically unaware of our history, and I’m often really disappointed to see that people who are now practicing this craft having no intellectual curiosity about where this stuff came from…
Continuing on the same theme, in the same book, Donald Knuth was asked whether he felt that programmers and computer scientists were aware enough of the [pretty short] history of the field. He responded:
I was reading in American Scientist last week about people who had rediscovered an algorithm that Boyer and Moore had discovered in 1980. It happens all the time that people don’t realize the glorious history that we have. The idea that people knew a thing or two in the ’70s is strange to a lot of young programmers.
It’s inevitable that in such a complicated field that people will be missing stuff. Hopefully with things like Wikipedia, achievements don’t get forgotten the way they were before.
One could argue that the discipline is simply too young to have afforded computing’s pioneering figureheads their legendary status. Archimedes died in 212 BC, more than two millennia before Babbage, so we may have some years to go. It could also be that the sheer speed of the computer industry’s advancement, and the inherent obsolescence of so many of its transient artefacts, rob today’s professionals of their immortality.
So which of today’s names will be memorable to the masses in 100, or even 2,000, years’ time? Tim Berners-Lee, inventor of the World Wide Web, will surely get a mention, as may Steve Jobs, co-founder of Apple. And regardless of any sentiments you may hold about him, I also expect history will be kind to Bill Gates.
Again, I’m plucking random names off the top of my head, but would love to hear your opinions on others.
So, who exactly will be our modern-day heroes of computing…tomorrow?