This month marks 40 years since I first started writing code, and a time for reflecting on how much has changed since 1972. Hardware capabilities have grown by many orders of magnitude [see table below], and software capabilities have grown to match. Stacks of green-bar paper have given way to immersive experiences that can access unfathomable amounts of data. With very few exceptions, carefully optimized assembler coding has been replaced by programming languages (and meta-languages) based on much higher levels of abstraction.
Yet, despite all of this growth, the fundamentals of a computer program at the lowest levels have not changed significantly. Data is still loaded from (relatively) slow persistent media into memory, and stored back again. Values are transferred between memory and the core of the CPU, and processed with various bit-level operations, conditional checks (at the binary level), and arithmetic calculations. The real difference is that the human programmer is increasingly isolated from having to worry about such things.
Over the past few years, I have been performing an informal experiment: looking for situations where this low-level knowledge is applicable, evaluating developers’ ability to think at such a level, and observing the resulting outcomes. While the majority of the cases studied would not have been significantly impacted by applying this low-level knowledge, there were a significant number of cases where the final result was directly affected.
In some situations, the need is clear. This was the case in my work on projects such as Otari’s Advanta Audio Console and NanoEtch’s Machine Control System; both systems required “true realtime” processing of data. But in many more situations the need was more subtle. As an example, one major financial client in 2010 had developed a set of .NET libraries that they believed were highly optimized for speed. The .NET implementation was a derivative of work they had done for their Java server applications. However, upon close examination (and direct measurement), the approach was actually slower than “simple” implementations, due to the way .NET works under the covers and the impact of multi-level caches in a multi-core environment.
I remain confident that understanding what happens “under the covers”, all the way down to the hardware level, is of significant benefit to any developer. I am disappointed when I hear of university degrees [in computer-related fields] that contain little, if any, substantive material on these topics; sadly, that seems to be the normal condition these days.
If you have stories where fundamental or historical knowledge has helped you achieve your goal, I would like to hear from you.
Computer Hardware 1972-2012
From the Beginning…
My journey started when I (along with three other students) outpaced the self-paced math program by completing three years of study in just one year. When we returned to school the following year, the district did not have an appropriate course, and decided to assign us to a math teacher who was writing software for the brand-new DEC PDP-8. The program being developed was an automated test recorder [using #2 pencil mark-sense forms] and dynamic class scheduler – all in PAL-8 assembler. Every day was filled with wonder and excitement as we discovered what the machine could accomplish.
*0200           / code starts at 0200 (octal)
Main, cla cll   / clear AC and Link
      tad a     / load a into AC
      tad b     / add b to AC
      dca c     / deposit result at c (clears AC)
      hlt       / halt
      jmp Main  / goto Main on continue
*0300           / data starts at 0300 (octal)
a,    0002      / first operand (illustrative value)
b,    0003      / second operand (illustrative value)
c,    0000      / result stored here
[Images: PAL-8 source listing, ASR 33 Teletype, PDP-8]
Soon thereafter, the microprocessor began to be adopted, and it became possible to have a computer that did not fill a small room. Although the Altair (and the IMSAI clone) were the leaders [see: http://www.i-programmer.info/history/9-machines/207-altair.html], the price tag put them beyond the reach of a student such as myself. When Popular Electronics published instructions in 1976 for building a computer based on the RCA 1802 chip, known as the COSMAC ELF, I was finally able to have my own computer.
By early 1978, Radio Shack had introduced the TRS-80 Model I, and together with Edward Baer I had my first consulting company [Software Design Services] up and running.
Around the same time, I graduated high school and started a summer job as an electronics technician. In an amazing coincidence, the company had been using a PDP-8 for automated testing, and its “guru” (Bob Damus) had just left. What was to have been just a summer position became the first 15 years of my career. The PDP-8 was replaced with the PDP-11 (usually an 11/23) and then augmented with VAX computers (initially a 750, and eventually a MicroVAX). As the IBM PC (and its clones) grew in power, they replaced dumb terminals on the desktop and in the lab, and the minicomputers saw less and less use.
In 1984, I was a co-founder of Dynamic Concepts as a part-time venture. Our initial clients included Hearst Business Publishing and Philips Norelco. Unlike at my regular job, where the “users” were highly technical engineers in their own right, the focus was now on business people who had little, if any, understanding of how computers worked.
In 1992, the defense industry had taken a major hit, and it was time to transform Dynamic Concepts from a part-time venture into my primary role. Our initial client was CMP Publications, and our task was to create their first electronic publication, NetSource. While quite common a few years later, in 1992 a CD-ROM containing full-text searchable content from a dozen trade publications, along with an easily queried database of product information, was quite extraordinary.
It has now been a little over 20 years since I took the “leap of faith” into running my own company, and it has been an amazing journey.