Note: I had to split my "ramblings" into multiple parts due to board limitations.

Something I wrote late at night, so bear with me if it rambles or becomes incoherent.
Quoting Mr. Peabody: “Sherman, get the wayback machine!”
Back in the 1960s I watched a show called “Star Trek”. One of the basic themes that ran through many of those shows was the impact of computers on both the starship and the worlds/societies it visited. From that moment on, I became fascinated with computers and machine intelligence. Moore’s law had yet to be defined and the microprocessor had not yet been invented. Heck, not too long earlier, in 1947, the very first transistor was invented, and shortly thereafter, on April 25, 1961, the first patent was granted for the integrated circuit. And as they like to say: “the rest is history”.
Anyhow, when Star Trek was being aired for the first time, RTL, DTL, and TTL were still king of the hill. Many of the computers of that time were large mainframes, built from discrete circuitry, and used electromechanical interfaces such as teletypes. Publications such as Scientific American often carried articles about learning machines; however, they gave only the first glimmer of where computer technology could (and did) lead us. NASA’s Mercury, Gemini, and Apollo were the ultimate in spacecraft design at the time, and the future of manned spaceflight seemed to have no end in sight.
Then the economic “crash” of the late 70s reared its ugly head. Inflation reigned supreme, a hostage crisis became front-page news, unemployment loomed over the average American’s head, and the space program pretty much dragged to a halt. Yet even amongst these nation-shaking issues, work on those tiny chips continued. Large-scale TTL and other early technologies were slowly being superseded by CMOS, allowing Moore’s law to continue unabated. These days, other than some small-scale integration, CMOS has pretty much replaced those early silicon designs.
So what is CMOS, I hear you cry?
"CMOS ("see-moss"), which stands for complementary metal-oxide semiconductor, is a major class of integrated circuits. CMOS chips include microprocessors, microcontrollers, static RAM, and other digital logic circuits. The central characteristic of the technology is that it only uses significant power when its transistors are switching between on and off states. Consequently, CMOS devices use little power and do not produce as much heat as other forms of logic. CMOS also allows a high density of logic functions on a chip."
"The phrase "metal-oxide-semiconductor" is a reference to the nature of the fabrication process originally used to build CMOS chips. That process created field effect transistors (FETs) having a metal gate electrode placed on top of an oxide insulator, which in turn is on top of a semiconductor material. Instead of metal, today the gate electrodes are almost always made from a different material, polysilicon, but the name CMOS nevertheless continues to be used for the modern descendants of the original process.”
Let me digress a bit further into the culture of those times. Overpopulation was a great worry, the Vietnam War provoked a huge backlash from the counterculture types, and racial clashes were commonplace. I still remember walking into bookstores and seeing the posters of the time showing a ruined society with waves of people everywhere. Bike gangs, drugs and hippies were all the rage. During this tumultuous time, progress continued on these tiny circuits which would later evolve into the microprocessor. The record player/radio, television, and the telephone were about all that the average house contained, and they certainly did not reflect the great technological leaps that were happening at the time. Banking was the closest that folks got to real-world computing, and the news often had horror stories about computer banking errors. This certainly did not instill public trust in this budding technology. I often heard “what good are they” or “newfangled” during that time frame. Oh, most folks knew NASA needed computers; however, to the general masses they were more of a bother than a boon.
This mindset culminated towards the end of the 60s and into the first part of the 70s. Many books were written about the “information age”, e.g., “Future Shock” by Alvin Toffler. The supposition was that technology was going to “explode” at such a pace that the average person would be lost in this sea of technology and end up rejecting and/or being buried by it. I personally did not adhere to that mindset, and I so wanted my own computer. Unfortunately, computers were still the realm of either science fiction or large companies. The average individual could not own their very own computer. I would lament this fact to my dad and others and would receive this reply more often than not: “what in the heck would you do with it?”
To this day I remember a neighbor taking me to a Control Data Corporation magnetic core memory plant in or about 1971. What a treat! I stood in awe looking at all of the machines, terminals, teletypes, and computers in this huge building. I was in heaven. This was the place where the memory arrays were built for those huge mainframes. I was allowed into the room with the sea of workstations where women strung tiny ferrite beads on strands of wire under magnifying glasses. If you have never seen a core memory plane in real life, you are missing out on a real work of art. Each core plane was strung with gold, red and green wires not much bigger than sewing thread. The cores themselves were so tiny you could barely see the hole in the center of each. All these colored wires caused the core plane to glitter with tiny rainbows of light under the fluorescent lighting. I was in awe. I did make one BIG faux pas. I brushed a fingertip over the top plane of one of the core stacks to feel it, not realizing I was causing damage that would take almost two days of work to repair. I still feel bad about that to this day.
At the same time, there was another revolution going on that many have not, even yet, extrapolated to its logical conclusion. A scientist by the name of Dr. Gordon Moore postulated (in a paper in the April 19, 1965 edition of Electronics magazine titled “Cramming more components onto integrated circuits”) that industry would be able to double the number of components on a chip, while cutting the cost per component in half, roughly every year. Quoting from that paper:
“The complexity for minimum component costs has increased at a rate of roughly a factor of two per year. Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years.” (Moore 1965)
This was just a simple log-linear relationship between circuit complexity and time. In 1975 (ten years later), Dr. Moore delivered a paper at the IEEE International Electron Devices Meeting where he showed a plot of semiconductor devices that remarkably followed his prediction very nicely. There were some minor revisions to the curve; most notably, the predicted doubling period stretched to roughly every two years (the figure often quoted is 18 months) as opposed to one year. This curve came to be known as Moore’s Law and even got its own equation.
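For those who like to see the math, the popular form of the curve is N(t) = N0 · 2^((t − t0)/T), where N0 is the transistor count at the starting year t0 and T is the doubling period. Here is a quick sketch of it in Python; the starting point (the 4004’s 2,300 transistors in 1971) and the two-year doubling period are illustrative assumptions on my part, not Dr. Moore’s exact figures:

# Popular form of Moore's Law: N(t) = N0 * 2 ** ((t - t0) / T)
# N0 = transistor count at the starting year t0, T = doubling period in years.
# The starting point (Intel 4004, 2,300 transistors, 1971) and the two-year
# doubling period are illustrative assumptions.

def moores_law(year, n0=2300, t0=1971, doubling_years=2.0):
    """Projected transistors per chip for a given year."""
    return n0 * 2 ** ((year - t0) / doubling_years)

for year in (1971, 1981, 1991, 2001):
    print(year, f"{moores_law(year):,.0f} transistors")

Even this crude version lands in the right ballpark for the chips that actually shipped decades later, which is exactly why the plot in that 1975 talk looked so remarkable.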
In a physics class I took, I heard a professor remark, “if airplanes had progressed with the same rapidity and complexity as the microprocessor, we would have landed on the Moon ten years after the Wright Brothers flew at Kitty Hawk”. This may have been a bit overdramatic; however, it drives home the point of just how complex and capable these “tiny brains” called microprocessors are becoming.
Enter the microprocessor. Up until this point, the central processing unit of a computer had been made up of a number of different circuits, starting with tubes and relays in the very early machines, then migrating to transistors and finally to discrete integrated circuits.
In 1971, working at Intel, Dr. Federico Faggin designed the first microprocessor, called the 4004. Interesting anecdote: Dr. Faggin was in his lab all alone late one evening in January of 1971. He had received his first 4004 Central Processing Unit (CPU) wafer late that day and wanted to test it. Working late into the night testing this processor, I can imagine the whoop as he realized it worked! His wife, Elvia, was the first person to share his triumph. If you ever peel the cover off of a 4004 CPU and look at it under a microscope (I would not recommend it; they are becoming collector pieces commanding big bucks on eBay), you will notice, etched alongside the circuit, the initials FF for Federico Faggin.
Note: a circuit wafer is a round slice of silicon with a group of chips etched onto the surface, which are then cut up and packaged into the individual chips we see in our computers.
This little chip was a four-bit processor that contained 2,300 transistors and was designed specifically as a calculator chip. This four-bit “brain” on a chip was rapidly superseded by eight-bit processors, which became the mainstay of processor technology throughout the rest of the 70s. However, over the following decades 8 bits has expanded to 64, and some specialized designs now move data in chunks as wide as 256 bits.
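To put those bit widths into perspective, the largest unsigned number a word can hold grows exponentially with its width; this little sketch follows straight from the arithmetic, with no assumptions beyond the widths themselves:

# Largest unsigned value representable in an n-bit word: 2**n - 1.
# A 4-bit word (like the 4004's) counts only to 15; a 64-bit word
# counts to over 18 quintillion.
for bits in (4, 8, 16, 32, 64):
    print(f"{bits:>2}-bit word: 0 to {2 ** bits - 1:,}")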