Originally Posted by 046
I google image searched byte magazine and saw some really old computers from the past.
Boy are they bulky.
You do not know the meaning of bulky.
I got my start in 1977 in Air Force tech school as a 305x4, an "Electronic Computer Systems Repairman". We trained on the BUIC (Back-Up Interceptor Control). The BUIC was a late-1960s computer built with transistors (AKA "discrete components", as opposed to integrated circuits). The CPU was in a cabinet the size of a rather large refrigerator, packed solid (a decade later, I worked a short while on a computer system whose cabinets were half that size and almost completely empty, because everything was on a single large circuit board). The memory cabinet was the same size, as was the I/O cabinet. All three were duplicated for redundancy (if one failed, its backup would switch in immediately). The magnetic drum mass memories were a bit smaller, as were the tape transports. The entire system filled a space about twice or thrice the size of a living room.
The BUIC was supposed to first back up and then replace the Semi-Automatic Ground Environment (SAGE) computer, which handled the bulk of the US Air Force's real-time radar data, but the BUIC couldn't do the job, so in 1977 the last system was operating only to provide training for apprentice technicians. The SAGE had been built in the 1950s, so instead of transistors it had vacuum tubes. It was housed in a concrete block of a building that took up about half a city block of ground (OK, measuring the building at my old base, it was about 155 ft by 155 ft) and was five stories high: the top two floors were administrative offices and radar scope operations, one floor was maintenance, and two whole floors were computer. The building even had its own air-conditioning cooling tower (OK, not quite a huge cooling tower, but A/C was still vitally important). We got two conflicting stories about what would happen if the air conditioning ever failed in that building: either everybody inside would be dead within 20 minutes, or within 20 minutes the racks would start to melt down.
I think the first one is more likely, but the bottom line is still that that computer kicked out some very serious heat.
For fun and for your own edification, run Google Earth and go to Grand Forks AFB, ND, at 47°56'48.93" N and 97°22'56.66" W, at the corner of Steen Blvd and G St. Now go back to 2002 Oct 16. There you will see the old SAGE building, which in my time there (1977 to 1982) had been converted to the Strategic Missile Wing building. A year later, the building had been demolished.
So, you wanna talk about "bulky computers"?
On YouTube, search for "Triumph of the Nerds: Accidental Empires" (or portions thereof). That was a PBS show from a couple decades ago that followed the creation of Apple and Microsoft and the rest of the PC industry. On YouTube there's also "Pirates of Silicon Valley", a TV movie dramatizing those same events. Jobs and Wozniak got started working through and demonstrating to a "home-brew" club (the Homebrew Computer Club), which in the 1970s was the state of microcomputers. The Altair 8800 was the first computer kit you could buy, but nobody had any idea what to make it do. Through the home-brew club demonstrations, Jobs and Wozniak got their first prototype working, and from that were able to get financing to start production.
Microsoft was at first solely a language company that supplied BASIC interpreters for the various microcomputers (e.g., Radio Shack's TRS-80, based on the Z80, which everybody referred to as the "Trash-80"). The king of microcomputer operating systems was Digital Research, the creator of CP/M. Circa 1980, when IBM was trying to get into the PC market, they approached Digital Research for the operating system, but with no success (the standard story is that Gary Kildall, the boss of DR, was too busy flying his plane to talk to them, but that's not supposed to be true). So IBM turned back to Bill Gates, who was providing BASIC for the IBM PC, and asked him if he could do the operating system, and he said he could. Then he quickly bought QDOS ("Quick and Dirty Operating System") from Seattle Computer Products and massaged it into PC-DOS v1. There's a scene in "Pirates of Silicon Valley" in which Bill Gates is negotiating with IBM over the rights to the operating system. IBM, ever stuck in the "big iron" paradigm in which the hardware is everything while software is just a freebie you throw into the deal, allowed Microsoft to retain the rights to DOS. And now you know the rest of the story.
(There was a radio broadcaster, Paul Harvey, who would tell an inspirational story from the beginning, and only at the end reveal the name of the famous person involved: "and now you know the rest of the story.")
BYTE magazine started out in the mid-1970s as a home-brew magazine, and it grew and kept pace with the growth of the PC industry well into the 1980s, after which it started to lose steam, especially as the industry matured and commoditized. It was part of what we grew up with in the PC industry. Their cover artwork was (in the beginning, at least) always very clever and inspirational.
I remember reading through older issues in the university library. Back in 1975 or so, one of the big topics was the ever-increasing density of semiconductor memory. Memory used to be one of the most expensive parts of the hardware. Ferrite-core memory (ultra-thin wires manually threaded through extremely tiny iron rings) was the norm (after mercury-pool delay-line memories) and of course was extremely expensive. I think that the BUIC had 2 kB of ferrite-core memory (or at the very most 16 kB). Why does C build projects the way that it does? Because it was designed when memory was very scarce and very expensive. In 1977, I window-shopped in a local store a semiconductor memory upgrade kit for an Apple II; it cost about $280. Circa 1990, I heard a quote of one meg of RAM for one dollar.
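As a rough sketch of what that price drop means (assuming the kit was a 16 kB upgrade, which was the common Apple II kit size of that era; only the $280 price is given above):

```python
# Rough cost-per-megabyte comparison. The 16 kB kit size is an
# assumption -- only the $280 price is from memory above.
kit_price_1977 = 280.0      # USD, window-shopped in 1977
kit_bytes = 16 * 1024       # assumed 16 kB upgrade kit
per_mb_1990 = 1.0           # USD per megabyte, the circa-1990 quote

per_mb_1977 = kit_price_1977 / kit_bytes * (1024 * 1024)
print(f"1977: ${per_mb_1977:,.0f} per MB")    # 1977: $17,920 per MB
print(f"drop: {per_mb_1977 / per_mb_1990:,.0f}x cheaper by 1990")
```

Even taking that dollar-a-meg quote at face value, that's a four-orders-of-magnitude drop in about 13 years.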
Anyway, in that back issue of BYTE that I read circa 1975 to 1977, the editorial was about the ever-increasing densities of semiconductor RAM. The editorial quoted an electrical engineer acquaintance as saying that "nobody could ever possibly use more than 1 K of RAM." And now we routinely waste megs of RAM, because we can.
If you are a fan of science fiction, or even if you are not, please start reading the first of Isaac Asimov's robot novels, The Caves of Steel; Will Smith's movie "I, Robot" was taken mainly from it. It was written in 1951, the year of my birth. On the very first page, the protagonist, a police officer, requests some information; as the answer ripples through a pool of mercury, it is recorded on a wire, which is handed to him.
1. Recording technology at that time used wire, not tape. Tape recording was being researched in Nazi Germany by the German company BASF. On Netflix, watch an early first-season episode of "Mission: Impossible" and you will see that the information they are seeking was recorded on a wire hidden in plain sight in a window planter. FWIW, my father had some radio equipment in the garage before he sold it to a local Japanese farmer (long after WWII; he had fought in the Pacific, so his being able to befriend local Japanese says a lot). Funny thing is that he took me along to drop that equipment off, and all along the way he cautioned me not to say anything about the guy's eyes being slanted. I didn't know what to expect, but all I could think of when I saw his eyes was that there was nothing at all wrong with them.
You might want to read the Wikipedia page on Bing Crosby. In WWII he strongly supported the USO effort, but as the Allies advanced into Germany he also "liberated" a lot of BASF's recording equipment, which he then used in creating and editing his radio shows.
2. A single memory cell is basically a "flip-flop", a circuit that will output either a 1 or a 0 and hold it. While technologies differ in their requirements, in transistor terms you basically need four to six transistors per bit. In vacuum-tube terms, you would need at least the same number of tubes. Multiply that by registers of 16 or 32 bits each, and it starts to get very expensive, both in the number of components and in the power that has to be consumed.
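For the curious, here's a toy sketch of that one-bit cell: an SR (set-reset) latch made of two cross-coupled NOR gates, the gate-level cousin of the four-to-six-transistor cell just described. Function names and the iteration count are my own, purely for illustration:

```python
# Toy SR latch from two cross-coupled NOR gates -- one bit of storage.

def nor(a, b):
    # NOR gate: outputs 1 only when both inputs are 0.
    return 0 if (a or b) else 1

def sr_latch(s, r, q, q_bar):
    # Iterate the cross-coupled gates until the outputs settle.
    for _ in range(4):
        q, q_bar = nor(r, q_bar), nor(s, q)
    return q, q_bar

q, qb = sr_latch(s=1, r=0, q=0, q_bar=1)   # SET: store a 1
print(q)                                   # 1
q, qb = sr_latch(s=0, r=0, q=q, q_bar=qb)  # HOLD: inputs idle, bit kept
print(q)                                   # 1
q, qb = sr_latch(s=0, r=1, q=q, q_bar=qb)  # RESET: store a 0
print(q)                                   # 0
```

The "hold" case is the whole point: with both inputs idle, the cross-coupling keeps the bit alive, and that feedback is exactly what costs you transistors (or tubes) and power for every single bit.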
One idea that was used early on was that of the delay line. You have a medium with a known propagation time: you input a bit, and after a predetermined delay it comes out the other end. You can read the bit at that moment, but more important for memory purposes is that you can then feed it back into the input, so the bits circulate indefinitely.
Now let's say that you have a pool of mercury. Knowing physics, you know just how long it will take for a pulse to ripple through that mercury to the other side. Now you have a speaker on one end of the mercury and a microphone on the other end. You send a series of pulses (or lack of pulses for zeros) out one end of the mercury pool and the other end reads in what had been sent out.
In my USAF correspondence courses, we covered a system that used a magnetostrictive wire. To write bits into the wire, you energized a magnet, which created a physical shock wave that traveled down the wire to a transducer at the far end, which read a bit wherever a shock arrived. Same idea, different medium.
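That recirculation trick is easy to sketch in code. A toy model (the class and the word size are invented for illustration; the real hardware was, of course, analog pulses in mercury or in wire):

```python
# Toy model of a delay-line memory: bits circulate through a medium
# and are re-fed into the input end as they emerge.
from collections import deque

class DelayLine:
    def __init__(self, bits):
        # The "medium": a fixed-length queue of bits in transit.
        self.line = deque(bits)

    def step(self):
        # One propagation step: a bit emerges at the read transducer...
        bit = self.line.popleft()
        # ...and is immediately rewritten at the input end (recirculation).
        self.line.append(bit)
        return bit

    def read_word(self):
        # Reading the whole store means waiting one full circulation.
        return [self.step() for _ in range(len(self.line))]

dl = DelayLine([1, 0, 1, 1, 0, 0, 1, 0])
print(dl.read_word())  # the stored pattern, unchanged
print(dl.read_word())  # still there: recirculation preserves it
```

The "memory" is literally bits in flight; to read or change any one bit, you have to wait for it to come around again, which is why delay-line and drum machines cared so much about instruction timing.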
We have arrived at our current state of technology through the efforts of those who had come before us. There is a definite history of that advancement. There can be some value (or not) to learning what had preceded us.
In high school and college, I worked for my father, a general contractor and master carpenter, such that my most drastic adjustment to military life was having a weekend off -- i.e., for about 8 solid years of my life, I was either in class, on the work site, or too sick to do anything.
Working construction, we used the current tools almost all the time, but on occasion we would have to revert to the older construction technologies (eg, drilling a hole in concrete with a star bit, using a hand router instead of a power router). That question always fascinated me: Before our modern power tools, how did they do it?
Similarly, before our modern computing hardware and tools, what did we have to work with? There's lots of interesting stuff to learn there.