July 18th, 2013, 07:03 PM
My textbook says this code
"displays a heading at the top of a new page, indented to the third tab and underlined."
But what I get is only an underline indented to the third tab.
According to the book, "\r gives a program the ability to create a file containing more than one character in one line position".
If this were so, I would expect that F and _ would both be shown on the screen, and hence F would appear underlined.
However, reading this page (http://stackoverflow.com/questions/4638552/carriage-return-in-c) it seems to me that \r doesn't have such powers.
Rather than showing two characters, \r overrides the character that was previously in that position.
Which is the correct explanation?
Is it possible to get an underlined text using \r, as the textbook claims?
July 18th, 2013, 07:14 PM
It's possible if they're talking about using a line printer.
What book are you reading?
July 18th, 2013, 07:19 PM
I'm reading Problem Solving and Program Design in C, Sixth Edition by Jeri R. Hanly and Elliot B. Koffman, published by Pearson.
I'm using the International Edition.
July 18th, 2013, 08:12 PM
When was that book published?
At first, printer terminals, basically little more than teletype machines, were fairly common, and it wasn't until the late 1970's (as I recall) that video terminals became more common. ASCII was developed in the 50's and 60's for teletype machines, so the control characters were based on what was needed to control those machines. So, with a printer for output, it made sense to be able to do a carriage return without a line feed in order to type over what had been printed before, thus creating accented letters and underlines, or blotting out passwords.
When I started working on my computer science degree in 1978, we had DecWriter terminals (Byte Magazine cover of July 1976 depicted Thomas Jefferson writing the Declaration of Independence on a DecWriter) which we connected to the IBM mainframe via a 110-baud acoustical-coupler modem (we also had a few 300-baud lines that we were almost willing to kill for). Soon as you connected, you'd get the login prompt asking for your username. When you entered that, it would prompt you for your password and then it would make several passes over the same line typing a series of capital letters in each position of the password field, so that when it was done the password field would be filled with near-solid blocks. That way, when you typed in your password, which would get printed in the process, nobody would be able to read it.
But video displays work differently. In the terminal's memory is an array called Video RAM or something similar. For each character position, there would be one or more bytes assigned. One byte would be for the character's ASCII code, and the others for whatever characteristics were supported (eg, color and intensity (RGBI), special markings such as underline, strikethrough, blinking).
Now, if you write a character to a given position in video RAM, then that character would be displayed. If you write a different character to that same position, then the first character would no longer be displayed, but rather the new character would. That is why the old trick of overwriting the same line won't work on a video display.
OK, so the book's 6th edition was published in 2009. When was the first edition published? This example was apparently one from the earliest editions that never got updated.
July 18th, 2013, 10:31 PM
So the difference comes down to printer terminals versus video terminals: a video terminal stores what to show in memory, so when you overwrite a position, the information that was there is replaced.
On the third page of the book it says "Copyright 2010, 2007, 2004, 2002, 1999".
But I don't know if 1999 is the year the first edition was published.
July 18th, 2013, 11:40 PM
Plus, by that time video displays were in very common use and you hardly ever saw a printing terminal anymore. If that code had been intended for a line printer attached to the PC, then it would have used fprintf instead of printf and printed to the LPT printer port. It's hard to understand why they included such outdated code. Could you please read carefully through the accompanying text to see if their intent was to demonstrate an older use?
BTW, \r does get used for applications, such as HTTP messages, which explicitly require CRLF, eg "\r\n" (if I got that order right).
BTW, I'm attaching a jpeg from that BYTE magazine cover with Thomas Jefferson working at his DECwriter.
July 19th, 2013, 12:36 AM
I did, and it doesn't seem like they wanted to show a usage example from an earlier period.
I google image searched byte magazine and saw some really old computers from the past.
Boy are they bulky.
July 19th, 2013, 02:19 AM
You do not know the meaning of bulky.
I got my start in 1977 in Air Force tech school as a 305x4, an "Electronic Computer Systems Repairman". We trained on the BUIC (BackUp Intercept Computer). The BUIC was a late-1960's computer built with transistors (AKA "discrete components", as opposed to integrated circuits). The CPU was in a cabinet the size of a rather large refrigerator that was packed solid (a decade later, I worked a short while on a computer system with cabinets half that size that were almost completely empty, because everything was on a single large circuit board). The memory cabinet was the same size, as was the I/O cabinet. All three were duplicated for redundancy (if one failed, its backup would switch in immediately). The magnetic drum mass memories were a bit smaller, as were the tape transports. The entire system filled a space about twice or thrice the size of a living room.

The BUIC was supposed to first back up and then replace the Semi-Automatic Ground Environment (SAGE) computer, which handled the bulk of the US Air Force's real-time radar data, but the BUIC couldn't do the job, so by 1977 the last system was operating only to provide training for apprentice technicians.

The SAGE had been built in the 1950's, so instead of transistors it had vacuum tubes. It was housed in a concrete block of a building that took up about half a city block of ground (OK, measuring the building at my old base, it was about 155 ft by 155 ft) and was five stories high: the top two floors were administrative offices and radar scope operations, one floor was maintenance, and two whole floors were computer. The building even had its own air-conditioning cooling tower (OK, not quite a huge cooling tower, but A/C was still vitally crucial). We got a couple of conflicting stories about what would happen if the air conditioning ever failed in that building: either within 20 minutes everybody inside would be dead, or within 20 minutes the racks would start to melt down.
I think the first one is more likely, but the bottom line is still that that computer kicked out some very serious heat.
For fun and for your own edification, run Google Earth and go to Grand Forks AFB, ND, at 47°56'48.93" N and 97°22'56.66" W, at the corner of Steen Blvd and G St. Now go back to 2002 Oct 16. There you will see the old SAGE building, which in my time there (1977 to 1982) had been converted to the Strategic Missile Wing building. A year later, the building had been demolished.
So, you wanna talk about "bulky computers"?
On YouTube, search for "Triumph of the Nerds: Accidental Empires" (or portions thereof). That was a PBS show from a couple decades ago that followed the creation of Apple and Microsoft and the rest of the PC industry. On YouTube there's also "Pirates of Silicon Valley", a TV movie dramatizing those same events.

Jobs and Wozniak got started working through and demonstrating to a "home-brew" club, which in the 1970's was the state of micro-computers. The Altair 8800 was the first computer kit you could buy, but nobody had any idea what to make it do. Through the home-brew club demonstrations, Jobs and Wozniak got their first prototype working, and from that were able to get financing to start production.

Microsoft was at first solely a language company that supplied BASIC interpreters for the various microcomputers (eg, Radio Shack's TRS-80, based on the Z-80, which everybody referred to as the "Trash-80"). The king of microcomputer operating systems was Digital Research, the creator of CP/M. Circa 1980, when IBM was trying to get into the PC market, they approached Digital Research for the operating system, but with no success (the standard story is that the boss of DR was too busy flying in his plane to talk to them, but that's not supposed to be true). So IBM turned back to Bill Gates, who was providing BASIC for the IBM PC, and asked him if he could do the operating system, and he said he could. Then he quickly bought Seattle Computer Products' QDOS ("Quick and Dirty OS") and massaged it into PC-DOS v1.

There's a scene in "Pirates of Silicon Valley" in which Bill Gates is negotiating with IBM over the rights to the operating system. IBM, ever stuck in the "big iron" paradigm in which the hardware is everything while software is just a freebie you throw into the deal, allowed Microsoft to retain the rights to DOS. And now you know the rest of the story.
(there was a radio broadcaster, Paul Harvey, who would tell an inspirational story from the beginning and as soon as he told you the name of the person involved, "and now you know the rest of the story.")
BYTE magazine started out in the mid-1970's as a home-brew magazine and it grew and kept pace with the growth of the PC industry well into the 1980's, after which it started to lose steam, especially as the industry became more industrialized. It was part of what we grew up with in the PC industry. Their cover artwork was always (in the beginning, at least) very clever and inspirational.
I remember reading through older issues in the university library. Back in 1975 or so, one of the big topics was the ever-increasing density of semiconductor memory. Memory used to be one of the most expensive parts of the hardware. Ferrite-core memory (made by manually threading ultra-thin wires through extremely tiny iron rings) was the norm (after mercury-pool delay-line memories) and of course was extremely expensive. I think that the BUIC had 2kB of ferrite-core memory (or at the very most 16kB). Why does C build projects the way that it does? Because it was designed when memory was very scarce and very expensive. In 1977, I window-shopped in a local store a semiconductor memory upgrade kit for an Apple II; it cost about $280. Circa 1990, I heard a quote of one meg of RAM for one dollar.
Anyway, in that back issue of BYTE that I read circa 1975 to 1977, the editorial was about the ever increasing densities of semiconductor RAM. The editorial quoted an electrical engineer acquaintance as stating that "nobody could ever possibly use more than 1 K of RAM." And now we routinely waste megs of RAM, because we can.
If you are a fan of science fiction or even if you are not, please start reading the first of Isaac Asimov's robot novels, The Caves of Steel; Will Smith's movie, "I, Robot", was taken mainly from it. It was written in 1951, the year of my birth. On the very first page, the protagonist, a police officer, requests some information, so as it's rippling through a pool of mercury, it's recorded on a wire which is handed to him.
1. Recording technology at that time used wire, not tape. Tape recording was being researched by Nazi Germany through the German company BASF. On Netflix, watch an early first-season episode of "Mission: Impossible" and you will see that the information they are seeking was recorded on a wire, which is hidden in plain sight in a window planter. FWIW, my father had some radio equipment in the garage before he sold it to a local Japanese farmer (long post-WWII; he had fought in the Pacific, so his being able to befriend local Japanese says a lot). The funny thing is that he took me along to drop that equipment off, and all along the way he cautioned me not to say anything about the guy's eyes being slanted. I didn't know what to expect, but all I could think of when I saw his eyes was that there was nothing at all wrong with them.
You might want to read the Wikipedia page on Bing Crosby. In WWII he supported the USO effort very much, but as they advanced into Germany he also "liberated" a lot of BASF's recording equipment, which he then used in creating and editing his radio shows.
2. A single memory cell is basically a "flip-flop", a circuit that will output either a 1 or a 0. While technologies differ in their requirements, in transistor terms you would basically need four or six transistors per bit. In vacuum-tube terms, you would need at least the same number of tubes. When registers are 16 or 32 bits wide, that starts to get very expensive, both in the number of components and in the power that needs to be consumed.
One idea that was used early on was that of the delay line. You have a system. You input a bit and it takes a pre-determined period of time for that bit to come out the other end. You can read the bit at that time, but more important for memory purposes is that you can then feed it back into the system.
Now let's say that you have a pool of mercury. Knowing physics, you know just how long it will take for a pulse to ripple through that mercury to the other side. Now you have a speaker on one end of the mercury and a microphone on the other end. You send a series of pulses (or lack of pulses for zeros) out one end of the mercury pool and the other end reads in what had been sent out.
In my USAF correspondence courses, we covered a system that used a magneto-strictive wire. To write bits into the wire, you energized a magnet. That created a physical shock wave that traveled down the wire to a transducer, which read a bit wherever there had been a shock. Apparently, that was the same idea.
We have arrived at our current state of technology through the efforts of those who had come before us. There is a definite history of that advancement. There can be some value (or not) to learning what had preceded us.
In high school and college, I worked for my father, a general contractor and master carpenter, such that my most drastic adjustment to military life was having a weekend off -- ie, for about 8 solid years of my life, I was either in class, on the work site, or too sick to do anything.
Working construction, we used the current tools almost all the time, but on occasion we would have to revert to the older construction technologies (eg, drilling a hole in concrete with a star bit, using a hand router instead of a power router). That question always fascinated me: Before our modern power tools, how did they do it?
Similarly, before our modern computing hardware and tools, what did we have to work with? There's lots of interesting stuff to learn there.
July 19th, 2013, 10:10 AM
Wow, that is pretty scary.
I had no idea computer technicians were involved in such risks back then.
I don't remember the exact numbers, but I heard similar stories from my father. One was when he bought a scientific calculator for me at Walmart; he was amazed that you could get one for $10 these days. He also recalled buying a hard drive, only to see much smaller drives with more storage capacity selling a few years later for half the price or so.
One more valuable lesson from you.
This is only half relevant, but a programming exercise on strings that I found online instructed me to build my own version of strcpy and strcat.
I thought the exercise was beneficial because it provided a chance for me to understand how the function actually worked.
(I say it's half relevant because during the exercise I stood at a point in time before such convenient functions as strcpy and strcat existed)
Thanks so much for the information. It was really interesting, and I never expected I would get this much from one sentence I posted.