Fog Creek Software
Discussion Board




"Father" of ASCII dead.

Bob Bemer died today. 84 years old and they said he was still coding up until he died.

http://www.cnn.com/2004/TECH/06/23/obit.bemer.ap/index.html

Festung
Wednesday, June 23, 2004

I heard just as he was keeling over, he managed to type in the closing bracket, but didn't get to the semicolon in time.

Can anyone confirm this?

Cubist
Thursday, June 24, 2004

Bullshit. I've been coding strictly in Python since retirement.

Bob Bemer's Ghost
Thursday, June 24, 2004

Bummer if true.
http://www.bobbemer.com/ is well worth a browse BTW.

Peter Ibbotson
Thursday, June 24, 2004

> Bummer if true

No, still Bemer. He's now got letters after his name though.

R.I.P.
Thursday, June 24, 2004

He died of cancer; that was a major bug in his system he was not able to debug/eliminate.


Thursday, June 24, 2004

Am I the only one who doesn't think inventing ASCII text was that great an achievement?

He also seemed to have warned of the Y2K bug back in the seventies, which would have seemed more prescient if anything had happened when the year flipped.

name withheld out of cowardice
Thursday, June 24, 2004

Hmm, let's see.  He's the one who created the mapping from bytes to characters that almost every computer now uses (except when they use unicode).

I think that's a very big deal.

AllanL5
Thursday, June 24, 2004

> Am I the only one who doesn't think inventing ASCII
> text was that great an achievement?

Inventing the wheel wasn't such a big thing in hindsight either, nor was fire or walking upright, but before such things existed I'd bet people had a rougher time getting by. How about Morse code or the Dewey Decimal system for closer analogies. What may seem insignificant today was not so back in the day.

old_timer
Thursday, June 24, 2004

I've always been curious - if the intention was to allow computers to represent text which people could read, why do the letters and so on start in the middle?


Thursday, June 24, 2004

Several good attributes of the ASCII code:

1.  When sorted by ASCII, you also sort alphabetically.  One problem is that the lower case letters sort AFTER the upper case letters.

2.  The digits are hex values 0x30 ('0') through 0x39 ('9').

3.  All 'Control Characters' are 0x00 through 0x1F (plus DEL at 0x7F), so are easily masked and/or checked.

4.  Upper case letters are 0x41 through 0x5A.  Lower case are 0x61 through 0x7A.  Set the '0x20' bit to change from upper to lower case.  Clear it to do the opposite.

5.  Yes, the 'special' characters (:;<=>?[\] etc.) are sprinkled around.  Sigh.

6.  It only uses 7 bits.  This leaves the eighth bit free to give us another 128 characters to be used for creative purposes.

None of these choices were obvious at the time.  IBM had the 'EBCDIC' standard too, which had a different (and even weirder) mapping.
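
A quick sketch of those properties, in Python only because some language had to be picked; the standard itself is language-neutral:

    # Digits, upper case and lower case occupy contiguous ranges.
    assert ord('0') == 0x30 and ord('9') == 0x39
    assert ord('A') == 0x41 and ord('Z') == 0x5A
    assert ord('a') == 0x61 and ord('z') == 0x7A

    def is_control(ch):
        # Control characters sit below 0x20 (plus DEL at 0x7F).
        return ord(ch) < 0x20 or ord(ch) == 0x7F

    def toggle_case(ch):
        # Flip the 0x20 bit to swap upper and lower case letters.
        return chr(ord(ch) ^ 0x20)

    print(toggle_case('A'))         # a
    print(toggle_case('q'))         # Q
    print(''.join(sorted("aZbY")))  # YZab -- upper case sorts before lower case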

AllanL5
Thursday, June 24, 2004

I'm an old enough fart to remember when ASCII was one of many.  EBCDIC was one, another was a Gray code.  In a Gray code, going to the previous/next in the sequence involved changing only a single bit.  So 'a' could be 0x60, 'b' 0x61, but then 'c' would need to be something like 0x63, not 0x62.

This was before Microsoft patented using our skin as batteries, needless to say.
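
The post doesn't say which Gray code that machine used, but the standard binary-reflected one is easy to sketch (Python again, purely for illustration):

    def to_gray(n):
        # Binary-reflected Gray code: XOR the value with itself shifted right.
        return n ^ (n >> 1)

    # Any two consecutive codes differ in exactly one bit.
    for n in range(255):
        changed_bits = bin(to_gray(n) ^ to_gray(n + 1)).count("1")
        assert changed_bits == 1

    for n in range(8):
        # Prints 000, 001, 011, 010, 110, 111, 101, 100 -- one per line.
        print(format(to_gray(n), "03b"))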

Snotnose
Thursday, June 24, 2004

Unfortunately EBCDIC shouldn't be referred to in the past tense.  I still write ASCII <--> EBCDIC converters so our AS/400 & S/36 programmers can handle data coming in from the civilized world.
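
For what it's worth, modern languages ship these translation tables built in.  A rough Python sketch using the cp037 codec (EBCDIC US/Canada; which code page an actual AS/400 shop needs is an assumption on my part):

    text = "HELLO, AS/400"
    ebcdic = text.encode("cp037")           # ASCII/Unicode text -> EBCDIC bytes
    print(ebcdic.hex())                     # 'H' comes out as 0xC8, not ASCII's 0x48
    assert ebcdic.decode("cp037") == text   # and back again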

Brian
Thursday, June 24, 2004

"He also seemed to have warned of te Y2K bug back in the seventies which would have seemed more precient if anything had happened when the year flipped. "

It is really easy to poke fun at the people who warned of Y2K as a bunch of Chicken Littles, and in some cases their warnings went way overboard (yeah, I'm looking at you, Yourdon), but if they hadn't made a big deal about it sooner rather than later, a lot more things would have gone wrong because they wouldn't have been fixed the way they were.

Mr. Fancypants
Thursday, June 24, 2004

I worked at the IRS on their Y2K effort.  It was non-trivial.

If a person points out a crisis looming in the future, and then lots of people pay lots of money to avoid the crisis, and then the crisis does not happen -- does that mean there never was a crisis in the first place?

I don't think so.  I think the Y2K crisis was real, and the efforts of lots of people made sure its effects were minor.

AllanL5
Thursday, June 24, 2004

Okay, Y2K was real in that there would be a problem for programs that need to convert years into something that can be displayed to people.  The thing I have questioned, and perhaps brighter people here can correct me, is this:  The idea that embedded systems would go nuts on that day has never made any sense to me.  If you use one byte to represent the year, but it is internal to, say, a traffic light, why would it act as if we were back at 1900 after "99"?  Shouldn't such a year flip back to zero after 255 (or maybe after 127 if you consider it to be signed)?
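
A tiny sketch of the distinction being asked about here: a raw binary byte wraps at 256, while a year stored as two decimal characters wraps at 100 (illustrative Python, not any real traffic-light firmware):

    binary_year = 99                      # unsigned byte counting years since 1900
    binary_year = (binary_year + 1) % 256
    print(binary_year)                    # 100 -- nothing special happens until 256

    decimal_year = "99"                   # two stored characters, as in many databases
    decimal_year = "%02d" % ((int(decimal_year) + 1) % 100)
    print(decimal_year)                   # "00" -- and 2000 now looks like 1900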

name withheld out of cowardice
Thursday, June 24, 2004

The Y2K panic was more about databases that 'assumed' the year started with 1900.  Thus, the year 2000 would be treated as 1900 -- 100 years in the past -- not one year after 1999.  These programs tended to expect their data to be decimal, not hex.

In embedded systems that cared about day-of-week (lock the doors on Saturdays), 1900 had a different DOW mapping than 2000.

Also, 1900 was NOT a leap year, whereas 2000 WAS.  So the year 2000 actually had 366 days.  And what about all that archived data that only had 2 characters for the year?  Would the 'new' programs all assume the first two digits were now 20 when they loaded it?

There were several issues involved -- not the least being what legal obligation people had to insure against something really bad happening if they 'ignored' the Y2K problem.
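
The two calendar points above are easy to check with Python's standard library (my choice of tool, not the poster's):

    import calendar, datetime

    print(calendar.isleap(1900))    # False -- divisible by 100 but not by 400
    print(calendar.isleap(2000))    # True  -- divisible by 400
    print(datetime.date(1900, 1, 1).strftime("%A"))   # Monday
    print(datetime.date(2000, 1, 1).strftime("%A"))   # Saturday -- a different DOW mapping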

AllanL5
Thursday, June 24, 2004

We still have anal QA people writing up bugs for any date that is displayed with a two-digit year. I understand that two-digit dates can be a problem for sorting retirement records or something, but in the footer of a printed report it's a worthless bug.

By the way, the world is going to end in 2038 when the Unix clocks roll over...
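
That 2038 limit comes from a signed 32-bit time_t counting seconds since 1970; a couple of lines show where it tops out (Python used just to do the arithmetic):

    import datetime
    last_tick = datetime.datetime.fromtimestamp(2**31 - 1, tz=datetime.timezone.utc)
    print(last_tick)    # 2038-01-19 03:14:07+00:00 -- one second later it wraps negative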

MilesArcher
Thursday, June 24, 2004


"He's the one who created the mapping from bytes to characters that almost every computer now uses (except when they use unicode)."

Uh... You do know that ASCII is the lowest 7 bits of ISO-8859-1, which is the lowest 8 bits of Unicode, right?
So when you use Unicode, you are using ASCII too.
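
A quick way to see that containment (Python, purely for illustration):

    data = b"Bob Bemer"    # pure 7-bit ASCII bytes
    # The same bytes decode identically under ASCII, ISO-8859-1, and UTF-8.
    assert data.decode("ascii") == data.decode("latin-1") == data.decode("utf-8")
    print([hex(ord(c)) for c in "ABC"])   # ['0x41', '0x42', '0x43'] -- same code points in Unicode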

PR
Friday, June 25, 2004

Speaking of Y2K, my very favorite Y2K error was when early in the morning on January 1, 2000, I logged onto the OFFICIAL U.S. TIME SITE (which is now time.gov but back then was a longer URL, something like time.nist.something.gov) and was told it was January 1, 19100.

They were apparently using a number to represent the years after 1900 and just converting it to text and appending it to the string "19".  So when it got to 100, we jumped ahead to the magical year of 19100.

Would have been a bit amusing on any other site, but on the official time site it was quite hilarious.
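
I don't know what the site's code actually looked like, but the pattern described -- years since 1900 glued onto the string "19" -- reproduces the bug exactly:

    def broken_display_year(years_since_1900):
        return "19" + str(years_since_1900)   # fine for 0..99, wrong from 100 on

    print(broken_display_year(99))    # 1999
    print(broken_display_year(100))   # 19100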

Some Ranting Idiot
Friday, June 25, 2004

AllanL5:

All plausible, but was this the reality?  I mean, an old program so strapped for memory that it represents a year as a single byte is unlikely to have a mapping of dates to days of the week stored.  It probably just counts the days and flips every seven.

Besides, the whole scare was that elevators would crash, planes would fall from the sky, the food supply would be disrupted, and the electrical grid would shut down.  Basically the argument was: Y2K is a widespread bug, here are the things that use software, what if they went down?

We were told that the problem was that every new year the year field would be incremented, and suddenly you subtract someone's birth year from today's year, or the last time a diagnostic was run from today's date, and you get a negative number which throws the computer into chaos, as if Captain Kirk had asked it a very difficult question -- smoke starts pouring out.

I don't see why this would, in general, occur.

Sure, it could mess up the databases you described, but that would hardly lead to the downfall of civilization.
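
For what it's worth, the failure mode being described is just two-digit arithmetic going negative at the century boundary; a toy example (not any real system's code):

    birth_year = 65                   # stored as "65", meaning 1965
    current_year = 0                  # stored as "00", meaning 2000
    print(current_year - birth_year)  # -65 instead of the expected 35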

name withheld out of cowardice
Friday, June 25, 2004

name withheld out of cowardice

Why are you harping on about the severity of the Y2K problem? Yes, the media overhyped it, but I was there - I worked on medical software. The bug was real and it was serious - the big problem was not the computers crashing, it was the ones that didn't crash and instead continued on with bogus data, leading to UNPREDICTABLE results. That was the scary thing.

DJ
Friday, June 25, 2004

It's weird to think of a programmer born in the 'silent era'.

Pakter
Friday, June 25, 2004
