Fog Creek Software
Discussion Board




Were 80's developers better than most today?

Someone on another thread was discussing video games from the 80s (Atari, C-64, etc.)

Got me to thinking:

Those fellows managed to come up with GREAT games while dealing with the following constraints (vs. today's developers):

1. Very poor development environment. Very unproductive.
I wrote in assembler and BASIC on the old TS1000 and the C-64.  I could NEVER do that again.

2. They had much smaller markets.

There were fewer people. A smaller % of them had computers. An even smaller % had a particular O/S (fragmented market).

3. They had less library support.
Heck, it was a BIG deal when someone published the code for the C-64 to allow you to DRAW in hi-res easily.

4. They sold for about the same as games today, sometimes even less. (A high-end C-64 game was about $40 as I recall, although that was in 1986 dollars.)


5. They had more expensive distribution: multiple diskettes instead of one CD. No easy downloading of software. No inexpensive websites to market products. (We have a catalog and a website. The catalog costs about 10x what the website does, and that doesn't even include postage.)


YES, they managed to have successful companies and products. (I remember Silent Service and (Apache?), a helicopter simulator.)

POSSIBLE REASONS
1. They were more efficient: included only the most necessary features in a program.

2. Game players demand more "eye candy" today.
(Honestly, I'd enjoy the game play of Silent Service as much today as playing StarCraft - if it were multiplayer-enabled.)

It's kinda like how the cost of making a presentation went up with the advent of computer projectors, yet the INFORMATION communicated didn't go up. Color slides may look cool, but most people don't really even know how to create a good b&w slide. (I used to make a lot of management presentations when I was a telecom management consultant. My best presentation ever, which got rave reviews, used just circles and dots in its most complex slide, explaining PCS vs. Cellular.)

HMMM...product idea: reproduce the classics for online multiplayer play. Call it NostalgiaGames.net



Any other reasons?

The real Entrepreneur
Monday, January 19, 2004

Consider the superiority of:

* E.T. the Extraterrestrial
* PacMan
(both for the Atari 2600)

'Nuf said...

bob
Monday, January 19, 2004

I fondly remember the days of assembly programming as a much _simpler_ time - a few dozen instructions, a very small list of API calls, and a machine that was extremely deterministic and static. You literally could remember every single instruction, system call, and parameter. Even in the early days of Windows I remember getting my Visual C++ 1.0 box with 50 lbs of manuals, and I actually could remember every single WM_* message and Win16 API call.

I've long given up on trying to remember more than that which I'm directly working on today, and I hit google and MSDN dozens of times daily (not because I'm lazy, but because it is impossible to remember the scope of information one touches daily). The machine that I work on is constantly changing and in flux, and there are so many interdependencies that I'm constantly fighting with that I feel like a QA analyst for Microsoft and every third party vendor.

Dennis Forbes
Monday, January 19, 2004

I think that in general, home computer programs from 20-25 years ago were of higher overall quality than what's in stores today. I think this is entirely explained by the sheer lack of complexity of those programs.

Sure, that copy of "Electric Pencil" was rock-solid, and it never crashed, but it also didn't support a spell-checker, inline graphics, right-to-left text, more than one font (without embedding printer-specific control codes in your text), or online help. Of course you hardly needed online help since there were so few features to explain.

The same argument applies to an even greater extent to the platforms that the software ran on. When you've only got one program at a time running, and you reboot to switch applications, that makes things a bit easier.

An interesting corollary to this is the embedded systems world, where many projects are still small enough that one person (or a small team) can be responsible for ALL the code running on the system at one time.  And no surprise, the quality of the typical embedded application is orders of magnitude better than the quality of the typical desktop application.

One other factor that's caused the first-shipped quality of software to decline is the rise of the Internet. You can ship pretty awful software as a 1.0 version these days, and just put a patch up on your website for folks to download.

Interestingly, the wide deployment of Flash memory technology is starting to erode the traditionally high cost of updating embedded applications. This is leading, I believe, to an overall decrease in quality for many embedded applications.

-Mark

(Who has written both commercial and embedded software in the 1980s, 1990s, and 2000s)

Mark Bessey
Monday, January 19, 2004

It depends on how you define "better". Were EE's better before Spice modeling? Were ME's better before CAD? Were accountants better before accounting software packages emerged?

Developers in the 80's may have had a greater understanding of the fundamentals, but ultimately the job of a developer has remained the same, to wit, to solve problems. The advancements in the field have allowed developers to turn their attention to much more difficult problems.

So, the constraints of the past have just been traded in for the complexity of the current.

Nick
Monday, January 19, 2004

All Infocom games used a proprietary language and libraries; the biggest challenge was getting a game to fit on the C|64, which some would argue made them make better games.

Silent Service is one of my favorite games of all time, another one is the little known game Alternate Reality for the Atari (also ported to Apple, Amiga and C|64).

Games back then were smaller. One developer could write an entire game, and often did. All the Infocom games were pretty much one-man operations, as was the first Alternate Reality game; the follow-up was a two-person game. The King's Quest games were pretty much one-woman games.

Alternate Reality had around the depth and complexity of Grand Theft Auto, and an 80's version of graphics that were just as good, including sunrise, sunset, night, and rain. The first game has no real plot because of a deadline imposed by the company he sold the game to.

It could accomplish this because it was so much simpler: the 3D engine was primitive - only 90-degree angles - the monsters were all two-dimensional animations along the lines of .gif animation, the conversations were all written text, and so on.

Today you need dozens of specialized programmers, voice actors, and so on to make just about any video game.

www.MarkTAW.com
Monday, January 19, 2004

You bet we were better. We had to chisel instructions into the bare metal. But I go back even farther, to clay tablets. Now they were tough to program.

old_timer
Monday, January 19, 2004

Today's developers, take heart: someone's bound to ask this question 20 years from now.

Cognitive Dissonance
Monday, January 19, 2004

"Remember back in the early 2000's where it only took a team of 100 or so people to make a commercial video game?"

www.MarkTAW.com
Monday, January 19, 2004

And did I tell you, back in the 80s, we had to trudge 25 miles thru snow, just to get to the computer :-)

S. Tanna
Monday, January 19, 2004

>> Developers in the 80's may have had a greater understanding of the fundamentals, but ultimately the job of a developer has remained the same, to wit, to solve problems. The advancements in the field have allowed developers to turn their attention to much more difficult problems.

While I agree with this sentiment (we have all these libraries and tools and so on that allow us to work on larger/more complex problems) --- I think there is a fundamental issue as well --- Are we sure that we are solving the basic aspects of the problem nearly as well as we used to?

"1980-style":  Solve A - solve it really well - even though we have no tools

"2000-style": Solve, A, B, C, D, E, F, G etc - solve lots of problems using tools/libraries/APIs/objects/etc. 

Question: Just looking at A, is the 2000 solution as good as the 1980 solution?

I hadn't really previously thought about quality/stability (but that's an interesting aspect of the discussion - and I tend to agree the 80s solution were generally more solid) -- but certainly in terms of efficiency, precision, etc., then 80s style seems to win, when we just look at A.

I'm not sure that precision, efficiency matters so much, when the 2000s-style is just to throw CPU cycles, memory etc., at any computer problem (and these are cheap in the 2000s) -- but I have this horrible feeling something has been (almost irretrievably) lost - especially as regards quality.

S. Tanna
Monday, January 19, 2004

"Were EE's better before Spice modeling?"

AAAAAIIIIEEEEEE!!!!! Make it STOP! Make it GO AWAY!!!!

Philo <- PSPICE flashbacks

Philo
Monday, January 19, 2004

It strikes me that Pac-Man is the kind of project a college professor could assign today. Hobbyists who know very little programming create Infocom-style games.

Pitfall would be harder to do; Silent Service could probably be replicated as a term project.

The Sims, on the other hand... Well, under the hood The Sims is fairly simple, but the interface would be hard to replicate.

www.MarkTAW.com
Monday, January 19, 2004

Most software development up until the early 1990s was a mix of "classical" computer science and some engineering practice: using fundamental algorithms and data structures to solve problems efficiently with limited available resources and constantly making cost/benefit judgements.  SW development today is mostly about plugging components together, even to build trivial programs. And the programs that result can generally be constructed without any real constraint such as memory space.

So, on the level of being included or excluded due to lack of skill, an 80's developer was far more likely to be a mensch. A 2000's developer can be a slob with rather shallow knowledge and still find work, and personality and "fit" with the employer mean a lot more today than they used to. We've become a commodity class because it just isn't that hard to get "something" working today.

The mystery factor: In "olden" times, in C or assembler, you "knew" what an array was. In a modern HLL, an array may well be implemented as a list of memory references, the list may be subject to memory management, and you have no real idea where it lives in absolute locations in memory (if this classical definition of "array" even makes sense in that context.)
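A quick Python illustration of that mystery (my own toy sketch, not anyone's production code): two names can share the same hidden storage, and the classical question "where does the array live?" has no stable answer.

```python
# In Python, a list is a sequence of references to objects,
# not a contiguous block of raw values like a C array.
a = [1, 2, 3]
b = a          # b is another name for the SAME list object
b.append(4)    # mutating through b is visible through a

print(a)       # [1, 2, 3, 4] -- a "changed" without being touched
print(a is b)  # True: same object identity

# A shallow copy gives a new list object, but id() reports an
# object identity, not a fixed absolute memory address -- the
# runtime's memory manager owns that detail, not the programmer.
c = list(a)
print(a is c)  # False
print(a == c)  # True
```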

The relative challenge: It seemed pre-1990s like SW development was like Sisyphus pushing the eternal boulder uphill - you were never done optimizing and tweaking, and even the simplest project mushroomed into a *great* challenge until you could make some simplifying assumptions. Today, the SW developer connects black boxes that perform complex protocols and functions and continually uses APIs that hide a mountain of complicated stuff.

One thing that "degrades" the experience of writing software today is that you have to have a continually-renewed encyclopaedic knowledge of libraries if you want to be able to find work. There's not a lot of room to invent a clever algorithm or to do "research" type tasks in conjunction with SW.

This leads to my final point: modern software verges on "untestable" because it's so platform and version specific. The developer has to get so *many* things right, has to interpret buried functionality so carefully. A Windows application is a tiny wart/hickey on the side of Windows XP.

Bored Bystander
Monday, January 19, 2004

In the early home computer days, you could have a complete understanding of how something worked.  Put a value in a memory address and the hardware reacted.  There was nothing more to know about it.  Now, to put a pixel on screen you have to set up a window and a message loop and know the API calls needed to do drawing.  And you never fully know what's going on past that point.  How many instructions does it take to put a dot on the screen?  Does it depend on the video mode?  And so on.  You can do things slightly wrong and have it work on some machines, but not others.

Programming has gotten a lot softer, in that you have to rely on a lot of ill-defined layers beneath you which are difficult to have full faith in.  When programming an Apple or Atari or C64, you felt like you were standing on a rock-solid foundation.  Now it's more like fluffy clouds :)
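To make the contrast concrete, here's a toy sketch of the old model (mine, with invented addresses and screen geometry): the screen is just a flat block of memory, and plotting a dot is one address calculation and one write.

```python
# Toy model of a memory-mapped frame buffer, in the spirit of the
# 8-bit machines: one byte per cell, screen base at a fixed offset.
# The addresses and dimensions here are made up for illustration.
SCREEN_BASE = 0x0400
WIDTH, HEIGHT = 40, 25

memory = bytearray(0x10000)  # 64 KB of flat, fully visible "RAM"

def poke(addr, value):
    """The entire drawing API: write one byte to one address."""
    memory[addr] = value

def plot(x, y, color):
    # One multiply, one add, one store -- nothing hidden underneath.
    poke(SCREEN_BASE + y * WIDTH + x, color)

plot(10, 5, 7)
print(memory[SCREEN_BASE + 5 * WIDTH + 10])  # 7
```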

Junkster
Monday, January 19, 2004

>> PSPICE?

Philo, is there ANYTHING you haven't done?! :-)

Bored Bystander
Monday, January 19, 2004

"Were EE's better before Spice modeling?"

That could be argued - it certainly can support bad ones.  I've seen EEs just playing around with values until it starts to work.

r1ch
Monday, January 19, 2004

Regarding development environments, I thought I read that the larger companies used high-end workstations for development back then. Anyone know more about 80's development?

Bill K
Monday, January 19, 2004

"Philo, is there ANYTHING you haven't done?! :-) "

I haven't eliminated all the stupid stinking power bricks under my desk.

Philo

Philo
Monday, January 19, 2004

Back in the 80s the development environments were not moving targets. Because of this we could spend all of our time getting up to speed on one system. Now the IT industry is Fire-and-Motioning itself in the foot; I barely have time to get to grips with one technology.

I don't know about anyone else, but I really liked knowing everything there was to know about a system. You could concentrate on development instead of worrying over leaky abstractions. It seems that 50% of my time these days is spent working out new APIs, language structures, frameworks, web surfing... surfing for technical info, of course. Of course, I wouldn't waste my employer's time needlessly on the web. =)

Shodan
Monday, January 19, 2004

My first programming job was 1979 - 81 in a Data General OEM shop.  Data General Nova III - there was a machine.  I still don't think it's really a computer if it doesn't have blinkin' lights and 16 toggle switches to boot from.

Ah, the memories ... hand-printing code with mechanical pencils to submit to the entry operators for input and compile ... er, attempt to compile ... re-code ... submit ... wait ...

The day we got 25 MB Century hard drives the size of washing machines ... the expensive and expressive sound they made when I managed to hard crash their heads on a fairly regular basis ....

All night sessions waiting for a compile that never came ...



Mongo
Monday, January 19, 2004

>>AAAAAIIIIEEEEEE!!!!! Make it STOP! Make it GO AWAY!!!!
>>
>>Philo <- PSPICE flashbacks

I'm not an EE, but I did have to do PSPICE modeling for my E-Fundies class. The old DOS-based programs were truly awful. The modern versions are much more user-friendly.

>>That could be argued - it certainly can support bad ones.
>>I've seen EEs just playing around with values until it
>>starts to work.

Touché.

Nick
Monday, January 19, 2004

"""Sure, that copy of "Electric Pencil" was rock-solid, and it never crashed, but it also didn't support a spell-checker"""

As it happens, I actually worked on a spellchecker for Electric Pencil, that was being added to the product by its then-owners.

However, that was actually in the late 80's, and it was a DOS version of Electric Pencil, the other versions having already been discontinued.  And for the life of me now, I don't remember much of anything about the work I did.  I literally lived in the office building for the whole time the work was done.  I vaguely recall that I was doing assembly coding, and that the whole thing was done in a ridiculously short time.  But that's about it.  Mostly I remember the cold and discomfort of sleeping on the floor, and leaving the building at night to forage for food at nearby supermarkets.

Gosh I'm glad it's not the 80's any more.  :)

Phillip J. Eby
Monday, January 19, 2004

It isn't just hobbyists writing Infocom-style games. I've assigned them in junior-level CS classes.  There was no way I could have done that in the 1980's.  Those games were developed on mainframes and shoehorned into PCs after the kinks had been worked out. No way could they have been developed on the typical home PC of the day.

Clay Dowling
Monday, January 19, 2004

Thus supporting my statement that 80's games can be assigned as work for college level classes.

www.MarkTAW.com
Monday, January 19, 2004

Clay, I wrote Infocom-style games back in the 80s and I didn't use a mainframe.

Dennis Atkins
Monday, January 19, 2004

I think a clarification of the particular segment of the "80s" is in order -- i.e. in 1985 the Mac, Atari ST, and Amiga all hit the scene, all with 512 KB to 4 MB of RAM and a rather powerful 32-bit processor (though it had a 16-bit memory bus, hence the "Sixteen/Thirty-two" naming of one machine). These are a world away from the Acorn or Atari 400XL.

Dennis Forbes
Monday, January 19, 2004

OK, I wrote adventure games in 1978 on a PET with 4K of memory. When I got an Atari 800 with 8K of memory, I was thrilled to buy an additional 40K of memory for $1200, and an assembler, which I used to write first a Colossal Cave-style parser and then an Infocom-style full-language parser. Once the parser is done, writing the adventure program is easy. It's just a matter of entering fields into a home-brew database.
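For the curious, the skeleton of such a game is tiny by today's standards. Here's a rough Python sketch (mine, not the original 6502 code) of the simpler verb-noun style, with the "home brew database" reduced to a table of rooms:

```python
# A minimal verb-noun adventure skeleton: the "database" is just
# a dict of rooms, and the parser splits input into two words.
rooms = {
    "cave":   {"desc": "A damp cave.",       "east": "forest"},
    "forest": {"desc": "A colossal forest.", "west": "cave"},
}
state = {"room": "cave"}

def parse(command):
    words = command.lower().split()
    verb = words[0] if words else ""
    noun = words[1] if len(words) > 1 else ""
    if verb == "look":
        return rooms[state["room"]]["desc"]
    if verb == "go" and noun in rooms[state["room"]]:
        state["room"] = rooms[state["room"]][noun]
        return rooms[state["room"]]["desc"]
    return "I don't understand."

print(parse("look"))     # A damp cave.
print(parse("go east"))  # A colossal forest.
print(parse("sing"))     # I don't understand.
```

A full-language parser adds adjectives, prepositions, and disambiguation on top of this, but the shape is the same: once that layer works, the game itself is data entry.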

Dennis Atkins
Monday, January 19, 2004

I think one factor often overlooked about 80s games coding was that it was all, effectively, open source. 

This meant that techniques and styles were quickly copied across the whole field.

So, for example, once one programmer had worked out how to scroll the screen at 60 fps, every other programmer could easily disassemble the code and see how to do it for themselves.

I think this explains why the quality of home computer games improved so much more quickly than on the consoles.

Ged Byrne
Tuesday, January 20, 2004

"I think one factor often overlooked about 80s games coding was that it was all, effectively, open source.  "

How do you figure this? My memory of the 80s was that apart from UNIX camps, where many apps were distributed with source code (because there was fragmentation), `open source' pretty much didn't exist.

Dennis Forbes
Tuesday, January 20, 2004

That's why I say "effectively."

Everything was written in machine code, so the source was available for anybody to read and learn from.

Ged Byrne
Tuesday, January 20, 2004

It is pretty clear what he meant if you read his post without skimming it.

Since most games (most apps in general, for that matter) were written in assembly anyway, if you ran a disassembler on the code you'd get more or less the original source code -- minus comments, macros, and labels... but everything was simple enough back then that you could get a lot of information from reading a disassembly, and this was done quite often to pick up new techniques (as the poster two up mentioned).


Also, everything old is new again and we're now at a point where many modern languages (like Java and C#) can be virtually fully decompiled because of all the metainformation....
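Python's standard dis module is a handy way to see this for yourself -- it disassembles bytecode much the way we used to disassemble 6502 code:

```python
import dis

def add(a, b):
    return a + b

# Human-readable disassembly: opcode names, arguments, and source
# line numbers, all recovered from metadata kept with the bytecode.
dis.dis(add)

# The instruction stream is also available programmatically:
ops = [ins.opname for ins in dis.get_instructions(add)]
print(ops)  # the exact opcodes vary by Python version
```

Minus variable names and comments in compiled extensions, of course, but for pure bytecode languages the "source" really is lying right there.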

Mister Fancypants
Tuesday, January 20, 2004

Yeah, jeez, if you're saying that open source means reading the machine language, then everything in the world is open source and it's not a useful distinction.

Of course given the lack of documentation in most modern open source software, you might as well be reading assembly without any comments.

Dennis Atkins
Tuesday, January 20, 2004

Absolutely.....they were better but only because they were forced to be better :-)

In those days you could not be a developer without at least having a basic knowledge of assembly language, code optimization techniques, etc.

Look at Turbo Pascal from that era... look at SideKick... look at WordStar... Lotus 1-2-3... the things they managed to do within the limitations of CPU, memory, 1.2MB floppies, and 10MB hard disks were simply amazing...

Looking at the bloatware from Microsoft and other companies today there is just no comparison of today's developers with the developers of yore.

Code Monkey
Tuesday, January 20, 2004

Bah.

Bloatware has to be one of the most over-used and under-criticized phrases in software development these days.

As other posters have pointed out, the main difference between software today and software of the past is the incredible increase in complexity. That complexity is needed to meet the expectations of the end user.

To use computer games as an example, try asking your average gamer to give up "Prince of Persia: Sands of Time" and go back to the original "Prince of Persia."

"But...", you say, "Prince of Persia: Sands of Time is so FREAKING HUGE compared to the original! It's bloatware! Why, the original could fit on a single floppy! The new one requires a DVD! What a bunch of lousy folks who wrote that thing."

Sorry, that's an indefensible argument in my book. You can make the same comparison for just about any modern piece of software with a corresponding historical equivalent: ask everyone who uses Excel to go back to Lotus 123 circa 1987 and see how many takers you have, "because it's smaller."

Mike Treit
Tuesday, January 20, 2004

sure they were better. back in the 80s there were cool magazines with code listings. programmers talked about..."programming." people knew like... what "bits" and "bytes" were.

nowadays programmers have huge drawn-out conversations about whether or not cubicles are better than offices, or how quiet the workspace needs to be before they can write their daily 3 lines of C# code. It is embarrassing to be lumped together with such softies, to say the least.


Tuesday, January 20, 2004

In those days spirits were brave, the stakes were high, men were real men, women were real women and small furry creatures from Alpha Centauri were real small furry creatures from Alpha Centauri.

Douglas Adams

Michael Moser
Friday, January 23, 2004
