Fog Creek Software
Discussion Board

on "How Microsoft Lost..."

Here are some random comments on Joel's latest piece:

You keep underscoring the improvements .NET brings. As I understand it, it consists of a) new APIs and b) bytecode. What did I miss that's so great? Very few people code their GUIs by hand anyway, so they would hardly notice the difference.

I am skeptical that having a GC is the biggest boost from modern languages and "managed code" (a term that goes unexplained, while 'API' is explained). Ask a Python guy and I believe he'll tell you that dynamic, late-bound typing is the real time-saver. (Or ask a Perl guy, who's been doing it far longer, and perhaps he'll think it's all the free libraries.)

Plus the example on memory management is wrong. There is no problem saying f(g(x)) in C. This is an invented problem from people who have some religious belief that pointers should not be visible, despite the fact that that's how it *works*.

There are some other, smaller things in the article I don't agree with as well, but I believe the core idea is correct -- and trivial. You could say it a much simpler way: there's lots and lots of software available for MS Windows, and that's worth a heck of a lot of money. Which we all knew already.

I think you underestimate the biggest problem with managed code: it's slow. I know, there are a lot of benchmarks out there telling you otherwise, but some say Java bytecode is faster than C++ too. This is of course wrong; as long as CPUs work the way they do today, native code is always faster.

Given what people pay for 10% extra performance on their PC (don't lie to me -- you've done it too :) ), I suspect slowing an app down by something like 200% is somewhat relevant.

And developing for someone else's platform has always been problematic. It's probably not a coincidence that MS is big in Windows applications and Claris was big on the old MacOS. This is why IBM backs Linux, and it's the real advantage of Linux: Stallman calls it 'free', but what it really means is levelling the playing field. Of course, for some people that would be a big minus: there's no Microsoft to buy you.

I think the future is more along the lines of disconnecting the APIs from the operating system -- what Joel calls 'dependencies'. Yes, it gets complex, but it works. I've used Linux for almost ten years, and it's customary for larger applications to have a dozen dependencies. I have never once cared about them, never seen any of them without asking explicitly, and never had a problem with them. It's not as much of a problem as you make it out to be.

The world won't switch to Longhorn in a day. Most people keep their old code, and that's a good thing for them. Look at what happened to Itanium. There are countless other examples.

And therein lies the advantage: playing the reinvention game slows down the competition, if you are as large as they are. 'Database' is still a buzzword, and name-dropping it together with WinFS was a stroke of genius. Journalists all around the world hail it as the next big thing. If you make operating systems, you have to counter that if you want people to invest in you. It's the 'push' thing all over again.

Which is sad. We should fight complexity, not embrace it for political reasons. Luckily no one knows what future hardware will look like. There'll be tons of opportunities to reinvent.

Jonas B.
Friday, June 18, 2004

By the way, the new Mozilla 1.7 now supports onBeforeUnload, which lets you prompt the user to save their work before closing the browser.

http://www.mozilla.org/releases/mozilla1.7/README.html#new

Thomas
Friday, June 18, 2004

"Plus the example on memory management is wrong. There is no problem saying f(g(x)) in C. This is an invented problem from people who has some religious belief pointers should not be visible, despite that's how it *works*."

Let's say that x is a linked list of items, and g(x) is a filter function that takes a list and returns a new list of all items with a certain property. That new list then gets passed to f. There's no way around having to save the value returned from g(x) and then delete it after f() is called.
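
Spelled out as a compilable sketch -- the Node type and the toy definitions of f() and g() below are made up for illustration, not from Joel's article -- the C version has to look something like this:

    #include <stdio.h>
    #include <stdlib.h>

    typedef struct Node { int value; struct Node *next; } Node;

    /* g: returns a freshly allocated list holding the even values of in */
    Node *g(const Node *in) {
        Node *out = NULL;
        for (; in; in = in->next)
            if (in->value % 2 == 0) {
                Node *n = (Node *)malloc(sizeof *n);
                n->value = in->value;
                n->next = out;
                out = n;
            }
        return out;
    }

    /* f: walks the list but does not free it */
    void f(const Node *n) {
        for (; n; n = n->next)
            printf("%d\n", n->value);
    }

    void free_list(Node *n) {
        while (n) {
            Node *next = n->next;
            free(n);
            n = next;
        }
    }

    int main(void) {
        Node c = { 3, NULL }, b = { 2, &c }, a = { 1, &b };
        /* f(g(&a)) would leak the list g() allocated, so instead: */
        Node *tmp = g(&a);   /* save the intermediate list...    */
        f(tmp);
        free_list(tmp);      /* ...and free it after f() is done */
        return 0;
    }

The one-liner f(g(x)) simply has no place to put that free_list() call.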

Junkster
Friday, June 18, 2004

There's no way around saving the list and deleting it later; that's what happens in a C++, a Java, and a Lisp program alike. But there's no need to manually delete it in C++.

Hint: auto_ptr<>.
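
A minimal sketch of what I mean -- the signatures of f() and g() are assumptions on my part, with g() returning a freshly heap-allocated std::list:

    #include <cstdio>
    #include <list>
    #include <memory>

    // toy filter: returns a freshly allocated list of the even values
    std::list<int> *g(const std::list<int> &in) {
        std::list<int> *out = new std::list<int>;
        for (std::list<int>::const_iterator it = in.begin(); it != in.end(); ++it)
            if (*it % 2 == 0)
                out->push_back(*it);
        return out;
    }

    // f: reads the list, never frees it
    void f(const std::list<int> &items) {
        for (std::list<int>::const_iterator it = items.begin(); it != items.end(); ++it)
            std::printf("%d\n", *it);
    }

    int main() {
        std::list<int> x;
        x.push_back(1); x.push_back(2); x.push_back(3); x.push_back(4);

        std::auto_ptr< std::list<int> > tmp(g(x));  // owns the filtered list
        f(*tmp);
        return 0;
    }   // tmp's destructor deletes the list here -- no manual delete needed

The intermediate list still gets saved and deleted, exactly as Junkster says; the auto_ptr just makes the delete implicit.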

Reginald Braithwaite-Lee
Friday, June 18, 2004

Jonas wrote:

"I am skeptic that having a GC is the biggest boost with modern languages and 'managed code' (not explained while 'API' is). Ask a Python guy and I believe that the dynamic & late-bound typing is the real time-saver. (Or ask a Perl guy, who's done it far longer, and perhaps he would think it's all the free libraries.)"

Exactly. I made the same point in an earlier post. Moving from C/C++ to Python was a boon for me because of the variety of high-level libraries. Not having to worry about types and compile-time binding comes second. Running bits of code in an interactive window is third. Automatic memory management is a plus, but not the most compelling feature by a long shot.

"Plus the example on memory management is wrong. There is no problem saying f(g(x)) in C. This is an invented problem from people who has some religious belief pointers should not be visible, despite that's how it *works*."

Exactly again. I have found without exception that the people who complain about C and C++ memory management and pointers and syntax don't really know those languages. That seems to be part of programmer psychology, because I often hear neophyte programmers make fun of COBOL when they have never read or written COBOL code -- they're ignorantly repeating received wisdom. If your introduction to C came from Kernighan and Ritchie, and you've spent more than five years mastering it, you should have no trouble with pointers and indirections and C memory management. If you used a "Learn to program in C in 21 Days" book and have never actually written a real C application you should keep your mouth shut.

I've often joked that if you can glance at 'while (*t++ = *s++) ;' and immediately recognize the idiom for copying a string, you are probably a C programmer. If you don't recognize it, or don't understand it, or think that it's bad form and should be written as a 'for' loop with arrays, you are not a C programmer.
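
For the record, here's that idiom in a complete, compilable form (the function name string_copy and the little test driver are mine):

    #include <stdio.h>

    /* the K&R copy idiom in context: t must point to a buffer big enough
       for the string at s, including the terminating '\0' */
    void string_copy(char *t, const char *s) {
        while ((*t++ = *s++))   /* assign and test: stops after copying '\0' */
            ;
    }

    int main(void) {
        char buf[16];
        string_copy(buf, "hello");
        puts(buf);              /* prints "hello" */
        return 0;
    }

(The extra parentheses around the assignment just silence the compiler's it's-not-a-comparison warning.)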

Greg Jorgensen
Friday, June 18, 2004

VB6 to VB.NET was not the first incompatible upgrade from MS. The FoxPro folks went through something similar; I believe it was in the mid-'90s.

There were some minor ones in the VB world too: back in the VB5/VB6 days, MS changed the default property for some common object, which taught most VB developers never again to trust defaults. The VS IDE did not fix that one automatically.

Bob
Saturday, June 26, 2004
