
YANIIC
Yet Another Nail In Itanium's Coffin
I got taken to task here a few weeks back for saying the Itanium was way overhyped and basically wasn't being adopted. Well, here is more proof:
http://www.eweek.com/article2/0,4149,1357756,00.asp
Five years from now we'll read the real story of what's wrong with the Itanium or what kept it from being adopted.
Mike
Monday, October 20, 2003
Henry Ford's insistence that he could sell his Model T forever was the final nail in Ford Motor Company's coffin. Was it? We do have cars today, and we do have Ford cars. So what's your point?
Johnny Bravo
Monday, October 20, 2003
More grasping at straws, you mean. So Sun is offering deals to Alpha customers to lure them away? Startling!! Shocking!! If that's the best proof that can be found for your point, it might be time to buy some stock.
blah
Monday, October 20, 2003
The Model T was a dead end, but at least it was viable for a while. The Itanic is not, and never was, viable. Yes, Intel will exist without it, making other chips, just as Ford went forward without selling Model Ts. If you want a definition of viable, I mean that the company recouped its development costs before discontinuing it (it being the Model T or the Itanium).
Mike
Monday, October 20, 2003
Blah, HP is continuing the Alpha for a while longer. You have got to believe Intel really wanted those customers, don't you? As for Sun, they always do that when anyone discontinues anything. They have HP 3000 and Windows NT migration paths as well.
Mike
Monday, October 20, 2003
The problem is that the Itanium optimizes the wrong problem.
At some point, the instruction set ceased to matter that much. This is partially Intel's fault, but it would have happened anyway. AMD's cores, the Transmeta cores, and Intel's cores all contain logic to translate x86 instructions into internal VLIW-like instructions. The Itanium was built in an era when people thought the instruction set mattered.
The biggest problem now with the x86 instruction set is not the complexity, and not the multiple instruction lengths (which can be approached as a form of compression; besides, Thumb and MIPS16 make the single-instruction-length purists look bad), but the simple fact that there aren't enough registers to hold a reasonable number of temporary variables, which necessitates register spills that are hard to optimize away. The real reason you want to recompile apps for the x86-64 ISA is that it has more registers to stave off register spills.
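To make the register-pressure point concrete, here's a toy C sketch (illustrative only; the function is made up, and the exact spill behavior will vary by compiler):

    /* A hot loop with several values live at once: c, n, i, and the
       four running sums.  32-bit x86 has 8 general-purpose registers,
       a couple of them reserved, so some of these typically get
       spilled to the stack; x86-64 has 16, so they can all stay in
       registers.  Compare "gcc -O2 -m32 -S" with "gcc -O2 -m64 -S". */
    long sum4way(const long *c, int n)
    {
        long s0 = 0, s1 = 0, s2 = 0, s3 = 0;
        int i;
        for (i = 0; i + 3 < n; i += 4) {   /* remainder loop omitted */
            s0 += c[i];
            s1 += c[i + 1];
            s2 += c[i + 2];
            s3 += c[i + 3];
        }
        return s0 + s1 + s2 + s3;
    }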
The x86-64 is a brilliant idea for AMD because it allows them to shove the 64-bit instruction set down people's throats, even if they don't actually use it, until there's a pretty large installed base.
The Itanium's problem is that there is no easy road to 32-bit compatibility -- the hardware emulation is slower than the equivalent software emulation -- so you need to go whole-hog into the new architecture. And there's nothing, starting out, that makes it sound good. All of the "legacy" RISC architectures were still delivering more bang for the buck than the equivalent Itanium. HP had to suppress performance figures for the latest Alpha until a faster Itanium made its way out, because they made the Itanium look pretty bad.
So, compared to the Athlon 64, it doesn't offer you an easy migration; it doesn't yet offer you anything you won't get out of your traditional RISC platform; the migration sucks; and too many folks are making fun of it instead of hyping it.
The interesting part will be in the next few years, as the migration paths for the PA-RISC, Alpha, and MIPS architectures start to look better and the Itanium actually beats some top-of-the-line RISC processors.
The problem is that Intel's in a bind where they have no choice but to keep going forward as if nothing's wrong. At the very least, the Itanium will be a good high-end server CPU, but it's going to suck if it can't move low-end. So any 64-bit versions of the Pentium series will doom the Itanium in the low-end space. However, if AMD does too well with the Athlon 64 line, Intel will be faced with either sacrificing the Itanium to the high end or potentially getting in real trouble with the Pentium line at the low end. If the Itaniums stay in the very expensive part of the market, a high-end Pentium/Athlon server looks good, and so does a SPARC or MIPS machine.
Flamebait Sr.
Monday, October 20, 2003
So unless the Windows crowd adopts the Itanium en masse, it has a dim future. Traditionally the Windows crowd has a difficult time giving up backwards compatibility.
It seems the Itanium is a product without a market. As you said, it does nothing that is not available elsewhere for the same money or less, and with more apps on the platform.
Mike
Monday, October 20, 2003
The Itanium is a product that came too late to market.
Back in the days of the Pentium, the x86 instruction set was a measurable performance problem. The Pentium Pro took a non-negligible amount of the processor die to convert x86 instructions to RISC instructions. As I said, from the PPro and Athlon lines onwards, you were building a RISC/VLIW processor that had an x86 code translator on the front, not an x86 processor. Back then, people cared about that die space.
However, given the latest process shrinks, increases in performance, and advances in software development, the die space and/or built-in software required to process the x86 instruction set doesn't seem like as much of a penalty anymore, especially when compared to the huge onboard L2 cache.
The Windows crowd can be moved. A very good percentage of users out there have converted to the NT kernel, generally either Win2k or WinXP. But it has to happen on their terms, not on any forced terms. The move to Win2k happened like this: people got rid of old software and hardware that didn't play well with the NT kernel, over a matter of years, simply because the old stuff stopped working or the new stuff was worth upgrading to, which eventually caused all of the hardware and software that didn't work under the NT kernel to fade off into the sunset. The market moved over to PCI from ISA, also rather gradually. Sound Blaster compatibility isn't a requirement anymore. Going from 16 to 32 bits was a bit of a change, but changing architectures altogether wouldn't have been *that* much harder. Going from 32 to 64 bits should be mildly easier, because the only real change is the size of a few standard defined variables and some minor API tweaks, not a whole new API.
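As a hypothetical example of what "the size of a few standard defined variables" means in practice for a 32-to-64-bit port (the function here is invented, assuming a C99 compiler for intptr_t):

    #include <stdint.h>

    /* Code that assumed a pointer fits in an int worked on 32-bit
       systems but truncates the address under a 64-bit ABI:

           int handle = (int)p;        (loses the top 32 bits)

       The 64-bit-clean version uses an integer type defined to be
       wide enough for a pointer -- the kind of small, mechanical
       change such a port mostly consists of. */
    void *roundtrip(void *p)
    {
        intptr_t handle = (intptr_t)p;   /* correct on both ABIs */
        return (void *)handle;
    }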
The big thing about the Athlon 64 that's in AMD's favor is the same sort of thing. Because buyers get it "for free", x86-64 is likely to hit critical mass, especially among the power users, before the Itanium does. Drivers will be an issue, but MS can take, and has taken, drivers out of the hands of the individual manufacturers when it helps them push things the way they want. I'm betting that MS would be happy with *either* the Athlon 64 or the Itanium, just as long as they don't get left behind. ;)
Flamebait Sr.
Monday, October 20, 2003
Apart from anything else, how about the Itanium's general design (instruction set, addressing modes, register renaming, etc.)?
I've purged the details from my brain now, but I remember reading the specs a while back and thinking that they reeked of overcomplexity. My basic feeling was that it was just all wrong, and that if that was the future of computing, then things were looking pretty dim.
My theory is that they designed the chip to be hellishly complex in order to make it extremely difficult to clone.
cheers,
Peter
Peter McKenzie
Tuesday, October 21, 2003
The central idea was to take a lot of the smarts out of the instruction decoder and the various processor performance optimizations, and replace them with smarts in the compiler. The processor would guess less, and the compiler would specify the expected case more.
Some of it's not half bad, like forcing the compiler to indicate which operations in a block of instructions can be parallelized, so that the CPU doesn't need to guess or depend on prefetch.
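As a rough illustration of what "the compiler indicates the parallelism" means (this is plain C standing in for the idea, not real IA-64 code):

    /* An EPIC compiler can place the two independent adds in the
       same instruction group, free to issue in parallel, and then
       mark an explicit stop before the final add, which depends on
       both.  The CPU never has to rediscover that dependence at
       runtime the way an out-of-order x86 core does. */
    long sum4(long a, long b, long c, long d)
    {
        long ab = a + b;    /* independent of the next line...    */
        long cd = c + d;    /* ...so both can share a group       */
        return ab + cd;     /* dependent: placed after the stop   */
    }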
The biggie is that no human was expected to write good Itanium assembly by hand, so making the ISA clean for human eyes wasn't as important as making it clean for the compiler's and the processor's eyes. Thus the complexity. Which isn't entirely a bad thing, but there are probably better ways to do things. The dynamic recompilation work that both IBM and DEC->Compaq->HP were doing got my vote, and it looks suspiciously like the AS/400, which, incidentally, dumb-lucked itself into an underappreciated good design...
Flamebait Sr.
Tuesday, October 21, 2003