Fog Creek Software
Discussion Board

Moore's Law is coming to an end.  Is that bad?

Interesting article in today's _New York Times_:

http://www.nytimes.com/2003/04/09/technology/09MOOR.html

Frankly, I'd rather see a much greater emphasis on usability than on faster-and-cheaper.

Case in point: my parents recently bought a combination DVD/VCR made by Philips. The controls, user interface, and manual are baffling, even by the (low) standards of home electronics. It really puzzles me why companies devote so many resources to improving hardware, and so little effort to making the end product easy to use.

J. D. Trollinger
Wednesday, April 09, 2003

J.D., from a troller to flamebait: Moore's Law coming to an end (which has been predicted for about as long as the exhaustion of the IPv4 32-bit address space) won't mean that we'll all focus on the UI.

No, it's much much much worse. In order to keep getting more computing power, we'll need to use various forms of distributed computing to leverage new technologies.

You'd better pray to the god of computing every night that researchers get the BoundsChecker equivalent for massively multithreaded software together, and that there are enough iterations of Moore's Law left for all of us to actually run said software, because otherwise you are going to be popping Excedrin like there's no tomorrow. Threads are FUN! ;)
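For the unconvinced, here's a toy sketch (plain C with POSIX threads, my own illustration, not anything from the article) of the kind of silent data race such a tool would have to catch:

    #include <pthread.h>
    #include <stdio.h>

    static long counter = 0;      /* shared and unprotected: the bug */

    static void *worker(void *arg) {
        for (int i = 0; i < 1000000; i++)
            counter++;            /* read-modify-write race between threads */
        return NULL;
    }

    int main(void) {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, worker, NULL);
        pthread_create(&t2, NULL, worker, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        /* Almost never prints 2000000; the lost updates are silent. */
        printf("counter = %ld\n", counter);
        return 0;
    }

It compiles, it runs, it gives a different wrong answer every time, and nothing crashes. Now imagine that spread across a few hundred machines.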

flamebait sr.
Wednesday, April 09, 2003

It's probably because companies believe that it's the "features" that sell the product, rather than its usability.

Chi Lambda
Wednesday, April 09, 2003

I propose a new law, a corollary to Moore's Law (who do we name it after?)

COROLLARY: Every 3 months some journalist somewhere in the world will predict the end of Moore's Law. Every year said journalist will be proven wrong. Film at eleven.

Sergent Sausage
Wednesday, April 09, 2003

Sergeant Sausage's Law.

At least it'll stick in the memory. :)

Dustin Alexander
Wednesday, April 09, 2003

In May 1996, Roy Brander warned of the end of Moore's Law ( http://www.cuug.ab.ca/CUUGer/9605/editoral.html ).

In July 1999, Jeffrey R. Harrow warned of the end of Moore's Law ( http://h18000.www1.hp.com/rcfoc/19990712.html ).

In May 2000, Intel's Paul A. Packan warned of the end of Moore's Law ( http://www.imakenews.com/techreview/e_article000003598.cfm ).

In October 2001, Ben King warned of the end of Moore's Law ( http://www.silicon.com/analysis/500007/1/1028346.html ).

In December 2002, Intel's Andy Grove warned of the end of Moore's Law ( http://www.theinquirer.net/?article=6677 ).

Moore's Law continues to hold true.

Brent P. Newhall
Wednesday, April 09, 2003

I'm still trying to figure out why my damn PlayStation2 remote doesn't have an Off button. I mean how in the hell did that slip by Usability?


I wonder how many times Moore's Law can come to an end. The press hits this about 500 times a year. I also wonder how advances in quantum computing will affect this.

trollbooth
Wednesday, April 09, 2003

It's a tad cavalier to assume that Moore's Law will always hold true. The one-atom feature-size limit looks quite intractable given our current knowledge, although I have confidence that there's probably a way around that one, too.

Moore's law may go on hold for a span of time, waiting for a practical physics breakthrough.

flamebait sr.
Wednesday, April 09, 2003

There is one part of Moore's Law that came to an end somewhere towards the end of the last decade while the trumpets remained strangely mute, and that was the part about the investment required to produce said piece of silicon doubling every eighteen months.

I do remember that in the middle of the nineties many bottom-end computers were being sold with cloned Pentiums (this was the origin of the "Intel Inside" sticker), but the application of Moore's Law with regard to investment had driven them all out of business by 2000, and we have been stuck with the penny-farthing situation we are in now.

Stephen Jones
Wednesday, April 09, 2003

"I wonder how many times Moores law can come to an end."

Once every 18 months, I should think.

Danil
Wednesday, April 09, 2003

In a speech I heard Gordon Moore give about a year ago, he also mentioned that 'Moore's law' seems to be ending.

Talking to people who design chips for a living, the limit seems to be 'soft' rather than 'hard'. Making masks is very expensive for leading-edge semiconductor processes, and few integrated circuits sell with the volume and price necessary to pay back those costs. For example, x86 processors seem to have beaten the RISC processors in the desktop market, because only x86 processors have the volumes necessary to justify the costs of using leading-edge processes.

So, I think that for extremely high-volume parts (memory, x86 CPUs, etc.), Moore's law will continue to apply for some time, but for lower-volume parts, development will slow and process improvements will be few and far between. A consequence will be that programmable logic (FPGAs) will become more attractive relative to ASICs and low-volume custom ICs.

The people who will be hurt most will be ASIC designers, especially those who deal with chip layout rather than logic design. On the other hand, the opportunities for engineers who design FPGAs will improve. I expect to see many embedded engineers work with both software and FPGAs.

Dan Brown
Wednesday, April 09, 2003

Unfortunately, I will never be able to put "Sergeant Sausage's Law" out of my mind. "Sergeant Sausage's Law" is a new meme. Maybe Joel's good Google PageRank karma will help propagate the "Sergeant Sausage's Law" meme.  ;-)

runtime
Wednesday, April 09, 2003

MIPS/$ has been steadily increasing for the past few decades. Once Moore's "Law" hits the brick wall, we will see two responses:

1. Same MIPS for fewer $. Thus MIPS/$ will still increase but now because $ is decreasing.

2. More SMP.

I guess there is also an Option 3 which combines 1 and 2: more SMP using cheaper processors. Today's SMP users want maximum performance, so they are doing SMP with top-of-the-line processors. When processor costs decrease, SMP with cheap processors will give the same performance for less money. Then people will use SMP to save money, not for more performance! Same bang, fewer bucks, so who cares how many processors are inside the box?
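To put some purely hypothetical numbers on Option 3: if a top-of-the-line chip delivers 3,000 MIPS for $600 (5 MIPS/$) while a commodity chip delivers 1,000 MIPS for $100 (10 MIPS/$), then a 3-way SMP box built from the cheap parts matches the big chip's 3,000 MIPS for half the processor cost.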

I'm currently reading Clayton Christensen's "The Innovator's Dilemma". My Option #3 seems like it might fit his hypothesis about underdog entrants' radical innovations unseating the established companies' sustaining innovations.

runtime
Wednesday, April 09, 2003

I'm thinking:

'The law of one Moore time'

Maybe not

#If
Wednesday, April 09, 2003

Personally, I'm looking forward to Moore's Law finally running out, because then we can get down to writing some quality software. Right now Moore's Law just keeps rescuing programmers who aren't up to the job.

Also, once Moore's Law is over, we won't have to learn everything anew every decade.

Besides, when you look at platforms whose hardware doesn't get faster as quickly as the PC's, such as the consoles, you see the same kind of progress in the quality of software; it's just that the programmers really get to know the hardware.

It's similar to the old 8-bit days. I remember being amazed by 3D Starstrike on the Sinclair Spectrum. Five years later there was Starglider, and after that Carrier Command. The same 3.5 MHz Z80 processor, just a bit more memory and a whole load of coding talent.

Ged Byrne
Wednesday, April 09, 2003

I'm no expert (so don't flame me), but for the sake of discussion, the following are tidbits I've read here or there:

1. Moore's Law is destined to come to an end when it nears the limits of the laws of physics (next 10 years?).

2. Currently processors operate in a single plane (i.e., one layer of silicon). There is research being done to build multi-layered processors, which will extend the life of Moore's Law.

3. When Moore's Law does reach its limit at the processor level, there are other things that can be done to extend it at the system level, such as further improvements in bus speed and integration of ancillary circuits (video, networking, etc.) into the processor.

Nick
Wednesday, April 09, 2003

"Physical Limits of the Universe" is a phrase that always makes me snicker.  Scientists are "discovering" that light speed isn't the boundary they thought it was, new protocols allow for terrbytes to moved in hours, and we have barely touched on nanotechnology; plus, don't get me started on the mathematical theories we're just discovering to put to use with modern hardware.  We're no where near a brick wall.

Andrew Burton
Wednesday, April 09, 2003

I suppose we have a simple example of what kind of processing power is physically possible: our own brains.

Since we know that the physical laws of the universe allow at least that much power, it would seem we have quite a way to go yet.

Ged Byrne
Wednesday, April 09, 2003

Moore's Law should be good for another decade. People are already looking at alternatives, though, like carbon nanotube technology.

Prakash S
Wednesday, April 09, 2003

"Same bang, fewer bucks, so who care how many processors are inside the box?"

You don't own an Athlon XP, do you... :) How do you like your eggs?

Geoff Bennett
Wednesday, April 09, 2003

I don't know if Moore's Law is coming to an end, but I hope it is. Last night I booted up my son's IBM notebook (233 MHz Pentium) running Windows 95 in order to use IE4. Not only did it boot vastly more quickly than my supposedly much more powerful work machine running Windows 2000 SP 9536, but IE4 was far less cluttered and more pleasant to use than the later versions. If only machines stopped getting faster, software producers might be forced to stop ramming gigabytes of filthy, insanely complicated bloatware down our throats. And personally I do include Linux under the heading of bloatware.

ajs
Thursday, April 10, 2003

There is no off button on a PlayStation remote, to reduce the incidence of haemorrhoids in the population.

Simon Lucy
Thursday, April 10, 2003

W2000 is notoriously slow to boot. The upside is that you don't need to reboot it that often (though with a laptop there are other considerations).

When Win 98 first came out there was a bug that meant it would crash if it ran 49.7 days (2^32 milliseconds) without rebooting. Bets were taken on whether anybody would be able to run Win 98 for that long without it crashing, and the only known case of it happening was where somebody booted the machine up at work, and after he went home the cleaner put the machine away in a cupboard, where it stayed for a few months, up but without actually running any programs.

Recommended intervals for doing a format and clean install of Windows 95/98FE varied from six weeks for magazine reviewers to a year or eighteen months for light users. How many people do you know who now regularly reinstall W2000/XP?

If your version of IE5/6 is more bloated than your son's version of IE4, then it's because you have decided to install more features.

If you don't like bloatware then DON'T INSTALL THE BLOAT IN THE FIRST PLACE. This is particularly true of Linux installations, but it does apply to a lesser extent to both XP and W2000.

Stephen Jones
Thursday, April 10, 2003

I find the "I'm hoping Moore's law will end" comments pretty sad.
More power gives you more options. You can waste them, or you can use them. Seems like you guys would like everybody to be stopped just so your own limited imagination would not stand out.
More people can now use and even program computers than before. This is largely due to advances in usability and abstraction which we can now afford. The bar for being able to use or program a computer should not be raised; it should be lowered until it hits the floor!

Just me (Sir to you)
Thursday, April 10, 2003

Moore's Law will come to an end; it's inevitable. Not a matter of if, but of when. For now, though, as R. Feynman said, "There's plenty of room at the bottom," meaning we can still squeeze more information into the silicon.

The upper bound on computational power is set mainly by two things:
1) The energy required for a computation is inversely proportional to the computation speed (E x t = const, where the constant depends on the technology; reversible logic gates make a very interesting case). For a given technology there is a speed limit beyond which the heat dissipation becomes destructive. (See the quick worked note after point 2.)

2) The uncertainty principle limits our ability to distinguish between states of micro-objects. Even if computers are based on quantum technology, we are still limited by a minimal action of order hbar to induce or detect a state change in the computer (i.e., there is a limit on how much information we can store in a bit of matter). Plus, we are not talking classical computers anymore but quantum computers, and that alone turns all programming concepts around. With classical technology we have to keep the reads and writes in the circuitry in the macro-space, where we can clearly distinguish between states and end results have a probability close to 1.
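A quick worked note on point 1 (just unpacking the relation as stated, with k standing for the technology constant): if E x t = k, then the energy per operation is E = k/t, and the power dissipated per gate is P = E/t = k/t^2. So halving the switching time t, i.e., doubling the clock rate, quadruples the heat each gate has to shed, which is why dissipation, rather than switching speed itself, tends to hit the wall first.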

To conclude, Moore's Law may stick around a bit longer at the price of expensive technology changes, but one thing is sure: at some point Moore's Law will break.

Cheers
Dino

Dino
Thursday, April 10, 2003

[There is no off button on a PlayStation remote, to reduce the incidence of haemorrhoids in the population.]

I expect you mean hemorrhoids, and I take that to mean I have to get off my ass. Well, that's what I do now. But if I had an off button I could *gasp* actually turn the thing off without getting off my ass. Now that is a REMOTE control. If I want a workout I'll go to the gym.

trollbooth
Thursday, April 10, 2003

Andrew,

[Scientists are "discovering" that light speed isn't the boundary they thought it was]

What exactly are you referring to?

Plus, with current technologies we may be approaching a brick wall quite fast, because the growth is exponential.

For example, if current technology allows us to double speed every 8 months and the maximum technological limit is 200 GHz, then from today's 2 GHz we need only 4 to 5 years to get there (2, 4, 8, 16, 32, 64, 128, 256 GHz).
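A throwaway C sketch to check that arithmetic (the 8-month and 200 GHz figures are just the hypothetical ones above, not predictions):

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        double start_ghz = 2.0, limit_ghz = 200.0;
        double months_per_doubling = 8.0;
        /* doublings needed to go from 2 GHz to 200 GHz: log2(100) ~ 6.64 */
        double doublings = log2(limit_ghz / start_ghz);
        double years = doublings * months_per_doubling / 12.0;
        printf("%.2f doublings, %.1f years to the wall\n",
               doublings, years);   /* prints ~6.64 doublings, ~4.4 years */
        return 0;
    }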

Cheers,
Dino

Dino
Thursday, April 10, 2003

It's important to note that Moore's law has nothing to do with processor speed. It is strictly concerned with the number of transistors that can be fit onto a piece of silicon. The "law" doesn't state that processor speeds will double every eighteen months. It says that the number of transistors available on a chip will double every eighteen months.
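Written as a formula (using the eighteen-month doubling period quoted in this thread; Moore's own estimates over the years ranged from one to two years): transistors(t) = transistors(0) x 2^(t / 18 months).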

So, yes, at some point we will reach the end of Moore's law.

But does that mean our computers will stop becoming more powerful? Absolutely not.

The most obvious next step is to start working more with symmetric multiprocessing. At the moment, that means putting more processors into the same machine. But Intel, with the release of its hyperthreading technology, is starting to show that you can put two processor pipelines onto the same chip, and that one of them can do work while the other one is waiting for data.

After we exhaust the possibilities of symmetric multiprocessing, we'll probably work more in depth with asymmetric multiprocessing, which will involve dividing processor tasks into more specialized chunks. Much in the same way that a dedicated graphics card can offload 3D rendering tasks from the CPU, asymmetric multiprocessing will encourage a computer to use more and more specialized chips and fewer generalized chips. Maybe in twenty years we'll have specialized database-processor chips.

By the time we've exhausted the possibilities of asymmetric multiprocessing, we'll probably have the ability to implement optical data-transfer subsystems to replace the buses in today's machines. And, since I/O bandwidth is the biggest bottleneck in a modern computer, moving the bits around at the speed of light will be a major improvement.

In short, even though the end of Moore's Law will surely catch up with us, computers will never stop getting more powerful. There will always be new and clever techniques that can be used to add more processing power to a computing setup, much to the dismay of pundits and doomsday prophets in the tech industry.

Benji Smith
Thursday, April 10, 2003

[remote control]
"if I had a off button I could *gasp* actually turn the thing off without getting off my ass."

No, you could put it into a kind of standby state which still draws power. ISTR reading that this costs the average American household over $200 per year in wasted electricity, along with the associated pollution from generating that electricity. All because they can't be bothered to get off their fat arses to turn the thing off properly.

"Now that is a REMOTE control"

Buy a stick. Or alternatively buy r/cs for your plug sockets; we have them and they work just fine, since they actually cut the power to the device.


Friday, April 11, 2003

Why does every discussion related to Americans have to be a discussion about how we are all fat and lazy? Are you calling me fat? Because I am built like Adonis, goddammit. I got a BETTER IDEA. How about they design the damn thing RIGHT in the first place and you go along with it.

FYI, there is NO off button on the remote that puts it into a standby state; that is on the machine itself. I turn the thing totally off because I don't like outrageous electric bills or wasting electricity.

I should buy a stick! To whip your ass with, calling me fat, goddammit. I deserve better. My fragile ego is falling apart now. The huge chip on my shoulder has been knocked off. I hope I can make it through the day.

trollbooth
Friday, April 11, 2003

Dear Trollbooth,
The off button on the remote puts the machine into standby. If it didn't, you wouldn't be able to turn it on afterwards with the remote control.

I very much doubt the $200 a year figure, though. Those LCD screens really use very little electricity. Even a monitor on standby only draws about 5 watts.
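Rough arithmetic to back that up (assuming something like $0.10 per kWh, which is my guess, not a figure from this thread): 5 watts running all year is 5 W x 8,766 h, or about 44 kWh, which comes to roughly $4 or $5 per device per year. You would need dozens of devices sitting on standby to get anywhere near $200.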

Stephen Jones
Friday, April 11, 2003

Dear Stephen,
My PS2 remote does not have an off button of any kind. I understand the difference between completely off and standby mode for most devices, like a TV for instance (if there is no power at all, how can the device listen for the remote signal?).

trollbooth
Monday, April 14, 2003
