Why only 2 GB?
I'm sure I'll get it by the time I'm finished reading "Inside Windows 2000," but anyway, so long as memory is virtual, why do applications get 2 GB and the system, 2 GB?
They could get 4 GB each and everyone would be happy.
Alex
Monday, December 1, 2003
Without knowing anything about Windows, I would guess it's because it basically uses a 32-bit address space divided in two. So the applications get 31 bits (2 GB), the kernel gets 31 bits (2 GB), and there is a bit to indicate whether the address is for applications or for the kernel.
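If I had to sketch the idea in C (just an illustration of the half-and-half split, not a claim about how Windows actually decides anything), it would look something like this:

/* Rough sketch of the 2 GB / 2 GB idea: with a half-and-half split,
   the top bit of a 32-bit address tells you which half you're in. */
#include <stdio.h>
#include <stdint.h>

static int is_upper_half(uint32_t addr)
{
    /* Addresses at or above 0x80000000 fall in the upper 2 GB,
       which the OS keeps for itself under the default split. */
    return (addr & 0x80000000u) != 0;
}

int main(void)
{
    printf("%d\n", is_upper_half(0x0012FF7Cu)); /* typical user-mode address -> 0 */
    printf("%d\n", is_upper_half(0x80501234u)); /* upper half -> 1 */
    return 0;
}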
Roose
Monday, December 1, 2003
I think Linux defaults to a 3 GB userspace, and Windows 2000 Advanced Server offers a boot flag to use a 3 GB userspace. Linux 2.6 also has an experimental patch (not yet merged into mainline) to allow a 4 GB userspace! The problem is that there is a 5-10% performance hit, because every user-to-kernel and kernel-to-user context switch requires the CPU's TLB to be flushed - you are entering a different address space. :-(
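On Windows you can see where your own process's usable range ends with GetSystemInfo - just a quick sketch, but on a stock 32-bit box the maximum application address comes back just under 2 GB, and it moves up if you boot with the larger userspace and your exe is large-address aware:

/* Quick sketch: ask Windows where the usable user-mode range ends.
   On a default 32-bit setup lpMaximumApplicationAddress is just below
   0x80000000 (2 GB); with a 3 GB userspace it sits just below 0xC0000000. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    SYSTEM_INFO si;
    GetSystemInfo(&si);
    printf("user space: %p .. %p\n",
           si.lpMinimumApplicationAddress,
           si.lpMaximumApplicationAddress);
    return 0;
}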
runtime
Monday, December 1, 2003
For a great example of how backwards compatibility can cause headaches long after the problem has gone away, see Raymond Chen's discussion of address space, which explains why you don't technically even get the full 2 GB of address space:
http://blogs.gotdotnet.com/raymondc/PermaLink.aspx/96609a14-3f09-436c-9f87-b2a6372c3dff
Rob Walker
Monday, December 1, 2003
Seems to me that if the kernel-mode address space were completely distinct from the user-mode address space, as 4 GB each on a 32-bit machine would imply, you couldn't pass references into kernel functions. After populating your nice data structure or character array at address 18492, when you passed 18492 into the kernel's WinDoSomething function, the kernel code would see something completely different at that address than your user code does, because its address space is separate.
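Every Win32 call that hands the kernel a buffer relies on that shared address space - here's a minimal sketch with ReadFile (the path is only an example):

/* Minimal sketch: the buffer lives at a user-mode address, and that same
   address has to mean the same bytes when kernel-mode code fills it in.
   "C:\\boot.ini" is just an illustrative path. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    char buffer[256];
    DWORD bytesRead = 0;
    HANDLE h;

    h = CreateFileA("C:\\boot.ini", GENERIC_READ, FILE_SHARE_READ,
                    NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h == INVALID_HANDLE_VALUE)
        return 1;

    /* The kernel writes into 'buffer' at the very address we pass in. */
    if (ReadFile(h, buffer, sizeof(buffer) - 1, &bytesRead, NULL))
    {
        buffer[bytesRead] = '\0';
        printf("read %lu bytes\n", bytesRead);
    }
    CloseHandle(h);
    return 0;
}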
Besides, what are you griping about? 640K should be enough for anybody. :-)
charles
Monday, December 1, 2003
That 2 GB is becoming a real limitation. Even the most basic machines for business are now ordered with 512 MB, and anything that will actually have to do some work is ordered with 1 GB. Extrapolating from the past, that will move up to 1 GB - 2 GB - 4 GB in the 2004-2005 timeframe. Better ramp up on consumer 64-bit systems fast.
Just me (Sir to you)
Tuesday, December 2, 2003
Is the average business machine running Microsoft Office hitting the 2 GB limit on a regular basis then? I very much doubt it.
John Topley (www.johntopley.com)
Tuesday, December 2, 2003
"...640K should be enough for anybody."
640K? Luxury! ...my first machine had 4K - but I did buy the 16K RAM pack that plugged in the back. That was over 20 years ago, mind. God, I feel old.
A cynic writes
Tuesday, December 2, 2003
Cynic,
I bit the bullet and had 16K in my first 8-bit Atari machine.
4K? Did you get the VIC-20? :-)
6502 ruled
Tuesday, December 2, 2003
You're not seriously telling me that secretaries using Word on a box with 256 - 512 MB of RAM are complaining on a regular basis that their machines are running low on memory? I don't buy it. The majority of users running business applications do not need 64-bit computing.
John Topley (www.johntopley.com)
Tuesday, December 2, 2003
John,
I am not saying they are complaining. I'm saying three years ago a basic box came with 128 MB. Now 512 MB is standard. What will it be three years from now? At the same growth rate, that will be 2 GB.
Just me (Sir to you)
Tuesday, December 2, 2003
6502
The machine was a Sinclair ZX80 I bought off a mate in 1981 - 4K, with an extra chip he'd bought that meant you could have moving graphics, and scratch marks where one of his dogs had chewed it. The 16K RAM pack was a wobbly brick that plugged in the back, and the printer used sparks to burn onto conductive paper.
The ZX80 was superseded by the ZX81, which was in turn replaced by the Spectrum (which people have actually heard of). We used to dream of VIC-20s. Bizarrely, my parents still have the machine and my 5-year-old wants it (being dissatisfied with the Pentium II he has now). I'm afraid it might still work.
Meanwhile at school (I was ~15 in '81) we had the use of two Commodore PETs, but could also occasionally send off stuff on cards to the county mini. The cards we used were considered advanced because you didn't need to punch holes, just black out boxes with a pencil. What is even more frightening, we used to use the same things for exams (multiple choice, obviously).
(Obligatory Monty Python reference)
And you tell kids that today and they don't believe you.
A cynic writes
Tuesday, December 2, 2003
Hitting 2GB of RAM doesn't require moving to a 64 bit system. The 2GB address space limit is *per process*.
You could have 4GB of RAM and still be running a 32 bit OS and using every bit of RAM.
With the right processor (one that supports PAE) you can be running with 32 GB of RAM on a 32-bit system.
I think the move to 64 bit on the desktop is a ways off.
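As a quick sketch of how the two numbers are independent, GlobalMemoryStatusEx reports the physical RAM in the box and the current process's virtual address space separately:

/* Quick sketch: physical RAM in the machine vs. the address space this
   one process gets are two different numbers. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    MEMORYSTATUSEX ms;
    ms.dwLength = sizeof(ms);
    if (!GlobalMemoryStatusEx(&ms))
        return 1;

    printf("physical RAM        : %lu MB\n",
           (unsigned long)(ms.ullTotalPhys / (1024 * 1024)));
    printf("virtual, per process: %lu MB\n",
           (unsigned long)(ms.ullTotalVirtual / (1024 * 1024)));
    return 0;
}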
Rob Walker
Tuesday, December 2, 2003
JustSir assumes that such trends continue linearly. That's not so. I'm sure you could have done similar extrapolations 80 years ago about automobile speeds, thinking by now we'd all be driving thousand-mile-per-hour cars. Don't mistake an infant's growth trend as a permanent feature.
In the Windows world, the 500 MHz PC with 256 MB of RAM was the last machine that, say, a secretary or manager will ever truly need. Everything past that is wasted gravy for them. C programmers (C++/C#/ObjC) can make good use of any megahertz available to speed up compilation; Java compilation is too fast to worry about above a 1 GHz P4 or a 600 MHz PPC with 512 MB. Besides, now that all the good ones do TDD, we probably don't recompile as often anyway. Server programmers of both stripes gain from extra memory to run databases and server processes on their laptops, but only up to around 1 GB or so for most. Even gamers aren't pushing the envelope as hard as they once did.
Ignoring scientists, who probably have big clusters to do their intensive work, graphic artists and music and movie folks are about the only big group that can gain much productivity from faster personal systems and more memory. As of 2003, they already get 64-bit machines with a 1 GHz+ FSB and 8 GB of RAM: the Apple G5. Within two years Apple and IBM will take artists to their natural saturation point too.
charles
Tuesday, December 2, 2003
Charles,
The idea the "less sofisticated" users do not need much of a machine is something I have been hearing time and time again. It is wrong.
Home users are flocking to digital media. These applications eat memory faster than you can say swapspace. They are also very impatient.
The less IT savvy the user, the more intelligent the app has to be. If we are going to go serious AI for supporting the common user, we better throw some hardware at it, since that is usually the cheapest way of getting there if at all.
Rob, in what timeframe do you feel that the 2 GB limit will become noticeable on the common desktop? Is total installed memory an indication, keeping pace but simply running a bit in front? That would make sense if we are running a more or less constant number of concurrent apps throughout the years.
Just me (Sir to you)
Tuesday, December 2, 2003
If it's available, users will want it - more memory, faster processor, whatever. Moore's Law will cease when it ceases to be financially worthwhile. Until then there will always be someone crying that their system isn't fast enough or clever enough or doesn't talk to them.
I'd suggest the reason for any memory limit is that it was convenient *at the time*. When changing becomes financially worth the bother, it will be changed.
In reality, does (group A) *need* (insert upgrade) to do their job? Probably not. But want isn't the same as need, and if it's cheap enough and shuts the users up, people will buy the kit - if only for a quiet life.
A cynic writes
Tuesday, December 2, 2003
The problem is not so much memory capacity as address space. There is a trend to move away from the concept of 'reading a file into memory' toward system calls which simply map part or all of the file into a process's virtual address space. These use the OS's underlying (and efficient) virtual-memory paging routines to provide the illusion that all the file has been read in when it is actually only paged in/out as required. More importantly, it avoids unnecessary copying between kernel & user space.
However, a 2 GB file is no big deal these days (think video editors), and realistically, to map even a 1 GB file into a 2 GB address space you have to start pulling dirty tricks like mapping different parts of the file and moving/copying between them. It gets much easier with a 64-bit virtual address space, even if there is no more physical RAM.
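On Win32 the trick looks roughly like this (a sketch only - the file name and the 64 MB window size are made up, and offsets have to be multiples of the 64 KB allocation granularity):

/* Sketch: with only 2 GB of address space you map a sliding window over
   a big file instead of the whole thing. "capture.avi" and the 64 MB
   window are illustrative values only. */
#include <windows.h>
#include <stdio.h>

#define WINDOW_SIZE (64 * 1024 * 1024)  /* 64 MB window */

int main(void)
{
    HANDLE file;
    HANDLE mapping;
    const BYTE *view;

    file = CreateFileA("capture.avi", GENERIC_READ, FILE_SHARE_READ,
                       NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (file == INVALID_HANDLE_VALUE)
        return 1;

    mapping = CreateFileMappingA(file, NULL, PAGE_READONLY, 0, 0, NULL);
    if (mapping == NULL)
        return 1;

    /* Map a 64 MB window starting 64 MB into the file (the offset is
       passed as high/low 32-bit halves). To walk a big file, unmap and
       remap the window further along instead of mapping the whole thing. */
    view = (const BYTE *)MapViewOfFile(mapping, FILE_MAP_READ,
                                       0, WINDOW_SIZE, WINDOW_SIZE);
    if (view != NULL)
    {
        printf("first byte of this window: 0x%02X\n", view[0]);
        UnmapViewOfFile(view);
    }

    CloseHandle(mapping);
    CloseHandle(file);
    return 0;
}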
And your office PC user will soon be editing video clips into a Powerpoint presentation...
Max Hadley
Tuesday, December 2, 2003
Video clips in PowerPoint - the buggers do that already.
I wouldn't mind - but in my experience they don't think to forward on the actual clip so all you get is a single frame. You then have to explain how to send the clip in words of one syllable and edit the damn thing yourself.
I'm over-communicating again, aren't I?
A cynic writes
Tuesday, December 2, 2003
I don't expect the 2 GB limit to be a problem for most users for the 'foreseeable' future - say at least 4 years. Most likely a few beyond that too.
Moving to 64-bit isn't as simple as recompiling your app. Most software written for Win32 in C/C++ will require some work to ensure that it doesn't pull any nasty cast stuff or make assumptions that are no longer valid.
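The classic case is stashing a pointer in a 32-bit-sized integer, which happens to work on Win32 and silently truncates on a 64-bit build - a minimal sketch:

/* Minimal sketch of an assumption that breaks on a 64-bit build:
   a pointer squeezed into a 32-bit-sized integer. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    int value = 42;
    int *p = &value;

    /* Happens to work on Win32, where pointers and unsigned long are both
       32 bits; on Win64 (LLP64) unsigned long stays 32 bits and this
       truncates the pointer. */
    unsigned long bad = (unsigned long)p;

    /* Portable: uintptr_t is guaranteed wide enough to hold a pointer. */
    uintptr_t good = (uintptr_t)p;

    printf("sizeof(void*)=%u sizeof(bad)=%u sizeof(good)=%u\n",
           (unsigned)sizeof p, (unsigned)sizeof bad, (unsigned)sizeof good);
    return 0;
}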
Even if your software compiles cleanly, does it use any third-party components/libraries? In the Windows world, at least, a 64-bit process can't load a 32-bit DLL, so if you use an in-proc COM component for something you have to wait until the vendor releases their own 64-bit version. (Or jump through some nasty hoops to get the component hosted out-of-process and hope performance isn't an issue.)
I agree that having 64 bits of address space makes writing some code easier - but easier doesn't necessarily mean better. Just because I can map a multi-gigabyte file into my address space and let the OS handle the physical memory allocation through paging doesn't mean I should.
A video editor can do a much better job of managing its own memory usage than the OS will do for it.
Other recent threads have touched on the performance disparity between processors and memory, so having a lot of memory won't necessarily be the best way to speed up your program.
I think the move to 64 bit will happen, but it will take much longer than the 16 to 32 transition.
Rob Walker
Tuesday, December 2, 2003
"Moving to 64 isn't as simple as recompiliing your app."
It's no coincidence that Microsoft is pushing everybody onto .NET at right about the same time that 64-bit Windows is becoming a reality. With managed code, you indeed do get 64-bit compatibility without recompiling your app.
Brad Wilson (dotnetguy.techieswithcats.com)
Tuesday, December 2, 2003
I don't know much about the current state of 64-bit... but won't Windows provide backward compatibility? So you can run 32-bit executables on a 64-bit platform? That's compatibility without recompiling your app.
And it seems that is basically the same thing as having a virtual machine. In both cases you have some intermediate format being translated into native machine code. It's just that in one case, the intermediate format is some other machine code.
Roose
Wednesday, December 3, 2003
"Ignoring scientists, who probably have big clusters to do their intensive work, graphic artists and music and movie folks are about the only big group that can gain much productivity with faster personal systems and more memory. As of 2003, they already get 64bit machines with 1Ghz+ FSB and 8G RAM, the Apple G5. Within two years Apple and IBM will take artists to their natural saturation point too. "
I agree with your general sentiment, but I beg to differ on the two years. I work on video games, and it still takes about 8 hours every night with 12 or so machines to do our nightly builds (processing art data, not compiling code). Sure, we could speed it up, but the manpower to do so costs more than the hardware.
It would tremendously improve productivity if that same process took 1 second on a single machine.
So we have a ways to go. And with the next generation of consoles, we will have more and more art to process. Development is funny like that: as the target machines (video game consoles) get faster, development requires more processing power too.
And it's not just CPU speed... we could easily use more disk space (for archiving 4-gig DVD images, storing video, and probably 20+ gigs per DVD for the next-generation Blu-ray or whatever). Copying these files around is not trivial either; we could use some Gigabit Ethernet and probably even more than that. That would improve productivity as well.
Roose
Wednesday, December 3, 2003
"A video editor can do a much better job of managing its own memory usage than the OS will do for it."
Rob, I agree, but if trends show anything, it is that we are always going for more abstraction and easier VM models, not for squeezing out more power by micromanaging the lower layers.
Isn't "just let go" one of the mantras of .NET?
Just me (Sir to you)
Wednesday, December 3, 2003