Sunday, April 24, 2005

Hanselman's List: Memory & Processes

More from Scott Hanselman's list of what great developers should know: "What is the maximum amount of memory any single process on Windows can address? Is this different than the maximum virtual memory for the system? How would this affect a system design?"

NT-based OSes split memory into user and kernel space. How much memory is usable depends on the exact flavor of the OS and whether the processor is 32-bit or 64-bit. In the 32-bit world, XP and Win2K both support 4GB, Win2K Advanced Server supports 8GB, and Win2K Datacenter Server supports up to 32GB. XP and Win2K's 4GB gets split 50/50: 2GB is available to each process and the remaining 2GB is reserved for the kernel. (Note that this last space isn't just for the OS -- hardware devices use it too, so that killer graphics card you got with gobs of video RAM eats into kernel space as well. This is true on both 32- and 64-bit processors.) XP Pro and Win2003 Server can use the /3GB switch in boot.ini to shift the split and make 3GB available to the user side. The underlying 4GB ceiling isn't a Windows limit; it's a 32-bit processor limit (2^32 addresses), and it hits Linux, Solaris, etc., too -- each OS just picks its own user/kernel split within it. (I'm sure Macs don't have this problem because Macs are the perfect system and have no limitations whatsoever. Ask any Mac user.)

64-bit Windows supports up to 16 terabytes of virtual address space (woo hoo, gimme one o' dem filled up!) but still splits memory 50/50 between user and kernel. 32-bit applications are still limited to 2GB of user space, and 64-bit Windows versions don't support the /3GB switch.

So yes, a process's maximum memory is different from the system's total virtual memory. System design is affected if you're building a system around very large data structures, particularly if your software runs alongside other apps. Your design needs to use memory carefully to make sure you're not going to run out of virtual address space.
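The splits above boil down to simple arithmetic, so here's a quick sanity check in Python. The constants are the figures from this post (4GB 32-bit address space, the /3GB switch, and 16TB on 64-bit Windows); the variable names are just mine.

```python
GB = 2**30
TB = 2**40

# 32-bit Windows (XP / Win2K default): 4 GB address space, split 50/50.
user_32, kernel_32 = 2 * GB, 2 * GB
assert user_32 + kernel_32 == 2**32

# With the /3GB boot.ini switch: 3 GB user / 1 GB kernel, same 4 GB total.
user_3gb, kernel_3gb = 3 * GB, 1 * GB
assert user_3gb + kernel_3gb == 2**32

# 64-bit Windows of this era: 16 TB of virtual address space, still 50/50.
user_64, kernel_64 = 8 * TB, 8 * TB
assert user_64 + kernel_64 == 16 * TB

print(f"32-bit user space: {user_32 // GB} GB")
print(f"/3GB user space:   {user_3gb // GB} GB")
print(f"64-bit user space: {user_64 // TB} TB")
```

The point worth internalizing: the total is fixed by the processor's address width, so giving the user side more (via /3GB) necessarily squeezes the kernel side, which is why that switch can starve drivers and the system page tables.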
Furthermore, since virtual memory is a combination of physical memory and swapfile space, you need to take care to avoid paging -- or at least minimize it as much as possible. That means understanding the environment your software is going to run in. At a minimum, you should have good documentation explaining your assumptions, estimated system requirements, and projected system impacts.

The first chapter of Bentley's Programming Pearls is a great discussion of careful use of system memory, plus it shows the benefit of making sure you know what problem you're really trying to solve. The story opening the chapter relates a conversation Bentley had with a friend. The short version: what Bentley first thought was a simple sort of a large file on disk turned out to be a horse of a different color. The "real" problem required some creative thought to get around memory limitations.

References:
MSDN: Memory Support and Windows Operating Systems
MSDN: Comparison of 32-bit and 64-bit memory architecture for 64-bit editions of Windows XP and Windows Server 2003
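For the curious: the creative solution Bentley arrives at in that chapter is a bitmap (bit vector) sort, which sorts a large set of distinct, bounded integers using one bit per possible value instead of storing the values themselves -- roughly 1.25MB for ten million seven-digit phone numbers. A minimal sketch in Python (the function and variable names are mine, not Bentley's):

```python
def bitmap_sort(numbers, max_value):
    """Sort distinct non-negative integers < max_value using one bit each.

    Memory cost is max_value / 8 bytes no matter how many numbers
    there are, which is the trick that sidesteps the memory limits
    a conventional in-RAM sort would hit.
    """
    bits = bytearray((max_value + 7) // 8)   # all bits start cleared
    for n in numbers:                        # pass 1: set a bit per value
        bits[n // 8] |= 1 << (n % 8)
    result = []
    for n in range(max_value):               # pass 2: emit set bits in order
        if bits[n // 8] & (1 << (n % 8)):
            result.append(n)
    return result

print(bitmap_sort([9, 2, 7, 4], 10))  # -> [2, 4, 7, 9]
```

The design only works because the chapter's "real" problem had exactly the right constraints: the values were distinct, bounded, and integer-like -- which is Bentley's larger point about understanding the problem before reaching for the obvious algorithm.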

1 comment:

Babu said...

Thank you Mr.FrazzledDad.
I enjoyed your comments and links.
