Uncovering the Secrets of Memory Allocation
The Mysterious Case of the Stack
When a thread starts, a fixed-size contiguous memory block is allocated for the stack. But have you ever wondered how big this stack is and what happens when it reaches its limit? The answer lies in understanding how the stack grows and contracts.
The Stack’s Growth and Contraction
Every time a function is called, the stack grows, and when the function returns, it contracts. Additionally, when a new stack variable is created within a function, the stack grows, and when the variable goes out of scope, it contracts. However, the most common causes of stack overflow are deep recursive calls and/or large automatic variables on the stack.
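To see this behavior, here is a minimal sketch (the function names are purely illustrative) that prints the address of a local variable at two different call depths. On most mainstream platforms the stack grows toward lower addresses, so the address printed by the inner call is typically the smaller one; the exact values are implementation-dependent.

```cpp
#include <iostream>

// Prints the address of a local variable at two call depths to hint at
// how the stack grows with each call and contracts on return.
void inner() {
    int local = 0;
    std::cout << "inner: " << &local << '\n';
}

void outer() {
    int local = 0;
    std::cout << "outer: " << &local << '\n';
    inner();  // the call pushes a new frame, growing the stack
}             // when inner() and outer() return, the stack contracts

int main() {
    outer();
}
```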
Experimenting with the Stack
Let’s write a program to estimate the default stack size on our system. We’ll create a function that recurses infinitely, allocating a 1-kilobyte variable on each call. By printing the current size of the stack, we can estimate its maximum size.
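One way to sketch such a program is shown below. The names and layout are illustrative, and subtracting pointers into unrelated objects is not strictly portable, but it works as an estimate on common platforms; build without optimizations, or the compiler may turn the tail call into a loop and keep the stack from growing.

```cpp
#include <cstddef>
#include <iostream>

// Recurses forever, placing a 1 KiB array on the stack in every call and
// printing the distance from the first stack frame. The program is expected
// to crash with a stack overflow once the stack limit is reached.
void func(std::byte* stack_bottom_addr) {
    std::byte data[1024];                            // 1 KiB of automatic storage per call
    std::cout << stack_bottom_addr - data << '\n';   // rough current stack size in bytes
    func(stack_bottom_addr);                         // recurse until overflow
}

int main() {
    std::byte b;
    func(&b);
}
```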
The Results
When we run the program, it crashes after a short while, having printed thousands of lines with the current size of the stack. The last lines of the output reveal that the maximum stack size on our system is around 8 MB. This is confirmed by running the ulimit command with the -s option, which returns the current setting for the maximum stack size in kilobytes.
Platform Differences
Interestingly, the default stack size varies across platforms. On Windows, it’s usually set to 1 MB, which means a program running fine on macOS might crash due to a stack overflow on Windows if the stack size isn’t correctly configured.
The Importance of Memory Management
This experiment highlights the importance of managing memory efficiently. Running out of stack memory crashes the program, which makes it essential to understand how memory is allocated and, later on, how memory allocators that handle fixed-size allocations can be implemented.
The Heap: A Different Story
The heap, also known as the free store, is where objects with dynamic storage duration live. Unlike the stack, which belongs to a single thread, the heap is shared among all threads in a process, which makes memory management more complicated. The allocation and deallocation pattern for heap memory is arbitrary, which increases the risk of fragmented memory.
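As a rough illustration of how that arbitrary pattern can fragment memory, consider the sketch below. It is conceptual only: whether the blocks actually end up adjacent depends entirely on the allocator, but the general idea holds.

```cpp
#include <new>

// Heap blocks can be freed in any order. Deleting the middle block leaves
// a hole between a and c that a later, larger request cannot reuse - one
// way free memory becomes fragmented over time.
int main() {
    char* a = new char[64];
    char* b = new char[64];
    char* c = new char[64];

    delete[] b;               // leaves a 64-byte hole between a and c
    char* d = new char[128];  // too large for the hole; allocated elsewhere

    delete[] a;
    delete[] c;
    delete[] d;
}
```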
The Battle Against Fragmented Memory
In the next chapter, we’ll explore how to implement a rudimentary memory allocator that handles fixed-size allocations. By viewing the stack as a kind of memory allocator, we’ll see how such an allocator can be implemented efficiently, in a way that guarantees memory never becomes fragmented.