Recently I was analyzing a managed-code application for memory problems. In managed code, a common cause of ballooning memory use is statically rooted objects that are never nulled out after they are no longer needed.
The application I was debugging used binary deserialization to reconstruct a graph of objects. Many of these objects needed some additional context, so a StreamingContext object was created with an additional context state. Here’s a sample of what the code was doing:
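The original sample is not reproduced above, but the pattern looks roughly like this minimal sketch. The LargeContext type and Loader class are hypothetical stand-ins for the application's real context objects:

```csharp
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;

// Hypothetical "additional context" object handed to deserialization
// callbacks. In my case this object was quite large.
class LargeContext
{
    public byte[] Cache = new byte[10 * 1024 * 1024];
}

static class Loader
{
    public static object Load(Stream stream)
    {
        // Wrap the additional context object in a StreamingContext...
        var context = new StreamingContext(
            StreamingContextStates.Other, new LargeContext());

        // ...and pass it to the BinaryFormatter. The formatter retains a
        // reference to this context internally, which is the leak we are
        // about to hunt down.
        var formatter = new BinaryFormatter(null, context);
        return formatter.Deserialize(stream);
    }
}
```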
Everything worked well until I noticed that deserialization would consistently increase the memory footprint (note that, in my case, the additional context object was quite large). I wrote a sample application and analyzed the problem using WinDBG. The sample program is located here.
Open WinDBG and select “Open Executable”. Choose LeakApplication.exe and hit F5. LeakApplication will run, waiting for console input. Select ‘Break’ in WinDBG and type “!dumpheap -type LeakApplication.ContextObject”. You’ll see something like this:
WinDBG is telling us that one LeakApplication.ContextObject may still be in memory. To determine whether it is actually rooted, type “!gcroot 01f6b7b0” (substituting the object address from your own !dumpheap output). Here are the results:
Sure enough, the BinaryFormatter is holding onto the LeakApplication.ContextObject. This is a known leak and, unfortunately, the best you can do is mitigate it. In my case I created a small, disposable property bag object to hold my larger “additional” context objects. After deserialization, simply call Dispose on the property bag and let it null out its members, clipping the object graph. The property bag itself will still leak, but at least the larger object graph becomes eligible for garbage collection.
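A sketch of that mitigation, assuming a hypothetical ContextBag wrapper and SafeLoader class (names are mine, not from the original application), might look like this:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;

// Small, disposable property bag. The BinaryFormatter will still leak
// this object, but it is tiny once Dispose has clipped the graph.
sealed class ContextBag : IDisposable
{
    public object AdditionalContext { get; private set; }

    public ContextBag(object additionalContext)
    {
        AdditionalContext = additionalContext;
    }

    public void Dispose()
    {
        // Null out the reference so the large context object is no
        // longer reachable through the leaked bag and can be collected.
        AdditionalContext = null;
    }
}

static class SafeLoader
{
    public static object Load(Stream stream, object largeContext)
    {
        using (var bag = new ContextBag(largeContext))
        {
            // Put the bag, not the large object itself, into the
            // StreamingContext that the formatter will hold onto.
            var context = new StreamingContext(
                StreamingContextStates.Other, bag);
            var formatter = new BinaryFormatter(null, context);
            return formatter.Deserialize(stream);
        } // Dispose runs here; only the empty bag remains rooted.
    }
}
```

The key design point is that the formatter only ever sees the bag, so the one object it pins is the cheap wrapper rather than the expensive context graph.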