.NET and the Secret Relationship Between Microsoft and RAM Chip Manufacturers

First, read this post to get the context for what I'm talking about. Now, I've never met Mark in person, but I've read bits and pieces of this book (well, the previous edition), and there's no doubt in my mind he understands the core aspects of the OS like few others do. So I'm not going to argue his general point, which is: managed apps take up more memory at startup than their unmanaged counterparts ("The managed Notepad has consumed close to 8 MB of private virtual memory...whereas the native version has used less than 1 MB"). They do. Just writing a WinForms application that does nothing more than show a window will take up 8 MB - at least I've been able to confirm that on my box. So, he's right.

But he also uses Sharp Reader in his discussion of a bloated .NET application, and I don't think that's fair. Sharp Reader is bloated. I used to use it, and it would get to over 100 MB on my box. I couldn't take that anymore. So I tried RSS Bandit, and even though my subscribed feed count has grown (it's currently 485), the currently allocated memory for RSSBandit.exe as I type this is 24 MB. Granted, it sometimes jumps to around 100 MB when it's updating the feeds (and that may be something future versions can improve on), but Sharp Reader's working set never went down - at least I never saw it go down, and that's what forced me to look for another reader. I've also written a WinService that runs on my box that has consistently stayed at 15 MB for over a year.

So what's my point? .NET handles a bunch of services for the developer, which gives rise to a larger working set for any EXE. I know Mark knows this, but I want to emphasize that point because it's a general trade-off you make when you develop with .NET. Personally, as a developer, I like the ease of developing in .NET because, thanks to the garbage collector, I generally don't have to worry about memory leaks. However, to this point, I think .NET developers are also writing crappy applications because what they're really hearing is, "Oh, I don't have to worry about memory leaks anymore - let's make an ArrayList with 100,000 object references and store that as a field reference in my form!" That will kill your application. The garbage collector can't reclaim that memory as long as the field holds the reference, so while the application technically hasn't leaked the memory, it's done some very inefficient things.

Now, could the CLR do things to make memory reclamation more aggressive? In general, I think so. I don't work on the CLR team, so I don't know what's going on under the hood, but from my higher-level perspective as a developer I get the feeling that improvements can be made. I've written test apps where I've made a fairly large memory allocation and then let the reference go out of scope. For some reason, the garbage collector didn't let that memory go anytime soon. I know I could call GC.Collect(), and I know in 2.0 there's GC.AddMemoryPressure() and GC.RemoveMemoryPressure(), but there's a meme in the .NET world that messing with the garbage collector is somewhat verboten, so .NET apps end up with the "appearance" (for lack of a better term) of being bloated. .NET developers need to get the fallacy out of their heads that memory leaks no longer exist in .NET. Application tests (preferably automated) should be run to see where memory pressure seems to increase without resistance.
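To make the "ArrayList as a field reference" problem concrete, here's a minimal C# sketch. The class and member names are hypothetical, and the allocation sizes are just for illustration; the point is that the collector can't reclaim the list while a live field still references it, and the fix is simply dropping the reference when the data is no longer needed:

```csharp
using System;
using System.Collections;

// Hypothetical form-like class illustrating the anti-pattern:
// a large collection stored in a field outlives the work that needed it.
class CacheHappyForm
{
    // 100,000 object references held for the lifetime of the "form."
    private ArrayList _cache = new ArrayList();

    public void LoadEverything()
    {
        for (int i = 0; i < 100000; i++)
        {
            _cache.Add(new byte[1024]); // roughly 100 MB, held indefinitely
        }
    }

    // The fix isn't GC trickery - it's releasing the reference
    // once the data is no longer needed.
    public void DoneWithData()
    {
        _cache = null; // now the collector *can* reclaim the memory
    }
}

class Program
{
    static void Main()
    {
        CacheHappyForm form = new CacheHappyForm();
        form.LoadEverything();
        long before = GC.GetTotalMemory(true); // true = collect before measuring
        form.DoneWithData();
        long after = GC.GetTotalMemory(true);
        // The heap shrinks dramatically only after the field lets go.
        Console.WriteLine(after < before);
    }
}
```

No GC.Collect() call is needed here; once the field is null, the next collection reclaims everything the list held.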
These kinds of tests are not as easy to write as unit tests, but they're not hard either, and there needs to be more emphasis on them. And, believe me, I'm not standing at the pulpit on this; I'm in the congregation. I don't do this as often as I should either. But to prevent .NET from getting tagged as a runtime that consistently yields bloated applications, we all need to be cognizant of every aspect of our applications - not just the ones users notice, like, "I like the pretty round blue buttons."
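As a rough sketch of what such an automated memory test might look like: run the suspect operation in a loop and check that the managed heap comes back near its baseline afterward. DoWork() here is a placeholder for your real workload, and the 1 MB tolerance is an arbitrary number you'd tune per application:

```csharp
using System;

// A minimal memory-growth check. If repeated work grows the heap
// without bound, something is holding references it shouldn't.
class MemoryGrowthTest
{
    // Placeholder workload: allocate and immediately abandon.
    static void DoWork()
    {
        byte[] temp = new byte[64 * 1024];
        temp[0] = 1;
    }

    static void Main()
    {
        DoWork(); // warm up so one-time startup allocations don't skew the baseline
        long baseline = GC.GetTotalMemory(true); // true = collect before measuring

        for (int i = 0; i < 1000; i++)
        {
            DoWork();
        }

        long after = GC.GetTotalMemory(true);
        long growth = after - baseline;

        // Tolerance is arbitrary - tune it for your application.
        Console.WriteLine(growth < 1024 * 1024 ? "PASS" : "FAIL: grew " + growth + " bytes");
    }
}
```

The same pattern drops into a test framework easily enough; the hard part is picking a realistic workload and a tolerance that catches real regressions without flapping.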

* Posted at 04.16.2005 09:56:57 AM CST | Link *

Blog History