In some situations, the large object heap (LOH) becomes heavily fragmented. The fragmentation can be so severe that the LOH occupies more than twice the total size of the allocated large objects.
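The effect can be reproduced with a toy simulation (this is a simplified model of a non-compacting, first-fit heap, not the CLR's actual allocator; the class and method names are made up for illustration). A pattern of steadily growing allocations, where each new object is slightly too big for every existing hole, makes the heap grow without bound even though only one object is alive at a time:

```python
# Toy model of a heap that, like the .NET LOH, never moves objects.
class NonCompactingHeap:
    def __init__(self):
        self.holes = []        # (offset, size) of freed, reusable gaps
        self.end = 0           # total address space the heap occupies
        self.live = {}         # object id -> (offset, size)
        self.next_id = 0

    def alloc(self, size):
        # First-fit: reuse the first hole big enough for the request.
        for i, (off, hole_size) in enumerate(self.holes):
            if hole_size >= size:
                if hole_size == size:
                    self.holes.pop(i)
                else:
                    self.holes[i] = (off + size, hole_size - size)
                break
        else:
            off = self.end
            self.end += size   # no hole fits: grow the heap
        oid = self.next_id
        self.next_id += 1
        self.live[oid] = (off, size)
        return oid

    def release(self, oid):
        # Freed space becomes a hole; it is never compacted away.
        self.holes.append(self.live.pop(oid))

    def live_bytes(self):
        return sum(size for _, size in self.live.values())

# Allocate steadily growing objects, always freeing the previous one.
heap = NonCompactingHeap()
prev = None
for size in range(10, 30):
    obj = heap.alloc(size)
    if prev is not None:
        heap.release(prev)
    prev = obj

print(heap.end, heap.live_bytes())   # 390 29
```

Only 29 units are alive at the end, yet the heap spans 390 units, far beyond the factor of two mentioned above, because no hole can ever be reused or reclaimed.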
The problem is described in detail in this article:
It really takes a large amount of chutzpah to release a product with such a major flaw. It makes it almost impossible to write long-running server processes in .NET. This issue nearly cost my company a contract, since we were experiencing inexplicable OutOfMemoryExceptions in a very important product.
The problem can be solved by
a) treating large objects like normal objects and compacting the LOH during a generation 2 collection or
b) using a classic heap-management algorithm, such as those described by Donald Knuth, to ensure that the total size of the heap never exceeds twice the total number of allocated bytes.
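Option (b) can be sketched with the same kind of toy model (again a simplified illustration in the spirit of Knuth's boundary-tag method, not the algorithm from TAOCP verbatim; all names are hypothetical). The key change is that releasing an object merges adjacent free blocks, so freed space stays reusable:

```python
# Toy free-list heap that coalesces adjacent free blocks on release.
class CoalescingHeap:
    def __init__(self):
        self.holes = []        # sorted (offset, size) gaps
        self.end = 0
        self.live = {}
        self.next_id = 0

    def alloc(self, size):
        # First-fit search through the free list.
        for i, (off, hole_size) in enumerate(self.holes):
            if hole_size >= size:
                if hole_size == size:
                    self.holes.pop(i)
                else:
                    self.holes[i] = (off + size, hole_size - size)
                break
        else:
            off = self.end
            self.end += size
        oid = self.next_id
        self.next_id += 1
        self.live[oid] = (off, size)
        return oid

    def release(self, oid):
        self.holes.append(self.live.pop(oid))
        self.holes.sort()
        # Merge any holes that now touch each other.
        merged = [self.holes[0]]
        for off, size in self.holes[1:]:
            last_off, last_size = merged[-1]
            if last_off + last_size == off:
                merged[-1] = (last_off, last_size + size)
            else:
                merged.append((off, size))
        self.holes = merged

    def live_bytes(self):
        return sum(size for _, size in self.live.values())

# 100 one-unit objects, all freed, then one 50-unit request:
# the merged 100-unit hole absorbs it, so the heap does not grow.
heap = CoalescingHeap()
objs = [heap.alloc(1) for _ in range(100)]
for o in objs:
    heap.release(o)
heap.alloc(50)
print(heap.end, heap.live_bytes())   # 100 50
```

With coalescing, the heap stays at 100 units for 50 live units, within the two-times bound. The non-coalescing model from above would leave 100 one-unit holes, none of which fits the 50-unit request, and would grow the heap to 150 units instead.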
I don't expect Microsoft to do anything about this issue. After all, they are busy adding new language features such as dynamic to C# in order to make it more buzzword-compliant. And they did not do anything about any of the other highly rated feedback items I have submitted. But I think everybody should know about this issue so that they can avoid the .NET platform for important applications.