Misreporting of variable values when debugging x64 code with the Visual Studio 2010 debugger - by Liam Westley

Status: Fixed

This item has been fixed in the current or upcoming version of this product. A more detailed explanation for the resolution of this particular item may have been provided in the comments section.

ID: 655793
Status: Closed
Type: Bug
Repros: 1
Opened: 3/31/2011 2:49:38 PM
Access Restriction: Public


When debugging in Visual Studio 2010 (RTM and SP1), Professional and Ultimate editions, a variable can hold a valid value while the debugger reports the variable's initialisation value instead; for a Decimal, this default value is 0.

We have found that using the null-coalescing operator on a method returning a nullable Decimal value can result in incorrect values being displayed by the debugger when the Platform Target is set to x64.

            Decimal result = GetNullableValue() ?? GetNonNullableValue();

Full code and description, with screenshots, can be found on this blog post.
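A self-contained sketch of the repro follows; the bodies of `GetNullableValue` and `GetNonNullableValue` are assumptions for illustration (only their names and signatures appear in the report; the full code is in the blog post):

```csharp
using System;

class Program
{
    // Hypothetical body: returning null forces ?? to take the right-hand branch.
    static Decimal? GetNullableValue()
    {
        return null;
    }

    // Hypothetical body: an arbitrary non-zero value so the bug is visible.
    static Decimal GetNonNullableValue()
    {
        return 42.5m;
    }

    static void Main()
    {
        // With Platform Target = x64, stepping over this line in the VS2010
        // debugger shows result as 0, even though the store of the real value
        // executes before the next statement is reached.
        Decimal result = GetNullableValue() ?? GetNonNullableValue();

        // By the time this line runs, the debugger display catches up.
        Console.WriteLine(result); // prints the value from GetNonNullableValue()
    }
}
```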

Posted by Microsoft on 8/30/2011 at 11:18 AM
Thanks for reporting this issue, Liam.

We've identified the problem and fixed it. The fix will appear in the C# compiler in the next release of .Net and Visual Studio.

This behavior is occurring because the 64-bit CLR JIT is associating the MSIL store instruction that saves the value of the null-coalescing operation into result with the IL for the statement that follows. The JIT does this because the ?? operator introduces a branch, and both paths of the branch merge back at the store instruction; since the C# compiler does not include a NOP or debug sequence point at this merge point, the JIT cannot decide how to map the native code back to the IL and back to the source. The JIT has a heuristic that maps the store instruction's native code to the following debug sequence point, which happens to be for the next statement.

So what happens in the debugger is that it actually stops before the store instruction is executed when you step over the declaration for result. So result still has its zero-initialized value, which in this case is 0.0m. Stepping over the next statement finally executes that store instruction and then result gets its expected value in the debugger display.

This scenario happens to work fine in the x86 JIT by accident and it also works in the x64 JIT if the struct type being stored into is 64 bits or smaller. Decimal is larger than 64 bits and so we see this behavior instead. Guid also exhibits this behavior for example.
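To illustrate the struct-size point above, a Guid variant of the repro (the helper name and body here are assumptions, not from the report) hits the same heuristic, since Guid is a 128-bit struct:

```csharp
using System;

class GuidRepro
{
    // Hypothetical helper for illustration: null forces ?? down the right branch.
    static Guid? GetNullableGuid()
    {
        return null;
    }

    static void Main()
    {
        // Guid is larger than 64 bits, so on x64 the debugger shows id as
        // Guid.Empty (all zeros) when stepping over this line, until the
        // following statement executes the store.
        Guid id = GetNullableGuid() ?? Guid.NewGuid();
        Console.WriteLine(id);
    }
}
```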

Note there is nothing wrong with the compiled code. It is just the CLR's diagnostic heuristics that aren't working out. The solution we are taking is to make the compiler insert a NOP instruction at the IL merge point. A NOP instruction tells the JIT that there is an implicit debug sequence point at that instruction, and the JIT is then able to correctly map the native code back to source.

Also note this applies to the ?: operator as well, which has also been fixed.
E.g. the following has the same behavior as your example.

            Decimal? decnul = GetNullableValue();
            Decimal result = decnul.HasValue ? decnul.Value : GetNonNullableValue();

Ian Halliday
VB & C# Compiler SDE
Posted by Microsoft on 3/31/2011 at 11:24 PM
Thanks for your feedback.

We are rerouting this issue to the appropriate group within the Visual Studio Product Team for triage and resolution. These specialized experts will follow up on your issue.
Posted by Microsoft on 3/31/2011 at 3:15 PM
Thank you for your feedback; we are currently reviewing the issue you have submitted. If this issue is urgent, please contact support directly (http://support.microsoft.com).