We have a product running on Windows 7 64-bit, built with Visual Studio 2010 Professional, that was passing all of our unit tests. Most of the projects in our solution target .NET Framework 3.5, with some targeting .NET Framework 4.0.

When we installed Visual Studio 2012 Professional side by side with Visual Studio 2010 (both are currently installed on our machines), some of our unit tests began failing, specifically those dealing with floating-point computation. We still build and run our tests in Visual Studio 2010; Visual Studio 2012 is merely installed alongside it. All of this is in managed code (C#).

Comparisons between old stored values and newly computed values are coming out false when they should be true, due to differences in the precision of the computations. The difference shows up at about the 9th or 10th decimal place at times, but we need our values to be accurate to the 12th decimal place. The new values actually seem to be more accurate, with greater decimal precision.

Was anything changed in .NET Framework 4.5, or in bug fixes or optimizations in Visual Studio 2012 Professional, that alters how floating-point values are stored or computed and could be causing these discrepancies in our values? Is there a solution to this problem?
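For illustration, here is a minimal sketch of the kind of comparison that fails; the literal values, the `Epsilon` constant, and the `NearlyEqual` helper are hypothetical, not taken from our actual test suite:

```
using System;

class FloatComparisonDemo
{
    // Tolerance matching the 12-decimal-place accuracy we require (assumption).
    const double Epsilon = 1e-12;

    static bool NearlyEqual(double expected, double actual)
    {
        // Relative tolerance so the check scales with magnitude;
        // the Max(..., 1.0) floor keeps it stable for values near zero.
        double scale = Math.Max(Math.Abs(expected), Math.Abs(actual));
        return Math.Abs(expected - actual) <= Epsilon * Math.Max(scale, 1.0);
    }

    static void Main()
    {
        double stored   = 0.1234567890123456; // hypothetical baseline from earlier runs
        double computed = 0.1234567891123456; // hypothetical new value, differing at
                                              // the 10th decimal place

        Console.WriteLine(stored == computed);            // False: exact equality fails
        Console.WriteLine(NearlyEqual(stored, computed)); // False: ~1e-10 drift exceeds 1e-12
    }
}
```

Note that even a relative-tolerance comparison like this does not rescue us here: the observed drift (around the 9th or 10th decimal place) is larger than the 12-decimal-place accuracy our tests require.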