SQL Server – Do decimal separators behave differently for Money and Decimal data types?

Tags: sql-server, sql-server-2012

I've recently been troubleshooting some problems with a third-party application around regional settings and decimal separators. The details of these problems aren't too relevant, except to say that the application is a web service writing to a SQL Server column with the money data type, and that it appears to multiply the value because a decimal separator is being interpreted as a thousands separator.
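
For illustration, this is the kind of round-trip that can produce such a multiplication: a value formatted with a comma as the decimal separator gets re-parsed under a culture that treats the comma as a group separator. This is only a sketch of the failure mode (nl-NL and en-US are stand-in cultures, not the vendor's actual code):

using System;
using System.Globalization;

class SeparatorMixup
{
    static void Main()
    {
        // Format 1.00 under a culture that uses ',' as the decimal separator.
        string formatted = (1.00m).ToString("0.00", CultureInfo.GetCultureInfo("nl-NL")); // "1,00"

        // Re-parse it under a culture where ',' is the thousands (group) separator.
        decimal reparsed = decimal.Parse(formatted, CultureInfo.GetCultureInfo("en-US"));

        Console.WriteLine(formatted); // 1,00
        Console.WriteLine(reparsed);  // 100 -- the value has been multiplied
    }
}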

This leads me to my question. In the course of my investigations I noticed that if I open SSMS and execute the following, I get "1,00" returned:

declare @money money;
set @money = 1;
select @money;

However, if I execute the following against the same server then I get "1.00" returned:

declare @decimal decimal(18,2);
set @decimal = 1;
select @decimal;

My Windows regional settings here are configured to use a comma as the decimal separator, so on the face of it the money value is displayed correctly and the decimal is not.

What's going on here? It's unclear why the behaviour differs between these two data types. Is this down to SQL Server, or is it caused by the .NET Framework? (The application uses the .NET Framework, and presumably SSMS does so in the same way…)

I found the following ticket, which seems to refer to the same issue but doesn't provide an answer: https://connect.microsoft.com/SQLServer/feedback/details/636074/some-datatypes-doesnt-honor-localization

Best Answer

The ToString() implementation of the SqlDecimal type in System.Data.SqlTypes contains this as the final step in the algorithm:

while (i > 0)
{
    if (i-- == (int)this.m_bScale)
    {
        array3[num2++] = '.';
    }
    array3[num2++] = array2[i];
}
return new string(array3);

Notice the hardcoded '.' in there? SqlMoney does not have this; its ToString() calls a specific format that uses the current CultureInfo.
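
A quick way to see the difference outside SSMS is to format both types directly under a culture whose decimal separator is a comma. This is a minimal sketch (nl-NL is just one example of such a culture):

using System;
using System.Data.SqlTypes;
using System.Globalization;
using System.Threading;

class SqlTypesFormatting
{
    static void Main()
    {
        // Simulate Windows regional settings that use ',' as the decimal separator.
        Thread.CurrentThread.CurrentCulture = CultureInfo.GetCultureInfo("nl-NL");

        var dec = new SqlDecimal(1.00m);
        var mon = new SqlMoney(1.00m);

        Console.WriteLine(dec.ToString()); // "1.00" -- the '.' is hardcoded, culture ignored
        Console.WriteLine(mon.ToString()); // "1,00" -- formatted via the current culture
    }
}

This matches the SSMS observation above: money honours the regional decimal separator, while decimal does not.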