SQL Server – Performance hit of CONVERT from nvarchar to money

sql-server, sql-server-2008

I have a dollar amount that, for historical reasons, has always been stored in an nvarchar field. Now that we are storing many more rows than in the past, and an application that refreshes automatically and frequently uses this field to total up amounts, I'm concerned about performance.

But I'm also concerned about the cost of making a change like this late in the development cycle.

Soon this database will be used by a customer that will generate millions of rows, though not all of them will participate in this calculation. The rows are divided into groups of around 10,000; each group's parent will be processed every 10 minutes or so, which is when these totals will be calculated.

Is it worth changing the column type to money now, including updating the stored procedures, UDTs, data layer, etc.?

It seems like it could have a performance impact, but unfortunately I can't generate enough volume with the resources I have to run realistic performance tests. So I'm hoping someone has experience with string-to-number conversion and can give me an idea of whether doing this conversion on 10,000 rows at a time will be a problem.

Best Answer

The performance cost of the cast is minor. Database cost is usually dominated by I/O, and a cast seldom even shows up on the radar.
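As a rough sketch of what the totaling query looks like (the table and column names here are assumptions for illustration, not from the question):

```sql
-- Hypothetical schema: dbo.LineItems(ParentId int, Amount nvarchar(30)).
DECLARE @ParentId int = 42;

-- The cast adds little CPU next to the I/O of reading ~10,000 rows.
SELECT SUM(CAST(Amount AS money)) AS GroupTotal
FROM dbo.LineItems
WHERE ParentId = @ParentId;
```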

But that is not the problem. The real problem is going to be data purity. If you allow arbitrary nvarchar then, rest assured, non-numeric values will show up in the database. Despite every precaution you take in the app, experience tells us that somewhere down the road someone will somehow insert 'abagft' as the value. Then the casts will start throwing runtime exceptions, and you (or whoever is running the show by then) will curse and curse and curse. You'll try to 'fix' it with a boolean short-circuit in the WHERE clause, just like everybody else, and it won't work, because SQL Server does not guarantee the order in which predicates are evaluated. This road is paved with bodies, trust me.
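To make that concrete (same assumed schema as above): the tempting fix relies on an evaluation order the engine never promises. TRY_CONVERT avoids the exception, but note it only exists from SQL Server 2012 onward, and it trades the error for silent NULLs:

```sql
-- The tempting 'fix': filter out bad rows, then cast. SQL Server gives
-- NO guarantee of predicate evaluation order, so CAST can still run
-- against a row where ISNUMERIC(Amount) = 0 and throw anyway.
SELECT SUM(CAST(Amount AS money))
FROM dbo.LineItems
WHERE ISNUMERIC(Amount) = 1
  AND CAST(Amount AS money) > 0;

-- SQL Server 2012+ only: TRY_CONVERT returns NULL instead of erroring,
-- which avoids the exception but silently drops bad rows from the total.
SELECT SUM(TRY_CONVERT(money, Amount))
FROM dbo.LineItems;
```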

There are also problems related to data type precedence, type inference in dev tools (the field ends up as a C# string) and, most importantly, index seek sargability.
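A sketch of the sargability issue, again with the assumed names: filtering on the numeric meaning of an nvarchar column wraps the column in a conversion, which forces a conversion on every row and prevents an index seek:

```sql
-- Wrapping the column in a conversion makes the predicate non-sargable:
-- an index on Amount must be scanned, converting every row.
SELECT ParentId, Amount
FROM dbo.LineItems
WHERE CAST(Amount AS money) > 100.00;

-- With Amount stored as decimal/money, the equivalent filter
--   WHERE Amount > 100.00
-- can use an index seek.
```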

Use a numeric type for numeric values; DECIMAL is a good choice for money. And the time to change is always before you go live with millions of values. Right now the cost is code changes and the risk of regression. In six months it will be the same code changes and regression risk, plus downtime and a size-of-data update operation.
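A minimal sketch of the migration itself, assuming the same hypothetical schema; on a table already holding millions of rows, the ALTER is exactly the size-of-data update mentioned above:

```sql
-- Find stray non-numeric values first. ISNUMERIC is permissive (it accepts
-- things like '$' and 'E5'), so review the survivors by hand as well.
SELECT Amount
FROM dbo.LineItems
WHERE ISNUMERIC(Amount) = 0;

-- Then change the type; decimal(19,4) covers the same range as money.
-- (NOT NULL assumed here; keep the column's existing nullability.)
ALTER TABLE dbo.LineItems
ALTER COLUMN Amount decimal(19,4) NOT NULL;
```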