I suspect you are simply not checking the length of the column accurately (e.g. inspecting PRINT or SELECT output in Management Studio, which is truncated by the tool and does not accurately reflect what's actually in the database). There is no limit on the number of characters that can be exposed or updated through a view unless you introduce one yourself: perhaps you are using an INSTEAD OF trigger, there is an implicit conversion happening somewhere, or your data is being truncated before the update even reaches SQL Server. It's hard to tell without more details, which I hope you can furnish.
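PRINT in particular truncates long strings (to 8,000 bytes for varchar, if memory serves), even though the variable itself holds the full value; LEN and DATALENGTH report what is actually stored:

```sql
DECLARE @s VARCHAR(MAX) = REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 16000);
SELECT LEN(@s), DATALENGTH(@s); -- the full 16000 characters are there
PRINT @s;                       -- the Messages pane shows far fewer characters
```

The grid output in Management Studio has a similar per-column display cap (configurable under Tools > Options > Query Results), so always verify with LEN/DATALENGTH rather than by eyeballing the output.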
In the meantime, try this:
USE tempdb;
GO
CREATE TABLE dbo.foobar(x VARCHAR(MAX));
GO
CREATE VIEW dbo.v_foobar
WITH SCHEMABINDING
AS
SELECT x FROM dbo.foobar;
GO
INSERT dbo.v_foobar(x) VALUES('c');
GO
UPDATE dbo.v_foobar
SET x = REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 8000)
+ REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 8000);
GO
SELECT LEN(x), DATALENGTH(x)
FROM dbo.v_foobar;
GO
Results:
----- -----
16000 16000
I'm also not convinced that CHAR(34) is the culprit:
UPDATE dbo.v_foobar
SET x = CHAR(34) + 'x'
+ REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 800)
+ CHAR(34) + 'x'
+ REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 800)
+ CHAR(34) + 'x'
+ REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 800)
+ CHAR(34) + 'x'
+ REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 800)
+ CHAR(34) + 'x'
+ REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 800)
+ CHAR(34) + 'x'
+ REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 800)
+ CHAR(34) + 'x'
+ REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 800)
+ CHAR(34) + 'x'
+ REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 800)
+ CHAR(34) + 'x'
+ REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 800)
+ CHAR(34) + 'x'
+ REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 800)
+ CHAR(34) + 'x'
+ REPLICATE(CHAR(34), 400)
+ 'x'
+ REPLICATE(CHAR(34), 400)
+ 'x'
+ REPLICATE(CHAR(34), 400)
+ 'x' + CHAR(34)
+ REPLICATE(CONVERT(VARCHAR(MAX), 'c'), 8000);
GO
SELECT LEN(x), DATALENGTH(x)
FROM dbo.v_foobar;
Results:
----- -----
17226 17226
I suspect that something else is happening to this string as it's being generated or before the command is passed to SQL Server.
Just join the table to itself and compute the appropriate sum.
This query matches each row in the table to all rows with a smaller ID and the same category. You can then easily sum those rows and filter out any row whose running total exceeds your limit.
Using subqueries in the WHERE clause, as you do, might result in many unnecessary query executions if the database engine does not optimize the subqueries into an INNER JOIN:
DECLARE @Limit FLOAT
SELECT @Limit = 30
SELECT
o1.Id
, o1.Cost
, o1.Category
FROM
MyOrders o1
-- Join all orders with same category and inferior ID
INNER JOIN MyOrders o2 ON o2.ID <= o1.ID AND o1.Category = o2.Category
GROUP BY
o1.Id
, o1.Cost
, o1.Category
HAVING
SUM(o2.Cost) < @Limit
ORDER BY
  o1.Id;
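On SQL Server 2012 and later, a windowed SUM avoids the triangular self-join entirely and usually scales much better. A sketch, assuming the same MyOrders table and limit:

```sql
DECLARE @Limit FLOAT = 30;

SELECT Id, Cost, Category
FROM (
    SELECT Id, Cost, Category,
           SUM(Cost) OVER (PARTITION BY Category
                           ORDER BY Id
                           ROWS UNBOUNDED PRECEDING) AS RunningTotal
    FROM MyOrders
) AS t
WHERE t.RunningTotal < @Limit
ORDER BY Id;
```

The `ROWS UNBOUNDED PRECEDING` frame makes each row's RunningTotal the sum of all costs in its category up to and including that row, which is exactly what the self-join computes.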
Best Answer
A trigger would be how I would handle this. It will slow down all of your update statements, since SQL Server now has extra work to do, but it will satisfy your requirement.
Personally, I wouldn't like the idea of an INSTEAD OF trigger that re-applies the update with a smaller row set; that gets into tricky territory of half-completed operations, which could get very messy quickly.
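As a minimal sketch of the AFTER UPDATE approach (the table, column names, and the rule being enforced are all placeholders for whatever your actual requirement is), the trigger can inspect the affected rows and roll back the whole statement when the rule is violated:

```sql
CREATE TRIGGER dbo.tr_MyOrders_EnforceLimit
ON dbo.MyOrders
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Placeholder rule: reject the update if any affected category's
    -- total cost would exceed the limit after the change.
    IF EXISTS (
        SELECT 1
        FROM dbo.MyOrders AS o
        WHERE o.Category IN (SELECT Category FROM inserted)
        GROUP BY o.Category
        HAVING SUM(o.Cost) > 30
    )
    BEGIN
        ROLLBACK TRANSACTION;
        THROW 50000, 'Update rejected: category total would exceed the limit.', 1;
    END
END
```

Because it's an AFTER trigger, the whole update either completes and passes the check or is rolled back in full, so you never end up with the half-completed state the INSTEAD OF approach risks.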