T-SQL – Bitwise operations in T-SQL

t-sql

Take the following code:

DECLARE @a integer
SET @a = 0
SET @a = @a | CAST(1 AS BINARY)      -- OR with 1
SET @a = @a | CAST(100 AS BINARY)    -- OR with 100 (intended here as binary 100)
SELECT CAST(@a AS integer)

Now, shouldn't the output of this code be 5?

I have an integer and set it to 0, so in binary it's all 0s.

Then I OR it with 1, so it becomes 0..001.

Then I OR it with 100, which I intend as binary, so it should become 00..0101, which in decimal is 5.

The output I actually get is 101.
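
For reference, running the ORs directly on integer literals gives the same result, so the casts are not what changes the outcome (a minimal check, assuming SQL Server):

SELECT 0 | 1 | 100   -- returns 101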

EDIT: for more clarity, I'm trying to accomplish with TSQL what the following code does in C++

#include <iostream>
using namespace std;

int main()
{
    int a = 0;
    a = a | 1;          // a is now 0b001 = 1
    cout << a;          // prints 1
    a = a | 0b100;      // binary literal (C++14); a is now 0b101 = 5
    cout << endl << a;  // prints 5
}

Best Answer

The literal 100 in your script is the decimal number one hundred; T-SQL has no binary literal like C++'s 0b100, and CAST(... AS BINARY) stores the value rather than reinterpreting its digits as bits. So the OR being evaluated is 100 | 1:

decimal     binary
    100  =  1100100
      1  =  0000001
    ---     -------  (bitwise OR)
    101  =  1100101
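
If the goal is the C++ behavior, OR with the value the bit pattern represents: binary 100 is decimal 4 (hex 0x04). A minimal sketch, assuming SQL Server 2008 or later for the inline DECLARE initialization:

DECLARE @a int = 0;
SET @a = @a | 1;     -- binary 001
SET @a = @a | 4;     -- binary 100 (the hex literal 0x04 also works)
SELECT @a;           -- returns 5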