SQL Server Data Type Casting
I'm wondering what the performance impact of the following might be, if any. Let's say there is a table with a column typed as int, and the table contains millions and millions of rows. Then let's say the two SQL statements below are run.
SELECT * FROM tblTest WHERE colInt in (3,4)
and
SELECT * FROM tblTest WHERE colInt in ('3','4')
I believe that in the second query the implied character data type will cause all the values in colInt to be converted before the comparison is done. Can anyone confirm that this is how the process works? Does anyone know of an MSDN article that explains it? Is this fairly expensive?
I know this seems quite silly; however, when you are forced to live with it because of third-party software, there isn't much we can do except fully understand what the implications are going to be when they change the data type of several columns from int to varchar and vice versa.
The more likely scenario is that the optimizer will cast the two literals to int, since int has higher data type precedence than varchar.
You should run this against your data to see.
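One way to check, as a sketch (assuming a SQL Server session and using the table and column names from the question), is to compare the plans SQL Server produces for the two predicates:

```sql
-- Sketch only: tblTest and colInt are the names from the question.
-- Show the textual plan for each predicate without executing the queries.
SET SHOWPLAN_TEXT ON;
GO
SELECT * FROM tblTest WHERE colInt IN (3, 4);
SELECT * FROM tblTest WHERE colInt IN ('3', '4');
GO
SET SHOWPLAN_TEXT OFF;
GO
```

If the second plan matches the first (the string literals folded to int constants), the cost is negligible. If instead the predicate shows a CONVERT_IMPLICIT wrapped around the column, the column is being converted row by row, and an index on it can no longer be used for a seek.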
However, be VERY VERY cautious when converting from int to varchar or vice versa. You will get some unexpected behavior, especially with ordering or with inequality operators like < and >:
SELECT 'True'
WHERE 100 < 20
SELECT 'True'
WHERE '100' < '20'
100 is greater than 20, but '100' is less than '20', because varchars are compared character by character from the left, and '1' is less than '2'.
It gets even more complicated if you include leading 0s in your varchars:
SELECT 'True'
WHERE 0100 < 020
SELECT 'True'
WHERE '0100' < '0020'
The second statement is no longer True, because '0020' is less than '0100': the leading '0's tie, and in the second position '0' is less than '1'.
Be VERY VERY careful.
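The same pitfall shows up with ORDER BY. As a quick sketch (the derived-table alias and column name here are made up for illustration):

```sql
-- Numeric strings sort lexicographically, not numerically.
SELECT v
FROM (VALUES ('9'), ('10'), ('100')) AS t(v)  -- t and v are arbitrary names
ORDER BY v;
-- varchar order: '10', '100', '9' -- not the numeric order 9, 10, 100
```

If such a column must stay varchar, ordering numerically requires an explicit cast, e.g. ORDER BY CAST(v AS int), which of course fails if any row holds a non-numeric string.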