Most succinct way to transform a CSV string to a table in T-SQL?
-- Given a CSV string like this:
declare @roles varchar(800)
select @roles = 'Pub,RegUser,ServiceAdmin'
-- Question: How to get roles into a table view like this:
select 'Pub'
union
select 'RegUser'
union
select 'ServiceAdmin'
After posting this, I started playing with some dynamic SQL. This seems to work, but it seems like there might be some security risks in using dynamic SQL - thoughts on this?
declare @rolesSql varchar(800)
select @rolesSql = 'select ''' + replace(@roles, ',', ''' union select ''') + ''''
exec(@rolesSql)
If you're working with SQL Server compatibility level 130 then the STRING_SPLIT function is now the most succinct method available.
Reference link: https://msdn.microsoft.com/en-gb/library/mt684588.aspx
Usage:
SELECT * FROM string_split('Pub,RegUser,ServiceAdmin',',')
RESULT:
value
-----------
Pub
RegUser
ServiceAdmin
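Applied to the @roles variable from the question (a minimal sketch; STRING_SPLIT requires the database compatibility level to be 130 or higher):
declare @roles varchar(800)
select @roles = 'Pub,RegUser,ServiceAdmin'

-- Each delimited element becomes a row in the "value" column
SELECT value FROM STRING_SPLIT(@roles, ',')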
See my answer from here
But basically you would:
Create this function in your DB:
CREATE FUNCTION dbo.Split(@origString varchar(max), @Delimiter char(1))
returns @temptable TABLE (items varchar(max))
as
begin
    declare @idx int
    declare @split varchar(max)

    select @idx = 1
    if len(@origString) < 1 or @origString is null return

    while @idx != 0
    begin
        set @idx = charindex(@Delimiter, @origString)

        if @idx != 0
            set @split = left(@origString, @idx - 1)
        else
            set @split = @origString

        if (len(@split) > 0)
            insert into @temptable(items) values (@split)

        set @origString = right(@origString, len(@origString) - @idx)
        if len(@origString) = 0 break
    end

    return
end
and then call the function and pass in the string you want to split.
Select * From dbo.Split(@roles, ',')
Here's a thorough discussion of your options:
- Arrays and Lists in SQL Server
What I do in this case is use some string replacement to convert the CSV to JSON and then open the JSON like a table. It may not be suitable for every use case, but it is very simple to get running and works with both strings and files. With files you just need to watch your line-break character; mostly I find it to be Char(13)+Char(10).
declare @myCSV nvarchar(MAX)= N'"Id";"Duration";"PosX";"PosY"
"•P001";223;-30;35
"•P002";248;-28;35
"•P003";235;-26;35'
--CSV to JSON
--convert to json by replacing some stuff
declare @myJson nvarchar(MAX)= '[['+ replace(@myCSV, Char(13)+Char(10), '],[' ) +']]'
set @myJson = replace(@myJson, ';',',') -- Optional: ensure comma delimiters for JSON if the current delimiter differs
-- set @myJson = replace(@myJson, ',,',',null,') -- Optional: empty in between
-- set @myJson = replace(@myJson, ',]',',null]') -- Optional: empty before linebreak
SELECT
ROW_NUMBER() OVER (ORDER BY (SELECT 0))-1 AS LineNumber, *
FROM OPENJSON( @myJson )
with (
col0 varchar(255) '$[0]'
,col1 varchar(255) '$[1]'
,col2 varchar(255) '$[2]'
,col3 varchar(255) '$[3]'
,col4 varchar(255) '$[4]'
,col5 varchar(255) '$[5]'
,col6 varchar(255) '$[6]'
,col7 varchar(255) '$[7]'
,col8 varchar(255) '$[8]'
,col9 varchar(255) '$[9]'
--any name column count is possible
) csv
order by (SELECT 0) OFFSET 1 ROWS --hide header row
Using SQL Server's built-in XML parsing is also an option. Of course, this glosses over all the nuances of an RFC 4180-compliant CSV.
-- Given a CSV string like this:
declare @roles varchar(800)
select @roles = 'Pub,RegUser,ServiceAdmin'
-- Here's the XML way
select split.csv.value('.', 'varchar(100)') as value
from (
select cast('<x>' + replace(@roles, ',', '</x><x>') + '</x>' as xml) as data
) as csv
cross apply data.nodes('/x') as split(csv)
If you are using SQL 2016+, using string_split is better, but this is a common way to do this prior to SQL 2016.
Using BULK INSERT you can import a CSV file into your SQL table:
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
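A rough sketch of what that can look like; the staging table, file path, and CSV layout below are assumptions for illustration, not taken from the linked article:
-- Hypothetical staging table matching a one-column CSV file
CREATE TABLE dbo.RolesStaging (RoleName varchar(100));

BULK INSERT dbo.RolesStaging
FROM 'C:\data\roles.csv'      -- assumed file location on the SQL Server machine
WITH (
    FIELDTERMINATOR = ',',    -- column delimiter
    ROWTERMINATOR = '\n',     -- row delimiter
    FIRSTROW = 1              -- use 2 if the file has a header row
);

SELECT RoleName FROM dbo.RolesStaging;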
The accepted answer works fine, but I found this function to be much faster, even for thousands of records. Create the function below and use it.
IF EXISTS (
    SELECT 1
    FROM Information_schema.Routines
    WHERE Specific_schema = 'dbo'
        AND specific_name = 'FN_CSVToStringListTable'
        AND Routine_Type = 'FUNCTION'
)
BEGIN
    DROP FUNCTION [dbo].[FN_CSVToStringListTable]
END
GO

CREATE FUNCTION [dbo].[FN_CSVToStringListTable] (@InStr VARCHAR(MAX))
RETURNS @TempTab TABLE (Id NVARCHAR(max) NOT NULL)
AS
BEGIN
    -- Append a trailing comma and collapse double commas (empty entries)
    SET @InStr = REPLACE(@InStr + ',', ',,', ',')

    DECLARE @SP INT
    DECLARE @VALUE VARCHAR(1000)

    WHILE PATINDEX('%,%', @InStr) <> 0
    BEGIN
        SELECT @SP = PATINDEX('%,%', @InStr)
        SELECT @VALUE = LEFT(@InStr, @SP - 1)
        SELECT @InStr = STUFF(@InStr, 1, @SP, '')

        INSERT INTO @TempTab (Id)
        VALUES (@VALUE)
    END

    RETURN
END
GO
-- Test like this:
declare @v as NVARCHAR(max) = N'asdf,,as34df,234df,fs,,34v,5fghwer,56gfg,';
SELECT Id FROM dbo.FN_CSVToStringListTable(@v)
I was about to use the solution mentioned in the accepted answer, but doing more research led me to use table value types:
These are far more efficient, and you don't need a TVF (table-valued function) just to create a table from a CSV. You can use one directly in your scripts or pass it to a stored procedure as a table-valued parameter. The type can be created as:
CREATE TYPE [UniqueIdentifiers] AS TABLE(
[Id] [varchar](20) NOT NULL
)
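A minimal usage sketch (the split method used to fill the table variable and the dbo.ProcessIds procedure are assumptions for illustration, not part of the original answer):
-- Declare a variable of the table type and fill it from the CSV string
-- (here via STRING_SPLIT, which needs SQL Server 2016+; any split approach above works)
DECLARE @ids UniqueIdentifiers;

INSERT INTO @ids (Id)
SELECT value FROM STRING_SPLIT('Pub,RegUser,ServiceAdmin', ',');

-- Use it directly in your script...
SELECT Id FROM @ids;

-- ...or pass it to a stored procedure as a table-valued parameter.
-- dbo.ProcessIds is hypothetical and would declare the parameter as
-- @ids UniqueIdentifiers READONLY
EXEC dbo.ProcessIds @ids = @ids;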