Write UTF-8 to a SQL Server Text field using ADO.NET and maintain the UTF-8 bytes
I have some XML encoded as UTF-8 and I want to write it to a Text field in SQL Server. UTF-8 is byte-compatible with Text, so it should be possible to store it this way and then read the XML out later still encoded as UTF-8.
However, special characters such as ÄÅÖ, which are multi-byte in UTF-8, get changed on the way.
I have code like this:
byte[] myXML = ...
SqlCommand _MyCommand = new SqlCommand(storeProcedureName, pmiDB.GetADOConnection());
_MyCommand.CommandType = CommandType.StoredProcedure;
_MyCommand.Parameters.Add("xmlText", SqlDbType.Text);
_MyCommand.Parameters["xmlText"].Value = Encoding.UTF8.GetString(myXML);
_MyCommand.ExecuteNonQuery();
My guess is that converting the XML byte array to a string turns the special characters into UTF-16 characters, which are then re-encoded as Latin1 on the way into the database. And the Latin1 bytes for ÖÄÅ are not the same as the UTF-8 bytes for ÖÄÅ.
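For example, I think this little test (my own sketch, not the real data path, and assuming the column's collation maps to Latin1) reproduces the loss:
using System;
using System.Text;

class Repro
{
    static void Main()
    {
        byte[] utf8Bytes = Encoding.UTF8.GetBytes("ÄÅÖ");                        // UTF-8: C3-84-C3-85-C3-96
        string text = Encoding.UTF8.GetString(utf8Bytes);                        // .NET string, UTF-16 in memory
        byte[] latin1Bytes = Encoding.GetEncoding("ISO-8859-1").GetBytes(text);  // Latin1: C4-C5-D6

        // The two byte sequences differ, so reading the Text column back as UTF-8 breaks.
        Console.WriteLine(BitConverter.ToString(utf8Bytes));
        Console.WriteLine(BitConverter.ToString(latin1Bytes));
    }
}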
How can I write the UTF-8 XML bytes to the Text field without them getting changed?
Define your column as NText or NVarchar
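For example (a sketch only, building on the code in the question and assuming the stored procedure parameter is changed to NVARCHAR(MAX)):
_MyCommand.Parameters.Add("xmlText", SqlDbType.NVarChar, -1);            // -1 means NVARCHAR(MAX)
_MyCommand.Parameters["xmlText"].Value = Encoding.UTF8.GetString(myXML); // stored as UTF-16
// If you still need UTF-8 bytes when reading back, call Encoding.UTF8.GetBytes on the returned string.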
The solution I got to work was to change the stored procedure so that the myXML parameter was VARBINARY(MAX), which allowed me to pass in the byte array directly. In the stored procedure I then CAST the VARBINARY(MAX) to VARCHAR(MAX), which preserves the bytes as required for UTF-8:
SET @myXMLText = CAST(@myXMLBinary AS VARCHAR(MAX))
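On the C# side that looks roughly like this (a sketch reusing the names from the question; myXMLBinary is assumed to be the stored procedure's VARBINARY(MAX) parameter):
// Pass the raw UTF-8 bytes as VARBINARY(MAX); no string conversion, so no re-encoding.
SqlCommand _MyCommand = new SqlCommand(storeProcedureName, pmiDB.GetADOConnection());
_MyCommand.CommandType = CommandType.StoredProcedure;
_MyCommand.Parameters.Add("myXMLBinary", SqlDbType.VarBinary, -1); // -1 means VARBINARY(MAX)
_MyCommand.Parameters["myXMLBinary"].Value = myXML;                // the original byte[]
_MyCommand.ExecuteNonQuery();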
If you want to store raw UTF-8, use a binary type, because character columns aren't stored as UTF-8: TEXT uses the collation's code page and NTEXT/NVARCHAR are stored as UTF-16.
If it's XML and you're on SQL Server 2005 and up - use the XML column type! It's faster, it's more compact than VARCHAR(MAX) or NVARCHAR(MAX), and you can associate it with an XML schema to validate that only valid XML is stored... only benefits!
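A sketch of how to pass it from C#, reusing the command from the question (I'm assuming an XML-typed parameter called xmlDoc; SqlXml hands the parser the original bytes):
// requires: using System.Data.SqlTypes; using System.IO; using System.Xml;
_MyCommand.Parameters.Add("xmlDoc", SqlDbType.Xml).Value =
    new SqlXml(XmlReader.Create(new MemoryStream(myXML)));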
If you can't use the XML column type for whatever reason, then please at least drop TEXT in favor of VARCHAR(MAX) or NVARCHAR(MAX)! TEXT/NTEXT is deprecated and will go away - plus, with (N)VARCHAR(MAX) you get all the usual string functions, too, which don't work on TEXT/NTEXT.