Re-encode URL from UTF-8 to ISO-8859-1
I have file:// links containing non-English characters that are URL-encoded as UTF-8. For these links to work in a browser, I have to re-encode them.
file://development/H%C3%A5ndplukket.doc
becomes
file://development/H%e5ndplukket.doc
I have the following code which works:
public string ReEncodeUrl(string url)
{
    Encoding enc = Encoding.GetEncoding("iso-8859-1");
    string[] parts = url.Split('/');
    for (int i = 1; i < parts.Length; i++)
    {
        parts[i] = HttpUtility.UrlDecode(parts[i]);      // Decode to string
        parts[i] = HttpUtility.UrlEncode(parts[i], enc); // Re-encode to Latin-1
        parts[i] = parts[i].Replace('+', ' ');           // Change + to [space]
    }
    return string.Join("/", parts);
}
Is there a cleaner way of doing this?
I think that's pretty clean actually. It's readable and you said it functions correctly. As long as the implementation is hidden from the consumer, I wouldn't worry about squeezing out that last improvement.
If you are doing this operation excessively (say, hundreds of executions per event), I would think about pulling the work out of UrlEncode/UrlDecode and streaming them into each other, which removes the need for the string split/join and might give a performance improvement there. Testing would have to prove that out, and it definitely wouldn't be "clean" :-) A rough sketch of the idea follows.
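Roughly what that single-pass idea might look like (illustrative only, not benchmarked; the method name is made up, it assumes well-formed %XX escapes, and it needs System.Text and System.Collections.Generic):

public static string ReEncodeUrlSinglePass(string url)
{
    Encoding latin1 = Encoding.GetEncoding("iso-8859-1");
    var output = new StringBuilder(url.Length);
    var pending = new List<byte>(); // bytes from consecutive %XX escapes

    void Flush()
    {
        if (pending.Count == 0) return;
        // Interpret the collected bytes as UTF-8, then re-escape them as Latin-1
        string text = Encoding.UTF8.GetString(pending.ToArray());
        foreach (byte b in latin1.GetBytes(text))
            output.AppendFormat("%{0:x2}", b);
        pending.Clear();
    }

    for (int i = 0; i < url.Length; i++)
    {
        if (url[i] == '%' && i + 2 < url.Length)
        {
            pending.Add(Convert.ToByte(url.Substring(i + 1, 2), 16));
            i += 2;
        }
        else
        {
            Flush();
            output.Append(url[i]); // '/', letters, etc. pass through untouched
        }
    }
    Flush();
    return output.ToString();
}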
While I don't see any real way of changing it that would make a difference, shouldn't the +-to-space replace happen before you UrlEncode, so it turns into %20?
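For what it's worth, HttpUtility.UrlEncode emits '+' for a space regardless of where the replace sits, so one way to actually end up with %20 is to substitute it after encoding. A minimal sketch of the original method adjusted that way (the method name is made up; everything else mirrors the question's code):

public string ReEncodeUrlWithPercent20(string url)
{
    Encoding enc = Encoding.GetEncoding("iso-8859-1");
    string[] parts = url.Split('/');
    for (int i = 1; i < parts.Length; i++)
    {
        string decoded = HttpUtility.UrlDecode(parts[i]);
        // UrlEncode turns a space into '+', so swap it for %20 afterwards
        parts[i] = HttpUtility.UrlEncode(decoded, enc).Replace("+", "%20");
    }
    return string.Join("/", parts);
}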
Admittedly ugly and not really an improvement, but you could re-encode the whole thing (avoiding the split/iterate/join) and then .Replace("%2f", "/"). A sketch of that is below.
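A sketch of that whole-string approach (the method name is made up). One caveat: UrlEncode will also escape the ':' after the scheme, so that has to be put back as well, and the same +/space question from the comment above applies here too:

public string ReEncodeUrlWholeString(string url)
{
    Encoding enc = Encoding.GetEncoding("iso-8859-1");
    string decoded = HttpUtility.UrlDecode(url);            // %C3%A5 -> å
    string reEncoded = HttpUtility.UrlEncode(decoded, enc); // å -> %e5, but also / -> %2f and : -> %3a
    return reEncoded.Replace("%2f", "/").Replace("%3a", ":");
}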
I don't understand why the code wants to keep a literal space in the final result; it seems like you don't end up with something that's actually encoded if it still has spaces in it?