Python's string.translate() doesn't fully work?
Given this example, I get the error that follows:
print u'\2033'.translate({2033:u'd'})
C:\Python26\lib\encodings\cp437.pyc in encode(self, input, errors)
10
11 def encode(self,input,errors='strict'):
---> 12 return codecs.charmap_encode(input,errors,encoding_map)
13
14 def decode(self,input,errors='strict'):
UnicodeEncodeError: 'charmap' codec can't encode character u'\x83' in position 0
Try this instead:
>>> print u'\u2033'.translate({0x2033:u'd'})
d
Since you used u'\2033' instead of u'\u2033', the result was two characters: u'\203' + u'3'. Trying to print this raised an exception because your terminal's encoding (cp437, per the traceback) doesn't support the character u'\203' (which is the same as u'\x83').
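You can confirm this by inspecting the literal directly; a minimal check:

```python
# u'\2033' is parsed as the octal escape \203 (0x83 = 131) followed by
# the ordinary character '3', so it is a two-character string.
s = u'\2033'
print(len(s))            # 2
print(s[0] == u'\x83')   # True
print(len(u'\u2033'))    # 1 -- \u2033 is a single code point
```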
Also note the difference between 2033 and 0x2033 in the dictionary: the \uxxxx escape sequence takes its value in hexadecimal, so you'd need the key 0x2033 to match it.
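To see the decimal/hexadecimal mismatch concretely: 0x2033 is 8243 in decimal, so a dictionary key of 2033 can never match that code point:

```python
prime = u'\u2033'  # DOUBLE PRIME, code point U+2033
print(ord(prime))  # 8243, i.e. 0x2033

# Key 2033 (decimal) does not equal 0x2033, so nothing is replaced:
print(prime.translate({2033: u'd'}) == prime)   # True -- unchanged

# Key 0x2033 matches the code point, so the replacement happens:
print(prime.translate({0x2033: u'd'}))          # d
```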
Regarding your question's title, string.translate (the translate function in the string module) doesn't support a dictionary as a parameter, but calling .translate on the unicode string itself (as you did in the question body) works.
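For reference, a short sketch of the dictionary forms the unicode .translate method accepts (keys are code points; values may be a replacement string, a replacement code point, or None to delete the character):

```python
s = u'a\u2033b'

# Replace the code point with a string:
print(s.translate({0x2033: u'd'}))   # adb

# Replace it with another code point (0x27 is an apostrophe):
print(s.translate({0x2033: 0x27}))   # a'b

# Map it to None to delete the character entirely:
print(s.translate({0x2033: None}))   # ab
```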