Can't use unichr in Python 3.1
I've been looking through the Python Cookbook (2nd Edition) to learn how to process strings and characters.
I wanted to try converting a number into its Unicode equivalent. So I tried using the built-in function called 'unichr', which, according to the Cookbook, goes something like:
>>> print repr(unichr(8224))
... and will output:
u'\u2020'
However, the code failed. I thought it had something to do with print (since Python 3 uses print() instead of the print statement), but changing that didn't help either. I tried several variations of the code, and it still failed. At last, I just typed a simple line:
unichr(10000)
To my surprise, this error message kept popping up, no matter what value I put into the above function:
NameError: name 'unichr' is not defined
What could be the problem? Is there some specific module that I'm supposed to import?
In Python 3, you just use chr:
>>> chr(10000)
'✐'
In Python 3, there's no difference between Unicode and normal strings anymore; the only distinction is between Unicode strings and binary data. So the developers removed the unichr function in favor of a single chr, which now does what the old unichr did. See the documentation here.
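For example, a quick interactive check (output shown for a UTF-8 terminal; Python 3's repr displays printable non-ASCII characters directly, while ascii() reproduces the escaped form the Cookbook shows):
>>> chr(8224)
'†'
>>> print(ascii(chr(8224)))   # the old repr(unichr(8224)) style output
'\u2020'
>>> ord(chr(8224))            # ord() is still the inverse of chr()
8224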
If you need code that runs under both Python 2 and Python 3, you can use this compatibility shim (on Python 3, where the name unichr is undefined, it simply becomes an alias for chr):
try:
    unichr            # exists on Python 2
except NameError:     # Python 3: fall back to chr
    unichr = chr
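With the shim in place, a line like the following should pass under either interpreter (note that u'' literals are accepted again since Python 3.3):
assert unichr(10000) == u'\u2710'   # same result on Python 2 and Python 3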
Python 3.x doesn't have a special Unicode string type/class; every string is a Unicode string. So... I'd try chr. It should give you what unichr did pre-3.x. Can't test, sadly.
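It was untested in the original answer, but on a Python 3 interpreter the asker's two calls should translate directly:
>>> print(repr(chr(8224)))
'†'
>>> chr(10000)
'✐'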
This is a better way to do it, since it works on both Python 2 and 3:
# Python 2 and 3:
from builtins import chr
assert chr(8364) == '€'
You may need to pip install future if you get an error; the builtins module is provided by the future package on Python 2.
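A slightly fuller sketch of the same approach (assuming the future package is installed when running under Python 2; the variable names here are just for illustration):
# Works unchanged on Python 2 (with `future` installed) and Python 3
from builtins import chr

euro = chr(8364)             # U+20AC EURO SIGN as a Unicode string on both versions
assert euro == u'\u20ac'
assert ord(euro) == 8364     # round-trips through ord()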