UTF-8 and CGI::Ajax
I have a CGI script pulling search terms from an HTML form through CGI::Ajax. When I run Encode::Detect::Detector over the resulting string in my script, I get, instead of just UTF-8, a variety of encodings depending on the characters entered in the form: Greek characters turn up as UTF-8, umlauts as windows-1252, and no output at all for plain ASCII. I am fairly sure the problem lies with whatever CGI::Ajax does to the string it passes to Perl; it uses decodeURI() somewhere in its code. I have tried URI::Escape and Encode in all possible permutations, but none of my attempts to normalise the different strings to a single encoding were successful. As it is, one or the other set of non-ASCII characters (either the umlauts or the Greek) will always be garbled. How do I tell Ajax to keep it Unicode?
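For context, a minimal sketch of the kind of detection check described above; the parameter name 'query' is an assumption, not taken from the original script:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;
    use Encode::Detect::Detector;

    my $cgi  = CGI->new;
    my $term = $cgi->param('query') // '';

    # detect() guesses the charset of a byte string (e.g. "UTF-8",
    # "windows-1252") and returns undef for plain ASCII input.
    my $charset = Encode::Detect::Detector::detect($term);

    print $cgi->header(-type => 'text/plain', -charset => 'UTF-8');
    print 'Detected: ', (defined $charset ? $charset : 'ASCII or undetermined'), "\n";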
Solved: CGI::Ajax apparently uses JavaScript's escape function, which does not handle Unicode correctly. That function has been superseded by encodeURI and encodeURIComponent, which can be set as the default escaping function for a CGI::Ajax object $pjx like so: $pjx->js_encode_function('encodeURIComponent');. Phew.
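A minimal sketch of that fix, with the handler decoding the incoming UTF-8 octets via Encode on the Perl side; the exported function name do_search and the handler are illustrative, not from the original post:

    use strict;
    use warnings;
    use CGI;
    use CGI::Ajax;
    use Encode qw(decode);

    sub search_handler {
        my ($raw) = @_;
        # CGI has already percent-unescaped the value, so $raw holds UTF-8
        # octets; decode them into a Perl character string before using them.
        my $term = decode('UTF-8', $raw);
        return "You searched for: $term";
    }

    my $cgi = CGI->new;
    my $pjx = CGI::Ajax->new(do_search => \&search_handler);

    # Use encodeURIComponent instead of the legacy escape() on the JS side.
    $pjx->js_encode_function('encodeURIComponent');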
The above answer didn't work for me; however, when I changed my call from:
print $pjx->build_html( $cgi, \&Show_HTML);
to
print $pjx->build_html( $cgi, \&Show_HTML, {-charset=>'UTF-8'});
UTF-8 characters started displaying correctly.
Source: http://search.cpan.org/~bpederse/CGI-Ajax-0.707/lib/CGI/Ajax.pm
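Putting the two answers together, a minimal sketch that sets encodeURIComponent and passes -charset to build_html; Show_HTML and the form markup here are placeholders, not the asker's actual code:

    use strict;
    use warnings;
    use CGI;
    use CGI::Ajax;

    sub Show_HTML {
        # A text field wired to the exported do_search function: on each
        # keystroke the value of #query is sent to Perl and the result is
        # written into #result.
        return <<'HTML';
    <html><head><meta charset="utf-8"><title>Search</title></head><body>
    <form onsubmit="return false;">
      <input type="text" id="query" onkeyup="do_search(['query'], ['result']);">
    </form>
    <div id="result"></div>
    </body></html>
    HTML
    }

    my $cgi = CGI->new;
    my $pjx = CGI::Ajax->new(do_search => sub { "You typed: $_[0]" });

    $pjx->js_encode_function('encodeURIComponent');
    print $pjx->build_html($cgi, \&Show_HTML, { -charset => 'UTF-8' });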