
How to convert an NSString to hex values

I'd like to convert a regular NSString into an NSString with the (what I assume are) ASCII hex values and back.

I need to produce the same output that the Java methods below do, but I can't seem to find a way to do it in Objective-C. I've found some examples in C and C++ but I've had a hard time working them into my code.

Here are the Java methods I'm trying to reproduce:

/**
 * Encodes the given string by using the hexadecimal representation of its UTF-8 bytes.
 *
 * @param s The string to encode.
 * @return The encoded string.
 */
public static String utf8HexEncode(String s) {
    if (s == null) {
        return null;
    }
    byte[] utf8;
    try {
        utf8 = s.getBytes(ENCODING_UTF8);
    } catch (UnsupportedEncodingException x) {
        throw new RuntimeException(x);
    }
    return String.valueOf(Hex.encodeHex(utf8));
}

/**
 * Decodes the given string by using the hexadecimal representation of its UTF-8 bytes.
 *
 * @param s The string to decode.
 * @return The decoded string.
 * @throws Exception If an error occurs.
 */
public static String utf8HexDecode(String s) throws Exception {
    if (s == null) {
        return null;
    }
    return new String(Hex.decodeHex(s.toCharArray()), ENCODING_UTF8);
}

Update: Thanks to drawnonward's answer, here's the method I wrote to create the hex NSStrings. It gives me an "Initialization discards qualifiers from pointer target type" warning on the char declaration line, but it works.

- (NSString *)stringToHex:(NSString *)string
{
    char *utf8 = [string UTF8String];
    NSMutableString *hex = [NSMutableString string];
    while ( *utf8 ) [hex appendFormat:@"%02X" , *utf8++ & 0x00FF];

    return [NSString stringWithFormat:@"%@", hex];
}

Haven't had time to write the decoding method yet. When I do, I'll edit this to post it for anyone else interested.

Update 2: So the method I posted above actually doesn't output what I'm looking for. Instead of outputting hex values in 0-f format, it was outputting all numbers. I finally got back to working on this problem and was able to write a category for NSString that exactly duplicates the Java methods I posted. Here it is:

//
//  NSString+hex.h
//  Created by Ben Baron on 10/20/10.
//

@interface NSString (hex) 

    + (NSString *) stringFromHex:(NSString *)str;
    + (NSString *) stringToHex:(NSString *)str;

@end

//
//  NSString+hex.m
//  Created by Ben Baron on 10/20/10.
//

#import "NSString+hex.h"

@implementation NSString (hex)

+ (NSString *) stringFromHex:(NSString *)str 
{   
    NSMutableData *stringData = [[[NSMutableData alloc] init] autorelease];
    unsigned char whole_byte;
    char byte_chars[3] = {'\0','\0','\0'};
    int i;
    for (i=0; i < [str length] / 2; i++) {
        byte_chars[0] = [str characterAtIndex:i*2];
        byte_chars[1] = [str characterAtIndex:i*2+1];
        whole_byte = strtol(byte_chars, NULL, 16);
        [stringData appendBytes:&whole_byte length:1]; 
    }

    return [[[NSString alloc] initWithData:stringData encoding:NSASCIIStringEncoding] autorelease];
}

+ (NSString *) stringToHex:(NSString *)str
{   
    NSUInteger len = [str length];
    unichar *chars = malloc(len * sizeof(unichar));
    [str getCharacters:chars];

    NSMutableString *hexString = [[NSMutableString alloc] init];

    for(NSUInteger i = 0; i < len; i++ )
    {
        [hexString appendString:[NSString stringWithFormat:@"%x", chars[i]]];
    }
    free(chars);

    return [hexString autorelease];
}

@end


A short way to convert an NSString containing hexadecimal digits into a numeric value:

NSMutableString *tempHex = [[NSMutableString alloc] init];
[tempHex appendString:@"0xD2D2D2"];

unsigned colorInt = 0;
[[NSScanner scannerWithString:tempHex] scanHexInt:&colorInt];

lblAttString.backgroundColor = UIColorFromRGB(colorInt);

The macro used in this code is:

#define UIColorFromRGB(rgbValue) \
    [UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 \
                    green:((float)((rgbValue & 0xFF00) >> 8))/255.0 \
                     blue:((float)(rgbValue & 0xFF))/255.0 alpha:1.0]


For these lines of Java

utf8 = s.getBytes(ENCODING_UTF8);
new String(decodedHexString, ENCODING_UTF8);

Objective-C equivalents would be

utf8 = [s UTF8String];
[[NSString alloc] initWithUTF8String:decodedHexString];

To make an NSString with the hexadecimal representation of a character string:

NSMutableString *hex = [NSMutableString string];
while ( *utf8 ) [hex appendFormat:@"%02X" , *utf8++ & 0x00FF];

You will have to make your own decodeHex function. Just pull two characters out of the string and, if they are valid, add a byte to the result.
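For illustration, here is a minimal decode sketch along those lines (the method name hexToString: is just an example; it assumes the input has even length, contains only valid hex digits, and encodes UTF-8 bytes):

- (NSString *)hexToString:(NSString *)hex
{
    // Walk the string two characters at a time, convert each pair to a byte,
    // and collect the bytes into an NSData buffer.
    NSMutableData *data = [NSMutableData dataWithCapacity:[hex length] / 2];
    char pair[3] = {'\0', '\0', '\0'};
    for (NSUInteger i = 0; i + 1 < [hex length]; i += 2) {
        pair[0] = (char)[hex characterAtIndex:i];
        pair[1] = (char)[hex characterAtIndex:i + 1];
        unsigned char byte = (unsigned char)strtol(pair, NULL, 16);
        [data appendBytes:&byte length:1];
    }
    // Interpret the accumulated bytes as UTF-8.
    return [[[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding] autorelease];
}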


There is a problem with your stringToHex method - it drops leading 0s, and ignores 00s. Just as a quick fix, I made the below:

+ (NSString *) stringToHex:(NSString *)str
{   
    NSUInteger len = [str length];
    unichar *chars = malloc(len * sizeof(unichar));
    [str getCharacters:chars];

    NSMutableString *hexString = [[NSMutableString alloc] init];

    for(NSUInteger i = 0; i < len; i++ )
    {
        // [hexString [NSString stringWithFormat:@"%02x", chars[i]]]; /*previous input*/
        [hexString appendFormat:@"%02x", chars[i]]; /*EDITED PER COMMENT BELOW*/
    }
    free(chars);

    return [hexString autorelease];
}


Thanks to all who contributed on this thread. It was a great help to me. Since things have moved on a little since the original post, here's my updated implementation for iOS 6. I went with the categories approach, but chose to split the load between NSData and NSString. Comments welcomed.

First, the NSString half, which handles decoding a hex encoded string into an NSData object.

@implementation NSString (StringToHexData)

//
// Decodes an NSString containing hex encoded bytes into an NSData object
//
- (NSData *) stringToHexData
{
    int len = [self length] / 2;    // Target length
    unsigned char *buf = malloc(len);
    unsigned char *whole_byte = buf;
    char byte_chars[3] = {'\0','\0','\0'};

    int i;
    for (i=0; i < [self length] / 2; i++) {
        byte_chars[0] = [self characterAtIndex:i*2];
        byte_chars[1] = [self characterAtIndex:i*2+1];
        *whole_byte = strtol(byte_chars, NULL, 16);
        whole_byte++;
    }

    NSData *data = [NSData dataWithBytes:buf length:len];
    free( buf );
    return data;
}
@end

The changes were mostly for efficiency's sake: some simple old-fashioned pointer arithmetic means I could allocate the whole buffer in one go, and populate it byte by byte. Then the whole thing is passed to NSData in one go.

The encoding part, in NSData, looks like this:

@implementation NSData (DataToHexString)

- (NSString *) dataToHexString
{
    NSUInteger          len = [self length];
    char *              chars = (char *)[self bytes];
    NSMutableString *   hexString = [[NSMutableString alloc] init];

    for(NSUInteger i = 0; i < len; i++ )
        [hexString appendString:[NSString stringWithFormat:@"%0.2hhx", chars[i]]];

    return hexString;
}
@end

Again, some minor changes, though I suspect no efficiency gains here. The use of "%0.2hhx" solved all the problems of missing leading zeros and ensures that only a single byte is output at a time.
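For example, a round trip with these two categories might look like this (assuming ARC and that the category headers are imported; the strings are just illustrative):

NSString *original = @"Hello";

// Encode: get the UTF-8 bytes, then hex-encode them.
NSString *encoded = [[original dataUsingEncoding:NSUTF8StringEncoding] dataToHexString];   // "48656c6c6f"

// Decode: parse the hex pairs back into bytes, then interpret them as UTF-8.
NSString *decoded = [[NSString alloc] initWithData:[encoded stringToHexData]
                                          encoding:NSUTF8StringEncoding];                  // "Hello"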

Hope this helps the next person taking this on!


One possible solution:

+(NSString*)hexFromStr:(NSString*)str
{
    NSData* nsData = [str dataUsingEncoding:NSUTF8StringEncoding];
    const char* data = [nsData bytes];
    NSUInteger len = nsData.length;
    NSMutableString* hex = [NSMutableString string];
    for(int i = 0; i < len; ++i)[hex appendFormat:@"%02X", data[i]];
    return hex;
}


So, first off, I would like to thank drawnonward for his answer. This gave me the first function, mean and clean. In the same spirit, I wrote the other one. Hope you like it.

@synthesize value = _value; // property declared elsewhere as: unsigned char *value

- (NSString*) hexString
{
    _value[CONSTANT]= '\0';
    unsigned char* ptr= _value;

    NSMutableString* hex = [[NSMutableString alloc] init];
    while ( *ptr ) [hex appendFormat:@"%02x", *ptr++ & 0x00FF];

    return [hex autorelease];
}

- (void) setHexString:(NSString*)hexString
{
    _value[CONSTANT]= '\0';
    unsigned char* ptr= _value;

    for (const char* src= [hexString cStringUsingEncoding:NSASCIIStringEncoding];
         *src;
         src+=2)
    {
        unsigned int hexByte;
        /*int res=*/ sscanf(src,"%02x",&hexByte);
        *ptr++= (unsigned char)(hexByte & 0x00FF);
    }
    *ptr= '\0';
}


My input was a base-10 digit string, and the output should be its hex representation as a string. Examples:

  • @"10" -> @"A"
  • @"1128" -> @"468"
  • @"1833828235" -> @"6D4DFF8B"

Implementation:

+ (NSString *) stringToHex:(NSString *)str
{
    NSInteger result = [str integerValue];
    NSString *hexStr = (result) ? @"" : @"0";

    while (result != 0) {
        NSInteger remainder = result % 16;

        if (remainder >= 0 && remainder <= 9) {
            hexStr = [[NSString stringWithFormat:@"%ld", (long)remainder] stringByAppendingString:hexStr];
        } else if (remainder == 10) {
            hexStr = [@"A" stringByAppendingString:hexStr];
        } else if (remainder == 11) {
            hexStr = [@"B" stringByAppendingString:hexStr];
        } else if (remainder == 12) {
            hexStr = [@"C" stringByAppendingString:hexStr];
        } else if (remainder == 13) {
            hexStr = [@"D" stringByAppendingString:hexStr];
        } else if (remainder == 14) {
            hexStr = [@"E" stringByAppendingString:hexStr];
        } else {
            hexStr = [@"F" stringByAppendingString:hexStr];
        }

        result /= 16;
    }

    return hexStr;
}
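As an aside, if the value fits in a long, the same output can presumably be produced with a format specifier:

// Sketch of a one-line alternative: format the parsed integer as uppercase hex.
NSString *hexStr = [NSString stringWithFormat:@"%lX", (long)[str integerValue]];   // @"1833828235" -> @"6D4DFF8B"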


Perhaps you should use NSString's dataUsingEncoding: to encode and initWithData:encoding: to decode. Depends on where you are getting the data from.
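A minimal sketch of that approach, assuming the bytes are UTF-8 and using illustrative variable names:

// Encode: NSString -> NSData containing the UTF-8 bytes (hex-encode those bytes as needed).
NSData *utf8Bytes = [someString dataUsingEncoding:NSUTF8StringEncoding];

// Decode: NSData of UTF-8 bytes -> NSString.
NSString *decoded = [[NSString alloc] initWithData:utf8Bytes
                                          encoding:NSUTF8StringEncoding];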
