
Encrypt with openssl and decrypt on iPhone with AES 128, ecb mode

Update: I found the solution. I will update this question soon with the actual working code and command.


A client is encrypting a file server-side with C++, and I need to decrypt it in an iPhone application.

My client can encrypt and decrypt on his side, and so can I on the iPhone, but neither of us can decrypt a file encrypted by the other. I saw many related questions on SO, but none helped me find an implementation that works the same way on both sides.

I want to produce some sample values that we will both accept as the reference for a common implementation.

I tried to encrypt a file with openssl and decrypt it with Cocoa, but couldn't.

Here is what I use for encryption:

echo "123456789ABCDEFG" | openssl enc -aes-128-ecb -nosalt -K "41414141414141414141414141414141" -iv 0 > hello.txt.bin

Adding the -p option to the openssl call shows that the expected key and IV are used:

key=41414141414141414141414141414141
iv =00000000000000000000000000000000

And for cocoa decryption (in an NSData category):

- (NSData *)AESDecryptWithKey:(NSString *)key {

    char keyPtr[kCCKeySizeAES128+1]; // room for terminator (unused)
    bzero(keyPtr, sizeof(keyPtr)); // fill with zeroes (for padding)

    // fetch key data
    [key getCString:keyPtr maxLength:sizeof(keyPtr) encoding:NSUTF8StringEncoding];

    NSUInteger dataLength = [self length];

    //See the doc: For block ciphers, the output size will always be less than or 
    //equal to the input size plus the size of one block.
    //That's why we need to add the size of one block here
    size_t bufferSize = dataLength + kCCBlockSizeAES128;
    void *buffer = malloc(bufferSize);

    size_t numBytesEncrypted = 0;

    char iv[32];
    for (int i = 0; i < 32; i++) {
        iv[i] = 0;
    }

    CCCryptorStatus cryptStatus = CCCrypt(kCCDecrypt, kCCAlgorithmAES128, kCCOptionECBMode | kCCOptionPKCS7Padding,
                                          keyPtr, kCCKeySizeAES128,
                                          iv, //"00000000000000000000000000000000" /* initialization vector (optional) */,
                                          [self bytes], dataLength, /* input */
                                          buffer, bufferSize, /* output */
                                          &numBytesEncrypted);

    if (cryptStatus == kCCSuccess) {
        //the returned NSData takes ownership of the buffer and will free it on deallocation
        return [NSData dataWithBytesNoCopy:buffer length:numBytesEncrypted];
    }

    free(buffer); //free the buffer;
    return nil;
}

called this way:

- (void)testBinaryFileDecryption {
    NSString *databasePath = [[NSBundle mainBundle] pathForResource:@"hello" ofType:@"txt.bin"];

    NSData *data = [NSData dataWithContentsOfFile:databasePath];
    NSAssert(nil != data, @"Encrypted data, freshly loaded from file should not be nil");

    NSData *plain = [data AESDecryptWithKey:@"AAAAAAAAAAAAAAAA"];
    NSAssert(nil != plain, @"Decrypted plain data should not be nil");

    NSLog(@"Result: '%@'", [[NSString alloc] initWithData:plain encoding:NSASCIIStringEncoding]);
}

The log shows: Result: '4¨µ¢Ä½Pk£N

What option am I forgetting? Is the encoding of the returned NSData something other than NSASCIIStringEncoding?


I know nothing about iPhone development, but looking at this code, it appears you're trying to use the ASCII form of the hex encoding of the actual key to decrypt the data. OpenSSL's enc requires the hex encoding because it converts the hex into bytes. Converted directly to ASCII, your actual key looks more like this:

["\037", " ", "!", "\"", "#", "$", "%", "&", "'", "\036", "\037", " ", "!", "\"", "#", "$"]

(All that might be obtuse. If you were to encode the string you're using for decrypting into the same format that OpenSSL enc accepts, the key would be 3331333233333334333533363337333833393330333133323333333433353336.)

Try giving OpenSSL a key specification of 41414141414141414141414141414141 and using AAAAAAAAAAAAAAAA in your iPhone code.
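
To make that concrete, here is a minimal, hypothetical Objective-C helper (not part of the question's code) that decodes a hex key string, the format that openssl enc -K expects, into the raw key bytes CommonCrypto wants:

#import <Foundation/Foundation.h>

// Hypothetical helper (not from the original post): decode a hex key string,
// as passed to openssl enc -K, into the raw bytes CommonCrypto expects.
static NSData *KeyDataFromHexString(NSString *hex) {
    NSMutableData *data = [NSMutableData dataWithCapacity:hex.length / 2];
    for (NSUInteger i = 0; i + 1 < hex.length; i += 2) {
        unsigned int byte = 0;
        NSScanner *scanner =
            [NSScanner scannerWithString:[hex substringWithRange:NSMakeRange(i, 2)]];
        if (![scanner scanHexInt:&byte]) return nil;
        uint8_t b = (uint8_t)byte;
        [data appendBytes:&b length:1];
    }
    return data;
}

int main(void) {
    @autoreleasepool {
        NSData *key = KeyDataFromHexString(@"41414141414141414141414141414141");
        // The 16 decoded bytes are 0x41 repeated, i.e. the same bytes as the
        // ASCII string "AAAAAAAAAAAAAAAA".
        NSLog(@"%@", [[NSString alloc] initWithData:key encoding:NSASCIIStringEncoding]);
    }
    return 0;
}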

Also, I strongly suggest your initial tests be made with data that is exactly N*16 bytes long. OpenSSL enc uses PKCS#5 padding (unless you use -nopad), and your iPhone code is using PKCS#7 padding. On a cursory glance at RFCs, they seem to be the same padding mechanism, but I could be wrong.
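
A cheap sanity check along the same lines (my addition, not from the original code): any real AES output is a whole number of 16-byte blocks, so a file that fails this test was mangled before padding even enters the picture.

#import <Foundation/Foundation.h>
#import <CommonCrypto/CommonCryptor.h>

// Hypothetical check: AES ciphertext is always a multiple of the 16-byte
// block size; anything else was truncated or is not AES output at all.
static BOOL IsWholeNumberOfAESBlocks(NSData *ciphertext) {
    return ciphertext.length > 0 && ciphertext.length % kCCBlockSizeAES128 == 0;
}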

And I know you're just trying things out here, but in real production code, please do not use ECB mode.
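
For what it's worth, a CBC-mode variant of the decryption call would drop kCCOptionECBMode and pass a real 16-byte IV that both sides agree on. A rough sketch, assuming a 16-byte key and IV supplied as NSData (the function itself is hypothetical, not code from the question):

#import <Foundation/Foundation.h>
#import <CommonCrypto/CommonCryptor.h>

// Hypothetical CBC-mode decryption: without kCCOptionECBMode, CommonCrypto
// uses its default CBC mode, and the IV actually matters.
static NSData *AESDecryptCBC(NSData *ciphertext, NSData *key, NSData *iv) {
    size_t bufferSize = ciphertext.length + kCCBlockSizeAES128;
    void *buffer = malloc(bufferSize);
    size_t decryptedLength = 0;

    CCCryptorStatus status = CCCrypt(kCCDecrypt, kCCAlgorithmAES128,
                                     kCCOptionPKCS7Padding,
                                     key.bytes, kCCKeySizeAES128,   // 16-byte key assumed
                                     iv.bytes,                      // 16-byte IV assumed
                                     ciphertext.bytes, ciphertext.length,
                                     buffer, bufferSize,
                                     &decryptedLength);
    if (status == kCCSuccess) {
        // NSData takes ownership of buffer and frees it on deallocation.
        return [NSData dataWithBytesNoCopy:buffer length:decryptedLength];
    }
    free(buffer);
    return nil;
}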


I'm using Crypt::OpenSSL::AES to encrypt files that are decrypted in my iOS app, which decrypts using CommonCryptor.

    cryptStatus = CCCrypt(kCCDecrypt, kCCAlgorithmAES128, 0,
                          keyPtr, kCCKeySizeAES256,
                          IVECTOR /* initialization vector (optional) -- was NULL*/,
                          [self bytes], dataLength, /* input */
                          buffer, bufferSize, /* output */
                          &numBytesDecrypted);

To initialize the IVECTOR I'm using bzero, the same way the question zeroes its key buffer:

bzero(keyPtr, sizeof(keyPtr)); // fill with zeroes (for padding)
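
The IV buffer itself is then just 16 zero bytes, for example (a hypothetical sketch of my own, matching an all-zero IV on the OpenSSL side):

#import <CommonCrypto/CommonCryptor.h>
#include <strings.h>
#include <stdint.h>

// Hypothetical sketch: a kCCBlockSizeAES128 (16) byte buffer of zero bytes,
// i.e. an all-zero IV, cleared with bzero before being handed to CCCrypt.
static const uint8_t *ZeroIV(void) {
    static uint8_t iv[kCCBlockSizeAES128];
    bzero(iv, sizeof(iv));
    return iv;
}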

To encrypt with OpenSSL in Perl I do this:

my $cipher = Crypt::CBC->new( -key    => $key,
                              -literal_key => 1,
                              -header => 'none',
                              -iv =>     '0000000000000000',
                              -cipher => 'Crypt::OpenSSL::AES' );

OpenSSL seems to accept the '0000000000000000' IV as the same thing as ASCII 0 (null) characters. That seems plausible in retrospect, but it required a lot of hair pulling, because every crypto failure looks like every other crypto failure: garbage out.
