Problem with strchr
I can't figure out why the following bit of C code doesn't work:
int obtainStringLength(char* str, char c1, char c2) {
    char* firstOcurrence = strchr(str, c1);
    char* endOcurrence = strchr(str, c2);
    return 2 + (endOcurrence - firstOcurrence) / sizeof(char*);
}
The idea is to find how many characters are between c1 and c2:
printf("%d\n", obtainStringLength("abc def ghi", 'a', 'i')); //should yield 11
Unfortunately, this is always printing 1. What is the problem? Shouldn't strchr work like C#'s string.IndexOf()?
Division by sizeof(char*)? That's incorrect: the result of subtracting two pointers is a numerical value (a ptrdiff_t) that counts elements, not a pointer or a difference of raw addresses, so there is nothing to divide by.
There's also an off-by-one error in calculating the length, so that last line should look like:
return 1 + (endOcurrence - firstOcurrence);
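Putting both fixes together, a corrected version might look like the sketch below. The NULL checks, the -1 return convention, and the int cast are additions for safety, not part of the original snippet:

#include <string.h>

int obtainStringLength(char* str, char c1, char c2) {
    char* firstOcurrence = strchr(str, c1);
    char* endOcurrence = strchr(str, c2);
    if (firstOcurrence == NULL || endOcurrence == NULL)
        return -1; /* assumed convention: -1 when either character is missing */
    /* pointer subtraction already counts characters, so no division is needed */
    return 1 + (int)(endOcurrence - firstOcurrence);
}

With this version, obtainStringLength("abc def ghi", 'a', 'i') yields 11 as expected.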
Your return statement has several problems, all due to a misunderstanding of pointer arithmetic. Pointer subtraction already divides by the element size, and char* was the wrong type anyway: each character occupies exactly sizeof(char) bytes, not sizeof(char*) bytes, and sizeof(char) is, by definition, 1, so you can omit it. You should also be adding 1, not 2:
return 1 + (endOcurrence - firstOcurrence);
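To make the implicit division concrete, here is a small standalone illustration (my example, not from the answer above): subtracting two int pointers yields the number of elements between them, even though the addresses differ by a multiple of sizeof(int).

#include <stdio.h>

int main(void) {
    int arr[5] = {10, 20, 30, 40, 50};
    int* p = &arr[4];
    int* q = &arr[1];
    /* Prints 3 (elements), not 12 (bytes): the division by sizeof(int) is implicit. */
    printf("%td\n", p - q);
    return 0;
}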
No, strchr() returns a pointer to (the address of) the character being sought, or NULL if the character was not found.
That's very different from IndexOf().
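If you do want IndexOf-style behaviour, one common idiom is to subtract the start of the string from the returned pointer; the helper below is a hypothetical sketch, not part of the answers above.

#include <stddef.h>
#include <string.h>

/* Zero-based index of c in str, or -1 if c does not occur (hypothetical helper). */
ptrdiff_t indexOf(const char* str, char c) {
    const char* p = strchr(str, c);
    return p ? (p - str) : -1;
}

For example, indexOf("abc def ghi", 'i') evaluates to 10.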