Can you make sense of this C pointer code?
I have this snippet of C code that uses pointers in a very confusing way.
// We first point to a specific location within an array..
double* h = &H[9*i];
int line1 = 2*n*i;
int line2 = line1+6;
// ..and then access elements using that pointer, somehow..
V[line1+0]=h[0]*h[1];
V[line1+1]=h[0]*h[4] + h[3]*h[1];
What's happening here? How do I write something equivalent in C#?
You don't really write something equivalent in C#, because you don't have pointers there (except in unsafe
code): to get an element from a C# array, you use an array reference and an index.
You can, of course, do the same with a C array, so we convert the C pointer arithmetic into C array indexing:
int h_index = 9 * i;     // h in the original points at H[9*i], so h[x] becomes H[h_index + x]
int line1 = 2 * n * i;
int line2 = line1 + 6;   // unused in the two lines shown
V[line1 + 0] = H[h_index] * H[h_index + 1];
V[line1 + 1] = H[h_index] * H[h_index + 4] + H[h_index + 3] * H[h_index + 1];
And then we have something that can be used pretty much verbatim in C#.
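To make that concrete, here is a minimal C# sketch of the same two assignments. It assumes H and V are double[] arrays and i and n are ints with the same meaning as in the question; the method name FillV is just a placeholder:

// Minimal C# sketch, assuming H and V are double[] and i, n carry the
// same meaning as in the original C code. FillV is a hypothetical name.
static void FillV(double[] H, double[] V, int i, int n)
{
    int hIndex = 9 * i;      // corresponds to: double* h = &H[9*i];
    int line1 = 2 * n * i;
    int line2 = line1 + 6;   // carried over from the snippet, unused in these two lines

    V[line1 + 0] = H[hIndex] * H[hIndex + 1];
    V[line1 + 1] = H[hIndex] * H[hIndex + 4] + H[hIndex + 3] * H[hIndex + 1];
}

The only real change from the C version is that the pointer offset becomes an explicit index variable.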
In C, &H[9*i] == (H + 9*i), thus you can replace uses of h[x] with H[9*i + x]. The rest should be straightforward.