How to initialise an unsafe pointer in C# and convert it to a byte[]?
I put a post up yesterday, How does one create structures for C# originally written in C++.
Thank you for your responses.
I'm trying, without much success, to use DeviceIOControl on an ARM platform running WinCE 6.0 and the .NET Compact Framework 2.0. All I am trying to achieve is control of a port pin, and it's proving to be a nightmare.
The following is the PInvoke declaration:
[DllImport("coredll.dll", EntryPoint = "DeviceIoControl", SetLastError = true)]
internal static extern bool DeviceIoControlCE(int hDevice,
int dwIoControlCode,
byte[] lpInBuffer,
int nInBufferSize,
byte[] lpOutBuffer,
int nOutBufferSize,
ref int lpBytesReturned,
IntPtr lpOverlapped);
The PInvoke declaration suggests a byte[] may be passed to it simply. Surely it's an easy matter to write the values to each member of a structure, convert it to an array of bytes and pass it to the dll.
I have the following:
[StructLayout(LayoutKind.Sequential)]
public struct pio_desc
{
    unsafe byte* pin_name;      //Length???
    public uint pin_number;     //4 bytes
    public uint default_value;  //4 bytes
    public byte attribute;      //1 byte
    public uint pio_type;       //4 bytes
}
and
pio_desc PA13 = new pio_desc();
So surely now it's a matter of doing something like:
PA13.pin_number = AT91_PIN_PA13;  //Length 4 bytes
PA13.default_value = 0;           //Length 4 bytes
PA13.attribute = PIO_DEFAULT;     //Length 1 byte
PA13.pio_type = PIO_OUTPUT;       //Length 4 bytes
and to convert (pin_number for example) to a byte[]:
byte[] temp = BitConverter.GetBytes(PA13.pin_number); //uints are 4 bytes wide
byteArray[++NumberOfChars] = temp[0];
byteArray[++NumberOfChars] = temp[1];
byteArray[++NumberOfChars] = temp[2];
byteArray[++NumberOfChars] = temp[3]; //Will need to check endianness
Questions:
In the structure PA13, how do I initialise the unsafe pointer pin_name? The author of the driver notes that this is not used, presumably by his driver. Will Windows need this to be some value?
PA13.pin_name = ??????
Then, how do I convert this pointer to bytes so that it fits into my byte[] array to be passed to DeviceIOControl?
I've become quite disappointed and frustrated at how difficult it is to change the voltage level of a port pin - I've been struggling with this problem for days now. Because I come from a hardware background, I think it's going to be easier (and less elegant) for me to implement IO control on another controller and pass control data to it via a COM port.
Thanks again for any (simple) assistance.
You will need to do a few different things here. First, replace this member:
unsafe byte* pin_name; //Length???
with:
[MarshalAs(UnmanagedType.LPStr)] public string pin_name;
Then replace the in/out buffers in the P/Invoke declaration from byte[] to IntPtr. Then you can use this code to convert the data:
pio_desc PA13;
// Set the members of PA13...
IntPtr ptr = IntPtr.Zero;
try {
    int size = Marshal.SizeOf(PA13);
    ptr = Marshal.AllocHGlobal(size);
    Marshal.StructureToPtr(PA13, ptr, false);
    // Your P/Invoke call goes here.
    // size will be the "nInBufferSize" argument
    // ptr will be the "lpInBuffer" argument
} finally {
    if (ptr != IntPtr.Zero) {
        Marshal.DestroyStructure(ptr, typeof(pio_desc));
        Marshal.FreeHGlobal(ptr);
    }
}
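To make the shape of that call concrete, here is a rough sketch of what could go at the "// Your P/Invoke call goes here" point, assuming the lpInBuffer/lpOutBuffer parameters of DeviceIoControlCE have been changed to IntPtr as described; hDevice and IOCTL_SET_PIN are placeholders for whatever your driver actually uses:
int bytesReturned = 0;
// IOCTL_SET_PIN is a placeholder - use the control code your driver's header defines.
bool ok = DeviceIoControlCE(hDevice, IOCTL_SET_PIN,
    ptr, size,              // lpInBuffer / nInBufferSize: the marshalled structure
    IntPtr.Zero, 0,         // no output buffer expected
    ref bytesReturned, IntPtr.Zero);
if (!ok) {
    // Valid because the declaration sets SetLastError = true.
    int error = Marshal.GetLastWin32Error();
}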
You can make this a lot easier by lying in the [DllImport] declaration. Just declare the lpInBuffer argument as the structure type; the P/Invoke marshaller will convert it to a pointer anyway. Thus:
[DllImport("coredll.dll", EntryPoint = "DeviceIoControl", SetLastError = true)]
internal static extern bool SetOutputPin(IntPtr hDevice,
int dwIoControlCode,
ref pio_desc lpInBuffer,
int nInBufferSize,
IntPtr lpOutBuffer,
int nOutBufferSize,
out int lpBytesReturned,
IntPtr lpOverlapped);
Use IntPtr for lpOutBuffer because the driver probably doesn't return anything, and pass IntPtr.Zero for it. The same idea applies to the structure: if the field isn't used, simply declare it as an IntPtr:
[StructLayout(LayoutKind.Sequential)]
public struct pio_desc
{
    public IntPtr pin_name;     // Leave at IntPtr.Zero
    public uint pin_number;     //4 bytes
    public uint default_value;  //4 bytes
    public byte attribute;      //1 byte
    public uint pio_type;       //4 bytes
}
Be careful about the Pack value on the StructLayout attribute; it matters here because of the byte-sized field. You may need Pack = 1, but that's just a guess without knowing anything about the driver. If you have working C code, test the value of sizeof(pio_desc) there and compare it with Marshal.SizeOf(). Pass Marshal.SizeOf(typeof(pio_desc)) as the nInBufferSize argument. If you had posted the C declarations, this would have been easier to answer accurately.
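Putting it together, a minimal sketch of the whole call might look like this. hDevice and IOCTL_SET_PIN are placeholders for whatever CreateFile and the driver's header actually give you, and Pack = 1 should only be added to the StructLayout attribute if the size comparison above says it is needed:
pio_desc PA13 = new pio_desc();
PA13.pin_name = IntPtr.Zero;       // not used by the driver
PA13.pin_number = AT91_PIN_PA13;
PA13.default_value = 0;
PA13.attribute = PIO_DEFAULT;
PA13.pio_type = PIO_OUTPUT;

int bytesReturned;
bool ok = SetOutputPin(hDevice, IOCTL_SET_PIN,
    ref PA13, Marshal.SizeOf(typeof(pio_desc)),
    IntPtr.Zero, 0,
    out bytesReturned, IntPtr.Zero);
if (!ok) {
    int error = Marshal.GetLastWin32Error();
}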
Declare lpInBuffer and lpOutBuffer as IntPtr. Initialize them using Marshal.AllocHGlobal (don't forget to release them with Marshal.FreeHGlobal at the end). Fill these buffers and read them back using the various Marshal.Copy overloads.
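For example, a rough sketch of that pattern, assuming you already have the input bytes in a managed array and expect a known number of bytes back (the sizes here are purely illustrative):
byte[] inData = new byte[16];   // fill with the structure's bytes
int outSize = 16;               // illustrative output size

IntPtr inBuf = Marshal.AllocHGlobal(inData.Length);
IntPtr outBuf = Marshal.AllocHGlobal(outSize);
try {
    Marshal.Copy(inData, 0, inBuf, inData.Length);   // managed -> unmanaged
    // Call DeviceIoControlCE here, passing inBuf/outBuf and their sizes.
    byte[] outData = new byte[outSize];
    Marshal.Copy(outBuf, outData, 0, outSize);       // unmanaged -> managed
} finally {
    Marshal.FreeHGlobal(inBuf);
    Marshal.FreeHGlobal(outBuf);
}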