Problem calculating byte length when sending data over sockets
I'm writing a client/server application. It works well when connecting over the loopback address, but when connecting over the LAN or the internet I run into a very strange problem that I just can't figure out.
All data sent between the server and client is wrapped in a class called 'DataPacket'. Each packet is serialized to binary so that it can be sent over a socket.
Before a packet is sent, a 4-byte header is attached containing the byte length of the packet, so the receiver knows how many bytes to expect before it can deserialize. The problem is that at some point the receiver thinks it is waiting for far more data than the length that was actually sent in the header. Sometimes this figure is in the millions of bytes, which is causing serious problems.
This is how the data is sent:
public override void Send(DataPacket packet)
{
    byte[] content = packet.Serialize();
    byte[] header = BitConverter.GetBytes(content.Length);

    // Prefix the serialized packet with its 4-byte length.
    List<Byte> bytes = new List<Byte>();
    bytes.AddRange(header);
    bytes.AddRange(content);

    if (socket.Connected)
    {
        socket.BeginSend(bytes.ToArray(), 0, bytes.Count, SocketFlags.None, OnSendPacket, socket);
    }
}
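For reference, OnSendPacket is omitted above; a minimal version would just complete the asynchronous send, roughly:
private void OnSendPacket(IAsyncResult result)
{
    // Complete the asynchronous send started by BeginSend.
    Socket client = (Socket)result.AsyncState;
    int bytesSent = client.EndSend(result);
    // This sketch assumes the whole header + payload was accepted in one call.
}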
On the server side, each client is assigned a StateObject - this is a class that holds the user's socket, its receive buffer, and any data received so far for a packet that is not yet complete.
This is how the StateObject class looks:
public class StateObject
{
    public const int HeaderSize = 4;
    public const int BufferSize = 8192;

    public Socket Socket { get; private set; }
    public List<Byte> Data { get; set; }
    public byte[] Header { get; set; }
    public byte[] Buffer { get; set; }

    public StateObject(Socket sock)
    {
        Socket = sock;
        Data = new List<Byte>();
        Header = new byte[HeaderSize];
        Buffer = new byte[BufferSize];
    }
}
This is how the StateObject class is used:
private void OnClientConnect(IAsyncResult result)
{
    Socket serverSock = (Socket)result.AsyncState;
    Socket clientSock = serverSock.EndAccept(result);

    StateObject so = new StateObject(clientSock);
    so.Socket.BeginReceive(so.Buffer, 0, StateObject.BufferSize, SocketFlags.None, OnReceiveData, so);
}
When data is received from a client, EndReceive is called. If StateObject.Data is empty, it is assumed that this is the first part of a new packet, so the header is read from the start of the buffer. If the number of bytes received so far is less than the size given in the header, I call BeginReceive again, re-using the same StateObject.
Here is the code for OnReceiveData:
private void OnReceiveData(IAsyncResult result)
{
    StateObject state = (StateObject)result.AsyncState;
    int bytes = state.Socket.EndReceive(result);

    if (bytes > 0)
    {
        state.ProcessReceivedData(bytes);

        if (state.CheckDataOutstanding())
        {
            //Wait until all data has been received.
            WaitForData(state);
        }
        else
        {
            ThreadPool.QueueUserWorkItem(OnPacketReceived, state);

            //Continue receiving data from client.
            WaitForData(new StateObject(state.Socket));
        }
    }
}
As can be seen above, I first call state.ProcessReceivedData() - this is where I strip the header off the data and then copy the remaining bytes out of the buffer:
public void ProcessReceivedData(int byteCount)
{
    int offset = 0;

    if (Data.Count == 0)
    {
        //This is the first part of the message so get the header.
        System.Buffer.BlockCopy(Buffer, 0, Header, 0, HeaderSize);
        offset = HeaderSize;
    }

    byte[] temp = new byte[byteCount - offset];
    System.Buffer.BlockCopy(Buffer, offset, temp, 0, temp.Length);
    Data.AddRange(temp);
}
I then call CheckDataOutstanding() to compare the number of bytes received with the size in the header. If these match then I can safely deserialize the data:
public bool CheckDataOutstanding()
{
    int totalBytes = BitConverter.ToInt32(Header, 0);
    int receivedBytes = Data.Count;
    int outstanding = totalBytes - receivedBytes;

    return outstanding > 0;
}
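For completeness, WaitForData (called above) isn't shown either; it just re-issues BeginReceive on the state's socket so the next chunk lands in the same buffer, roughly:
private void WaitForData(StateObject state)
{
    // Queue another asynchronous read into this state's buffer.
    state.Socket.BeginReceive(state.Buffer, 0, StateObject.BufferSize, SocketFlags.None, OnReceiveData, state);
}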
The problem is that somehow the value read from the header ends up different from the value that was sent. This seems to happen randomly, sometimes on the server and other times on the client. Because of the randomness, and the general difficulty of debugging both server and client across the network, I'm finding it nearly impossible to track down.
Is there something obvious or stupid that I'm missing here?
What happens if you get only part of the header? That is, suppose you got three bytes in the first read instead of four or more? Your ProcessReceivedData function will need to account for that and not try to extract the header size until you have at least four bytes.
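One way to handle that, as a rough, untested sketch against your existing StateObject fields, is to append everything to Data first and only read the length prefix once at least HeaderSize bytes have accumulated:
public void ProcessReceivedData(int byteCount)
{
    // Buffer the raw bytes; don't assume the 4-byte header arrived in one read.
    byte[] temp = new byte[byteCount];
    System.Buffer.BlockCopy(Buffer, 0, temp, 0, byteCount);
    Data.AddRange(temp);
}

public bool CheckDataOutstanding()
{
    // Not even the length prefix is complete yet.
    if (Data.Count < HeaderSize)
        return true;

    // Data now holds the header followed by the payload received so far.
    int totalBytes = BitConverter.ToInt32(Data.GetRange(0, HeaderSize).ToArray(), 0);
    int receivedBytes = Data.Count - HeaderSize;
    return totalBytes - receivedBytes > 0;
}
The deserialization step then has to skip the first HeaderSize bytes of Data before handing the payload to DataPacket.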