receiving data from socket after losing WiFi network
I have a client/server application. The client runs on an iPhone, the server on a Windows-based machine. First, I establish a connection to my server and that works fine. When I move out of the WiFi area, the client disconnects (that's expected).
But when I come back into the WiFi area and try to connect to the server again, the connection is established yet the client doesn't receive any data. I have to restart my client application to get a working connection to the server.
Why does this happen? It always occurs when I'm sending data in both directions at about 1 Mbit/s.
I'll be thankful for any ideas =)
Thanks
When you leave the WiFi area, TCP detects that you have no connection, so the server will drop the TCP connection. You have to re-establish it when you get back into WiFi range.
I am not an expert in this area, but maybe there is a time limit within which, if you re-enter the WiFi area, the existing TCP connection is reused?
However, to be robust, you should always code your app to retry if operating over WiFi is one of your main scenarios; a minimal reconnect sketch follows.
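Here is a minimal sketch of that retry logic on the iPhone side using Apple's Network framework. The host, port, and 2-second back-off are placeholder assumptions, not the poster's actual values.

```swift
import Foundation
import Network

// Minimal reconnecting TCP client sketch (host, port, and back-off are placeholders).
final class ReconnectingClient {
    private var connection: NWConnection?
    private let host = NWEndpoint.Host("server.example.com")   // assumed server address
    private let port = NWEndpoint.Port(rawValue: 9000)!        // assumed server port
    private let queue = DispatchQueue(label: "client.network")

    func start() {
        let conn = NWConnection(host: host, port: port, using: .tcp)
        connection = conn

        conn.stateUpdateHandler = { [weak self] state in
            guard let self = self else { return }
            switch state {
            case .ready:
                self.receiveLoop()
            case .waiting(let error):
                // No usable route yet (e.g. WiFi is gone); the framework keeps retrying.
                print("waiting for network: \(error)")
            case .failed(let error):
                // The connection is dead: drop it and schedule a fresh one.
                print("connection failed: \(error)")
                self.restart()
            default:
                break
            }
        }
        conn.start(queue: queue)
    }

    private func restart() {
        connection?.cancel()
        queue.asyncAfter(deadline: .now() + 2) { [weak self] in self?.start() }
    }

    private func receiveLoop() {
        connection?.receive(minimumIncompleteLength: 1, maximumLength: 64 * 1024) { [weak self] data, _, isComplete, error in
            if let data = data, !data.isEmpty {
                print("received \(data.count) bytes")
            }
            if error != nil || isComplete {
                // Read failed or the peer closed: reconnect instead of waiting forever.
                self?.restart()
                return
            }
            self?.receiveLoop()
        }
    }
}
```

The key point is that every failed state and every receive error ends in restart(), so a stale connection from before the WiFi drop is never reused.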
Your TCP connection has gone into a long retransmission timeout, trying to push data across a link that no longer exists. You need to detect this happening, close the socket, and reconnect; the keepalive sketch below is one way to make that detection faster.
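One way to avoid the long stall is to let TCP itself probe the idle link. This sketch enables keepalive through NWProtocolTCP.Options in Apple's Network framework; the timing values and the host/port are illustrative assumptions, not recommendations.

```swift
import Network

// Enable TCP keepalive so a dead WiFi link is reported as .failed within seconds
// instead of stalling through a long retransmission timeout.
// The timing values below are illustrative only.
let tcpOptions = NWProtocolTCP.Options()
tcpOptions.enableKeepalive = true
tcpOptions.keepaliveIdle = 5        // seconds of silence before the first probe
tcpOptions.keepaliveInterval = 2    // seconds between probes
tcpOptions.keepaliveCount = 3       // unanswered probes before the connection fails
tcpOptions.connectionTimeout = 10   // give up on a connect attempt after 10 seconds

let params = NWParameters(tls: nil, tcp: tcpOptions)
let connection = NWConnection(host: "server.example.com", port: 9000, using: params)
// When the probes go unanswered, stateUpdateHandler reports .failed,
// which is the cue to close and reconnect as described above.
```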
Or switch to UDP if you don't actually need TCP's reliability (although it isn't its proper name, think of UDP as the 'Unreliable Datagram Protocol': there is no guarantee of delivery, nor of delivery in the correct order). A short UDP sketch follows.
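If the data can tolerate loss (e.g. live sensor or media samples), a UDP version looks like this; host and port are placeholders, and any retransmission or ordering you still need has to be handled by the application.

```swift
import Foundation
import Network

// UDP sketch: each send is an independent datagram, so there is no
// connection state to get stuck after the WiFi drops and comes back.
let udp = NWConnection(host: "server.example.com", port: 9000, using: .udp)
udp.stateUpdateHandler = { state in print("udp state: \(state)") }
udp.start(queue: .main)

let payload = Data("hello".utf8)
udp.send(content: payload, completion: .contentProcessed({ error in
    if let error = error {
        // UDP does not retransmit for you: the app decides whether to resend.
        print("send error: \(error)")
    }
}))
```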