How can I calculate delay, jitter and packet loss in a network using Java?
I have a client and a server. The client asks the user for the number of connections to open with the server, opens that many connections, and starts communicating. Each connection runs in its own thread and starts immediately. Please give me a hint on how I can calculate delay, jitter and packet loss in the network.
You can send a single request which the server replies to immediately, like a heartbeat. Do this many times and you can get an average delay, and use the standard deviation of the round-trip times as a simple measure of jitter (or you can calculate jitter another way from the same data).
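For example, a minimal client-side sketch of this idea, assuming a hypothetical echo server on localhost:9000 that replies to each byte immediately (the host, port, and sample count are placeholders, not anything from the question):

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;

public class DelayJitterProbe {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("localhost", 9000)) {
            socket.setTcpNoDelay(true);          // avoid Nagle batching the small writes
            OutputStream out = socket.getOutputStream();
            InputStream in = socket.getInputStream();

            int samples = 100;
            long[] rttNanos = new long[samples];

            for (int i = 0; i < samples; i++) {
                long start = System.nanoTime();
                out.write(1);                    // one-byte heartbeat request
                out.flush();
                in.read();                       // block until the echoed byte arrives
                rttNanos[i] = System.nanoTime() - start;
            }

            // Average round-trip delay
            double mean = 0;
            for (long rtt : rttNanos) mean += rtt;
            mean /= samples;

            // Standard deviation of the RTTs as a simple jitter measure
            double variance = 0;
            for (long rtt : rttNanos) variance += (rtt - mean) * (rtt - mean);
            double jitter = Math.sqrt(variance / samples);

            System.out.printf("avg delay: %.3f ms, jitter (stddev): %.3f ms%n",
                    mean / 1e6, jitter / 1e6);
        }
    }
}
```

Disabling Nagle's algorithm with `setTcpNoDelay(true)` matters here: otherwise the one-byte writes may be buffered before being sent, inflating the measured delay.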
There is no easy way to determine or estimate packet loss with TCP, but with UDP you can. However, I would leave monitoring packet loss to the network administrators, who should have tools to detect it. If they don't have them, why is it important to you?
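If you control both ends, one rough way to estimate loss over UDP is to send sequence-numbered datagrams and count how many come back. A sketch, assuming a hypothetical UDP echo server on localhost:9001 (address, port, timeout, and datagram count are all illustrative assumptions):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.SocketTimeoutException;
import java.nio.ByteBuffer;

public class UdpLossProbe {
    public static void main(String[] args) throws Exception {
        InetAddress server = InetAddress.getByName("localhost");
        int port = 9001;
        int total = 100;
        int received = 0;

        try (DatagramSocket socket = new DatagramSocket()) {
            socket.setSoTimeout(500);            // treat replies slower than 500 ms as lost

            for (int seq = 0; seq < total; seq++) {
                byte[] payload = ByteBuffer.allocate(4).putInt(seq).array();
                socket.send(new DatagramPacket(payload, payload.length, server, port));

                DatagramPacket reply = new DatagramPacket(new byte[4], 4);
                try {
                    socket.receive(reply);
                    received++;                  // an echo came back: count it as delivered
                } catch (SocketTimeoutException e) {
                    // no echo within the timeout: count this datagram as lost
                }
            }
        }

        double lossPercent = 100.0 * (total - received) / total;
        System.out.printf("sent %d, received %d, loss %.1f%%%n", total, received, lossPercent);
    }
}
```

Note that this measures round-trip loss: a datagram dropped in either direction, or a reply arriving after the timeout, counts as lost.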