
WCF service leaks handles and memory when a client times out

I've got a net.tcp WCF service running in a self-hosted environment. I've set up a forced scenario where the server runs a service method long enough that the client times out, e.g.:

[OperationContract]
public bool someMethod() {
    Thread.Sleep(60000); // hold the call long enough for the client to time out
    return true;
}

On the client:

public void callSomeMethod() {
     using (var proxy = getProxy()) {
         proxy.someMethod();
     }
}

When the client times out after 60 seconds, one or more handles are leaked on the server. Doing this repeatedly eventually crashes the server because it runs out of resources.

Note: on the client, I am intentionally not cleaning up with Close()/Abort() for this test. My theory is that if the client ever connects to the server but is interrupted before the connection can be cleaned up, the server will leak.
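
For reference, the cleanup this test deliberately skips would look roughly like the sketch below: Close() when the call succeeds, Abort() once the channel has timed out or faulted. getProxy() and the cast to ICommunicationObject are assumptions about how the proxy is built (both ClientBase<T> and ChannelFactory<T> channels support the cast), and it needs a using for System.ServiceModel.

public void callSomeMethodSafely() {
    var proxy = getProxy();
    try {
        proxy.someMethod();
        ((ICommunicationObject)proxy).Close();  // graceful shutdown after a successful call
    }
    catch (TimeoutException) {
        ((ICommunicationObject)proxy).Abort();  // channel is unusable; tear it down immediately
    }
    catch (CommunicationException) {
        ((ICommunicationObject)proxy).Abort();
    }
}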

One final word: when I look at Process Explorer, it shows that my leaks are TCP sockets.

Is there a better way to handle this server-side? I've seen references to using a ChannelFactory, but I'm not sure of the details (is that even server-side code?). If so, would it give me more control over making sure error states get cleaned up properly?

EDIT

I've since done some searching on ChannelFactory and see that it's a client-side feature, not a server-side one, so disregard that part. The question is still open, though: is there a way to clean up the leaked ports more reliably?
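
For what it's worth, one server-side lever I'm aware of is the set of timeouts on the binding itself; ReceiveTimeout in particular controls how long the server keeps an idle connection before dropping it. Below is a rough sketch of setting them on the self-host. SomeService, ISomeService, the address, and the timeout values are all placeholders, not a recommendation.

using System;
using System.ServiceModel;

class HostProgram {
    static void Main() {
        var binding = new NetTcpBinding();
        binding.ReceiveTimeout = TimeSpan.FromMinutes(1);  // drop idle connections after a minute
        binding.SendTimeout = TimeSpan.FromSeconds(30);
        binding.CloseTimeout = TimeSpan.FromSeconds(10);

        using (var host = new ServiceHost(typeof(SomeService))) {
            host.AddServiceEndpoint(typeof(ISomeService), binding, "net.tcp://localhost:8523/SomeService");
            host.Open();
            Console.WriteLine("Host running. Press Enter to exit.");
            Console.ReadLine();
        }
    }
}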


As it turns out, I was running my service from a [STAThread] entry point. Any time an app makes COM calls, it can leak memory unless it is running in [MTAThread] mode.
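
For anyone hitting the same thing, here is a minimal sketch of the hosting entry point with the attribute switched. SomeService and the base address are placeholders for the real service type and endpoint.

using System;
using System.ServiceModel;

class Program {
    [MTAThread]  // was [STAThread]; MTA avoids the COM-related leak described above
    static void Main() {
        using (var host = new ServiceHost(typeof(SomeService), new Uri("net.tcp://localhost:8523/"))) {
            host.Open();
            Console.WriteLine("Service running. Press Enter to exit.");
            Console.ReadLine();
        }
    }
}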
