SQL Server Physical Memory Grows After Execution of Commands
I am running a server application on Windows Server 2008 with SQL Server 2008. My scenario is as follows.
I have implemented custom connection pooling (why I did that is another long story), so I am not opening or closing a connection on each request.
The server application executes more than a thousand DbCommands per minute using the 10 connections available in the pool.
After executing a command I am not cleaning the connection (because I have no idea what to clean, or how to clean it without closing and reopening it), and I am also not disposing the command object itself.
When the server application shuts down, it releases all the connections by calling Close on them.
Now I observed that after a test run of one or two hours, the SQL Server process grows to around 3 GB of memory. Even after I shut down my server application, the occupied memory was not released or reduced (according to Task Manager and Resource Monitor); in the end I had to restart SQL Server to release the memory.
Now my questions are:
1. Do I need to call Dispose on the DbCommand object each time? If yes, will it have any impact on the connection object?
2. Is the problem described above the cause of what I observed, or are there other reasons as well?
3. Is there any way to clean the connection without closing it, and what kind of garbage does it need to clean up after each DbCommand execution?
Thanks, Mubashar
Then I shut down my server application even then the occupied memory was not released or reduced (according to task manager and Resource Monitor) [...]
Since closing your application disposes all objects, this observation makes it quite clear that the memory growth is not caused by your objects not being disposed. (Nevertheless, as David correctly pointed out, always disposing IDisposables is good practice.)
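To make that practice concrete, here is a minimal sketch (assuming the classic System.Data.SqlClient provider; the helper name and table are illustrative) of scoping each command and reader with `using` so they are disposed after every execution while the pooled connection stays open:

```csharp
using System.Data.SqlClient;  // assumes the classic SqlClient provider

static class QueryHelpers
{
    // Sketch: run one command on a pooled connection without closing it.
    // Disposing the command and reader releases their resources;
    // it does NOT close the underlying connection.
    public static int CountRows(SqlConnection conn)
    {
        using (SqlCommand cmd = conn.CreateCommand())
        {
            cmd.CommandText = "SELECT COUNT(*) FROM Orders"; // placeholder table
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                reader.Read();
                return reader.GetInt32(0);
            } // reader disposed here
        }     // cmd disposed here; conn stays open for the pool
    }
}
```

The key point for the question: `Dispose()` on a `DbCommand` or `DbDataReader` does not close or otherwise affect the connection it was created from.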
Thus, you should look at the memory configuration of your SQL Server instead. Note that if memory is available, SQL Server using that memory is a good thing; unused memory is a waste of money. Usually, SQL Server frees memory once some other process needs it:
When SQL Server is using memory dynamically, it queries the system periodically to determine the amount of free physical memory available. SQL Server grows or shrinks the buffer cache to keep free physical memory between 4 MB and 10 MB depending on server activity. This prevents Microsoft Windows NT® 4.0 or Windows® 2000 from paging. If there is less memory free, SQL Server releases memory to Windows NT 4.0 or Windows 2000 that usually goes on the free list. If there is more memory free, SQL Server recommits memory to the buffer cache. SQL Server adds memory to the buffer cache only when its workload requires more memory; a server at rest does not grow its buffer cache.
So, unless your server starts swapping excessively, I would not worry too much about SQL Server memory consumption.
SQL Server will use lots of memory and not give it back until necessary, which is why you should always run SQL Server on a dedicated machine. You can configure a maximum memory limit if you do have to use it in a shared environment, but there is no way your application can force SQL Server to release memory.
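For reference, that maximum can be set with `sp_configure` (a sketch; the 2048 MB value is just an example, and changing it requires appropriate server permissions):

```sql
-- Cap the memory SQL Server will take for its buffer pool (value in MB).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 2048;
RECONFIGURE;
```

This limits future growth; it does not depend on, and cannot be triggered by, the client application.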
1. Always Dispose() IDisposables.
2. No idea.
3. Use sp_reset_connection to reset the connection before returning it to the pool.
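One caveat: to my knowledge sp_reset_connection is normally invoked by the ADO.NET provider itself, via a flag on the wire, when it hands out a pooled connection; it cannot be executed as an ordinary batch from user code. A custom pool therefore has to approximate the reset by hand. A sketch (assuming System.Data.SqlClient; the cleanup step shown is an assumption to extend for your workload):

```csharp
using System.Data.SqlClient;  // assumes the classic SqlClient provider

static class ConnectionReset
{
    // Sketch: minimal manual cleanup before returning a connection to a
    // hand-rolled pool. The real sp_reset_connection does much more
    // (temp tables, SET options, open cursors, etc.).
    public static void ResetForPool(SqlConnection conn)
    {
        using (SqlCommand cmd = conn.CreateCommand())
        {
            // Roll back any transaction the previous user left open.
            cmd.CommandText = "IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;";
            cmd.ExecuteNonQuery();
        }
    }
}
```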
Connection pooling is controlled by parameters passed in the connection string, which basically comprise the following:
· Connect Timeout
· Min Pool Size
· Max Pool Size
· Pooling
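For illustration, these keywords sit directly in the connection string (server and database names here are placeholders):

```
Server=myServer;Database=myDb;Integrated Security=true;Pooling=true;Min Pool Size=5;Max Pool Size=100;Connect Timeout=15
```

Note that these keywords control ADO.NET's built-in pool; a hand-rolled pool like the one in the question would typically set `Pooling=false` so that two pools are not stacked on top of each other.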
If you see that your application generates a huge number of connections in a very short span, first check the application for open connections and make sure they are closed as soon as you finish working with them. A closed connection is not actually closed; it becomes available for reuse by another request to the server. Especially check your DataReader objects and close them as soon as they go out of scope. Developers sometimes use multiple DataReaders in nested loops and close them all in the finally clause; don't do this — close each DataReader as soon as you come out of its loop. Yes, you can still check the status of a DataReader object in the finally clause and, if it is not closed, close it there.
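That DataReader advice can be sketched like this (assuming System.Data.SqlClient; the query and method are illustrative):

```csharp
using System.Data.SqlClient;  // assumes the classic SqlClient provider

static class ReaderExample
{
    public static void ReadIds(SqlConnection conn)
    {
        SqlDataReader reader = null;
        try
        {
            using (SqlCommand cmd = conn.CreateCommand())
            {
                cmd.CommandText = "SELECT Id FROM SomeTable"; // placeholder query
                reader = cmd.ExecuteReader();
                while (reader.Read())
                {
                    // ... consume reader.GetInt32(0) ...
                }
                reader.Close(); // close as soon as the loop ends
            }
        }
        finally
        {
            // Safety net, as suggested above: check the status again.
            if (reader != null && !reader.IsClosed)
                reader.Close();
        }
    }
}
```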
Now try tweaking the timeout to a minimum, say 1 second. (Note that Connect Timeout governs how long the client waits when establishing a connection; the lifespan of a pooled connection is controlled by the separate Connection Lifetime keyword.) Set Max Pool Size to the maximum limit of your server — I guess it would be 100 if no other application were connected to the same server database. Set Min Pool Size to 5: with a low timeout, pooling only works efficiently if some connections are always available.
Try these settings and changes. Hopefully this will work to your satisfaction.
Thanks,
Rajeev Ranjan Lall