
High-performance database queries

I have a .NET desktop application that is used by 5,000 users distributed across Canada. One of its functions is to communicate with modems using parameters that we get from the database.

The unique things about this functionality are:

1 - It has to be extremely fast, because it communicates with modem tools that have a timeout we cannot control.

2 - The information we read from the database is very large: it holds parameters for all modem types, of which there are about 2,000.

3 - The information does not change very frequently; it changes maybe once per month.

What is the best approach to handle that?

I was thinking of requesting all of this information at application startup and keeping it in memory, but I am hesitant to do that because it might take time, since there is a ton of data.

Our standard at work is to use WCF services whenever we communicate with the database. I can get an exception to that rule, but at the same time it would be nice to stick to it if possible.

My question is: can we cache on the server side of the WCF service, so that if one client requests data for one modem type, it is in the cache for all later requests? Consider that we have multiple servers hosting the WCF services, so the cache becomes more complex in that setup.

Is a WCF cache (if it is possible) the best answer?

Can we gain any advantage if we bypass WCF and access the database directly?

Please advise; this is a very critical issue.


Why not let all clients have their own copy of the data? You stated yourself that it doesn't change very frequently. You could keep it in a lightweight database, such as SQLite, or in an XML file.

Then you could employ a versioning scheme: store a version number along with the local data. When the app starts, ask the central server for the current version number of the newest data via a web service. If the version numbers differ, download the new data set.

This way your application will also work if the central SQL server is down (or network connectivity is missing, latency is high, etc.).

Whenever performance is critical (as in, things don't work if an answer is not received within a time frame), I would really try to avoid depending on the network.
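
A minimal sketch of that version check in C#, assuming a hypothetical IParameterService WCF contract with GetDataVersion and GetAllParameters operations (those names are mine, not from the question); the local copy is kept in plain files here, but SQLite or an XML file would work the same way:

    using System.IO;

    // Sketch only: IParameterService and its operations are hypothetical names,
    // standing in for whatever contract the real WCF service exposes.
    public interface IParameterService
    {
        int GetDataVersion();        // version number of the newest data on the server
        string GetAllParameters();   // the full parameter set, e.g. as CSV
    }

    public class LocalParameterStore
    {
        private const string DataFile = "modem-parameters.dat";
        private const string VersionFile = "modem-parameters.version";

        private readonly IParameterService _service; // WCF client proxy

        public LocalParameterStore(IParameterService service)
        {
            _service = service;
        }

        // Called once at application startup: download only when the server has newer data.
        public void RefreshIfStale()
        {
            int localVersion = File.Exists(VersionFile)
                ? int.Parse(File.ReadAllText(VersionFile))
                : -1;

            int serverVersion = _service.GetDataVersion();
            if (serverVersion == localVersion)
                return; // the local copy is current, nothing to download

            File.WriteAllText(DataFile, _service.GetAllParameters());
            File.WriteAllText(VersionFile, serverVersion.ToString());
        }

        // Modem communication reads from the local copy, so it never waits on the network.
        public string LoadParameters()
        {
            return File.ReadAllText(DataFile);
        }
    }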


For any performance-critical application, I would say you need to look at the amount of data going across the wire and limit it wherever possible.

For WCF, take a look at the actual request structure and determine whether the request itself or the data within the request is a limiting factor.

Some pointers: stay away from anything that looks like XML. Use JSON notation if you have to, or even just a simple CSV record format if possible, for example: "value1","value2"...

If the data doesn't change very often, you can have the initial request pass in the version number of the data the client currently has and let the server(s) determine whether it is the latest or not. This can radically reduce the amount of data you have to send down. As a side note, this is how most caching systems work.
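
A sketch of what that could look like on the service side, with hypothetical contract and member names; the operation returns null when the client is already up to date, so in the common case only a small version number crosses the wire:

    using System.Runtime.Serialization;
    using System.ServiceModel;

    // Hypothetical contract: the names are illustrative, not from the original post.
    [ServiceContract]
    public interface IModemParameterService
    {
        // The client passes the version it already has; a null reply means "you are up to date".
        [OperationContract]
        ParameterUpdate GetParametersIfChanged(int clientVersion);
    }

    [DataContract]
    public class ParameterUpdate
    {
        [DataMember] public int Version { get; set; }
        [DataMember] public string Payload { get; set; } // e.g. a compact CSV record format
    }

    public class ModemParameterService : IModemParameterService
    {
        // In the real service these would come from the database or a server-side cache.
        private static readonly int CurrentVersion = 7;                    // illustrative value
        private static readonly string CurrentPayload = "\"value1\",\"value2\"";

        public ParameterUpdate GetParametersIfChanged(int clientVersion)
        {
            if (clientVersion == CurrentVersion)
                return null; // nothing changed, so send nothing back

            return new ParameterUpdate { Version = CurrentVersion, Payload = CurrentPayload };
        }
    }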


First off, I'd write this as you normally would. I suspect that this may not really be a problem at all, and the only way to check is to implement it and profile to see whether there really is a performance bottleneck here. I understand wanting to prevent modem hangs, but the timeouts on modem devices are not incredibly short, and they are often configurable to make them longer.

The information we read from the database is very large: it holds parameters for all modem types, of which there are about 2,000.

Modem parameters for 2,000 modem types does not seem like a huge amount of information; at most, I'd expect this to be a few megabytes' worth of data. If that is the case, it would be reasonable to load it into memory on the server to make the WCF service quicker.
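
If the whole set really is only a few megabytes, a server-side cache inside the WCF service can be as simple as a lazily-initialized static dictionary. This is just a sketch; LoadFromDatabase stands in for whatever data-access code already exists:

    using System;
    using System.Collections.Generic;

    // Sketch: one in-memory copy per WCF host process, loaded on first use and
    // kept until the process recycles (acceptable when the data changes monthly).
    public static class ParameterCache
    {
        private static readonly Lazy<Dictionary<string, string>> ByModemType =
            new Lazy<Dictionary<string, string>>(LoadFromDatabase, isThreadSafe: true);

        public static string GetParameters(string modemType)
        {
            return ByModemType.Value[modemType];
        }

        // Placeholder for the existing data-access layer,
        // e.g. SELECT ModemType, Parameters FROM ModemParameters.
        private static Dictionary<string, string> LoadFromDatabase()
        {
            return new Dictionary<string, string>();
        }
    }

With multiple WCF servers, each host simply keeps its own copy; since the data is read-only between the monthly updates, the copies cannot drift apart in any way that matters.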

That being said, if the client knows the type of modem(s) they will be using, they could always pull the information required for those modem types across the wire prior to their first usage. Not every client will be interfacing with all 2000 modem types, so only a few types need to be local. Since this changes infrequently, this could even be cached in a local application store, preventing the need to re-fetch the data.
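
A sketch of that per-modem-type approach on the client, assuming a hypothetical service call that returns the parameters for a single modem type (wrapped here as a delegate); results are kept in memory for the session and persisted to a local cache folder so later runs don't have to re-fetch them:

    using System;
    using System.Collections.Generic;
    using System.IO;

    public class ModemTypeCache
    {
        private readonly Dictionary<string, string> _inMemory = new Dictionary<string, string>();
        private readonly Func<string, string> _fetchFromService; // wraps the WCF call for one modem type
        private readonly string _cacheFolder;

        public ModemTypeCache(Func<string, string> fetchFromService, string cacheFolder)
        {
            _fetchFromService = fetchFromService;
            _cacheFolder = cacheFolder;
        }

        // Assumes modem type names are safe to use as file names.
        public string GetParameters(string modemType)
        {
            // 1. Already loaded in this session?
            string parameters;
            if (_inMemory.TryGetValue(modemType, out parameters))
                return parameters;

            // 2. Cached locally by a previous session?
            string cacheFile = Path.Combine(_cacheFolder, modemType + ".dat");
            if (File.Exists(cacheFile))
            {
                parameters = File.ReadAllText(cacheFile);
            }
            else
            {
                // 3. Fall back to the service, then persist locally for next time.
                parameters = _fetchFromService(modemType);
                Directory.CreateDirectory(_cacheFolder);
                File.WriteAllText(cacheFile, parameters);
            }

            _inMemory[modemType] = parameters;
            return parameters;
        }
    }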
