Architecture for high-throughput streaming data application in C#?
My current project: We are connecting to remote servers which are flooding us with health-related data over TCP/IP. The byte streams received are parsed into objects, and the objects are stored into a SQL Server database. Clients of the database expect to see data in near-real time.
The problem is that our application can't keep up with the maximum output of the remote server. We have profiled the application, and no methods seem to be overwhelmingly slow.
I'm assuming this problem has been solved at least a few times already. Does anyone have any suggestions, hints, or references concerning this problem?
Try breaking the operation down into its larger stages and looking for alternative, quicker approaches within each. For example, the individual data-parsing functions might not be slow, but parsing overall could be taking considerably longer than it should (maybe you should keep the bytes in structs, or use unsafe casting, for example). There could be issues with the DB insertion (are you using bulk insert?). Maybe the DB structure needs to change. Without more information, it's going to be hard for any of us to pinpoint one single issue.
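One common pattern for the DB-insertion side is to decouple parsing from writing with a producer/consumer queue, so rows can be flushed in batches (e.g. via `SqlBulkCopy`) instead of issued one `INSERT` at a time. Here's a minimal sketch, assuming a generic item type and a hypothetical flush callback where the bulk insert would go; the batch size and queue capacity are placeholders to tune for your workload:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

class BatchingWriter<T>
{
    // Bounded queue: if the DB falls behind, the network/parsing side
    // blocks instead of exhausting memory. Capacity is a placeholder.
    private readonly BlockingCollection<T> _queue =
        new BlockingCollection<T>(boundedCapacity: 100_000);
    private readonly int _batchSize;
    private readonly Action<List<T>> _flush;
    private readonly Task _consumer;

    public BatchingWriter(int batchSize, Action<List<T>> flush)
    {
        _batchSize = batchSize;
        _flush = flush;
        _consumer = Task.Run(ConsumeLoop);
    }

    // Called from the parsing thread for each object produced.
    public void Add(T item) => _queue.Add(item);

    // Drain remaining items and wait for the consumer to finish.
    public void Complete()
    {
        _queue.CompleteAdding();
        _consumer.Wait();
    }

    private void ConsumeLoop()
    {
        var batch = new List<T>(_batchSize);
        foreach (var item in _queue.GetConsumingEnumerable())
        {
            batch.Add(item);
            if (batch.Count >= _batchSize)
            {
                _flush(batch);               // e.g. SqlBulkCopy.WriteToServer(...)
                batch = new List<T>(_batchSize);
            }
        }
        if (batch.Count > 0) _flush(batch);  // final partial batch
    }
}
```

The key design point is that the TCP receive loop never waits on SQL Server directly; it only enqueues, and one consumer amortizes the per-round-trip cost across thousands of rows.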
Based on what you've said, I'm guessing the bottleneck is probably in your network I/O.
Q: Are you compressing the data?
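If you control both ends of the connection, wrapping the stream in `GZipStream` is a cheap way to cut bytes on the wire. A minimal round-trip sketch, using a `MemoryStream` to stand in for the socket (in the real application you would wrap the `NetworkStream` instead):

```csharp
using System.IO;
using System.IO.Compression;

static class Gzip
{
    public static byte[] Compress(byte[] data)
    {
        using var output = new MemoryStream();
        // GZipStream must be closed before reading the buffer,
        // so the final compressed block gets flushed.
        using (var gzip = new GZipStream(output, CompressionMode.Compress))
            gzip.Write(data, 0, data.Length);
        return output.ToArray();
    }

    public static byte[] Decompress(byte[] data)
    {
        using var input = new MemoryStream(data);
        using var gzip = new GZipStream(input, CompressionMode.Decompress);
        using var output = new MemoryStream();
        gzip.CopyTo(output);
        return output.ToArray();
    }
}
```

Whether this helps depends on the payload: text-like health telemetry usually compresses well, but it trades CPU for bandwidth, so it's only a win if the network really is the bottleneck.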