What about using RavenDB to replace indexes in SQL Server?
I'm working on a project with large tables (10 million records) in MS SQL Server, and we need to query them with very complex filters (spanning 10+ columns) and sort the results on different fields depending on the user's selection in the UI.
The queries are somewhat slow to run, as we need to use temporary tables inside our stored procedures to do all the filtering and sorting.
I'm thinking I could use a process that loads the part of the table used in the filter portion of the query into RavenDB, in effect replacing the stored procedures and indexes in SQL Server with a RavenDB layer I can query against.
Do you think I should spend more time investigating why the stored procedures are slow and optimize them, or do you think the RavenDB approach could give me more benefit?
It sounds as if you already have a lot of complexity and overhead in your architecture. Introducing a new technology seems unlikely to reduce the amount of complexity. So I certainly think you should investigate the possibility of optimizing your existing set-up.
One thing you should look at: indexed views. They would let you build the sort of query layer you describe, but with built-in SQL Server functionality, so it should be a lot easier to get working (if not quite so much fun!).
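As a minimal sketch of that idea (all table and column names here are hypothetical, not from your schema), an indexed view pre-joins and materializes the columns your filters touch, so SQL Server maintains the "query layer" for you:

```sql
-- Hypothetical schema. SCHEMABINDING and a unique clustered index are
-- required before SQL Server will materialize the view's result set.
CREATE VIEW dbo.OrderSearch
WITH SCHEMABINDING
AS
SELECT o.OrderId, o.OrderDate, o.Status, o.Total,
       c.CustomerName, c.Region
FROM dbo.Orders AS o
JOIN dbo.Customers AS c ON c.CustomerId = o.CustomerId;
GO

CREATE UNIQUE CLUSTERED INDEX IX_OrderSearch
    ON dbo.OrderSearch (OrderId);
GO

-- Additional nonclustered indexes can then cover the sort orders the UI offers:
CREATE NONCLUSTERED INDEX IX_OrderSearch_Date
    ON dbo.OrderSearch (OrderDate) INCLUDE (Status, Total, Region);
```

The UI query then filters and sorts against `dbo.OrderSearch` directly, and SQL Server keeps the materialized rows in sync with the base tables automatically.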
You should always profile to identify why something is slow before you attempt to make it faster.
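For SQL Server specifically, one way to start profiling (a sketch using the standard DMVs; results come from the plan cache, so they are approximate) is to list the most expensive cached statements:

```sql
-- Top cached statements by average elapsed time (microseconds).
SELECT TOP (10)
    qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
    qs.execution_count,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_elapsed_us DESC;
```

If the stored procedures in question dominate this list, their execution plans will usually show whether the temp-table work or a missing index is the real cost.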
It depends on your data. If you have difficult queries, you can denormalize the data into a presentation-layer structure, so that reads need no joins, only sorting.
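A hedged sketch of that denormalization idea (names are hypothetical): maintain a flat read-only table refreshed from the normalized source, so every UI query becomes a single-table filter and sort:

```sql
-- Flat table for the presentation layer, refreshed periodically
-- (e.g. by a scheduled job) from the normalized source tables.
CREATE TABLE dbo.OrderReadModel (
    OrderId      INT           NOT NULL PRIMARY KEY,
    CustomerName NVARCHAR(200) NOT NULL,
    Region       NVARCHAR(50)  NOT NULL,
    OrderDate    DATE          NOT NULL,
    Status       TINYINT       NOT NULL,
    Total        DECIMAL(18,2) NOT NULL
);

-- UI queries then filter and sort one table, no joins:
-- SELECT TOP (50) OrderId, CustomerName, OrderDate, Total
-- FROM dbo.OrderReadModel
-- WHERE Region = @Region AND Status = @Status
-- ORDER BY OrderDate DESC;
```

This trades storage and refresh latency for simple, index-friendly reads, which is essentially what moving the data into RavenDB would buy you, but without adding a second technology.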