504 with big resultset in RIA Domain Service

I am building a RIA Domain Service that needs to query a table with lots of data. While working on this, I've realized that the table size can be a real issue.

For example:

return this.ObjectContext.Logs;

fails with a 504 error, whereas:

return this.ObjectContext.Logs.Take(10);

works.

As you can see, I am working on a table of logs, which includes a "username" field. I need to retrieve all distinct usernames from the table, so I set up this method:

public IEnumerable<string> GetUsers()
{
    var users = (from l in this.ObjectContext.Logs
                 select l.Username).Distinct().Take(10);

    return users;
}

I am trying to execute all the necessary operations on the server-side, in order to reduce the transferred information (and relative delay) to the minimum.
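
For completeness, here is roughly how the whole thing is wired up (the domain service class, context, and entity names below are placeholders for my actual ones):

using System.Collections.Generic;
using System.Linq;
using System.ServiceModel.DomainServices.EntityFramework;
using System.ServiceModel.DomainServices.Hosting;
using System.ServiceModel.DomainServices.Server;

// "LogEntities" stands in for my actual EF object context.
[EnableClientAccess]
public class LogDomainService : LinqToEntitiesDomainService<LogEntities>
{
    // Invoke operation: the projection, Distinct() and Take() are all
    // translated to SQL and executed in the database, so only a handful
    // of strings should ever cross the wire.
    [Invoke]
    public IEnumerable<string> GetUsers()
    {
        return this.ObjectContext.Logs
                   .Select(l => l.Username)
                   .Distinct()
                   .Take(10)
                   .ToList(); // materialize on the server
    }
}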

However, even with that query I get a 504 error. In Fiddler I see: "ReadResponse() failed: The server did not return a response for this request."

Initially I thought this was a transport issue (perhaps the size of the transferred data was too big), so I set a breakpoint on "return users". I ran the code and the debugger stopped on the "return users" line. I clicked on "Results View", but at that point it was unable to display any results and just showed an error.

I'd like to report the exact error here, but suddenly Visual Studio no longer stops on the DomainService breakpoints and only stops on the Silverlight project's ones. I am investigating this, as it seems the domain service is not being called anymore (Fiddler no longer shows any calls to the RIA domain service).

Is there anybody who can assist me in solving these issues? I am really struggling to get this to work.

Thanks in advance, Cheers, Gianluca


You can control the size of the RIA Services response by editing your config file (maxItemsInObjectGraph):

<system.serviceModel>
  <serviceHostingEnvironment aspNetCompatibilityEnabled="true"
    multipleSiteBindingsEnabled="true" />
  <services>
    <service name="ModuleDossier.DSDossier" behaviorConfiguration="DSDossier_BehaviorConfig"></service>
  </services>
  <behaviors>
    <serviceBehaviors>
      <behavior name="DSDossier_BehaviorConfig">
        <dataContractSerializer maxItemsInObjectGraph="2147483647"/>
      </behavior>
      <behavior name="DSDossier_BehaviorConfig1">
        <dataContractSerializer maxItemsInObjectGraph="2147483647"/>
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>


Ok, so after a while, here I am to provide an answer to my own question.

There is nothing wrong with RIA Services or EF: long operations will hit time and size limits, and that is normal and correct. In my case, we are talking about a table that contains 32M rows.

We are using PostgreSQL 9.0. The table was already indexed, but when the number of returned rows is high, the indexes are not used. To overcome the performance issues we met, we undertook a major refactoring of both the database and the application: tables were partitioned, support tables with counters were created and are now used in place of COUNT operations, auto-vacuum was enabled, more indexes were added on the most meaningful and frequently searched fields, and, importantly, queries were reviewed to shrink the returned recordsets (a sketch of that last point follows below).
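
As an example of that last point, here is a minimal sketch of what the reviewed GetUsers can look like, assuming a hypothetical small support table (exposed here as LogUsernames, one row per distinct username, maintained when rows are inserted into Logs):

using System.Collections.Generic;
using System.Linq;
using System.ServiceModel.DomainServices.Server;

// Sketch only: "LogUsernames" is an assumed support entity set, not part
// of the original schema shown above.
[Invoke]
public IEnumerable<string> GetUsers()
{
    // Reads a handful of rows instead of scanning the 32M-row Logs table.
    return this.ObjectContext.LogUsernames
               .Select(u => u.Username)
               .ToList();
}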

The new application, which is running in a dev environment with only a few resources available, answers any request within a couple of seconds, even though the database is growing by 2-4M rows per month.

That's all. I hope this can help someone else.

Cheers, Gianluca.
