Problem writing XML to a file with .NET MVC - timeout?

Hey, so I'm having an issue with writing out to an XML file. It works fine for single requests via the browser, but when I use something like Charles to perform 5-10 repeated requests concurrently, several of them fail. The trace simply shows a 500 error with no content inside; I think they start timing out waiting for write access or something...

This method is inside my repository class. I've also tried making the repository instance a singleton, but it doesn't appear to make any difference.

Any help would be much appreciated. Cheers

    public void Add(Request request) {
        try {
            XDocument requests;
            XmlReader xmlReader;
            using (xmlReader = XmlReader.Create(_requestsFilePath)) {
                requests = XDocument.Load(xmlReader);
                XElement xmlRequest = new XElement("request",
                            new XElement("code", request.code),
                            new XElement("date", request.date),
                            new XElement("email", new XCData(request.email)),
                            new XElement("name", new XCData(request.name)),
                            new XElement("recieveOffers", request.recieveOffers)
                            );
                requests.Root.Element("requests").Add(xmlRequest);
                xmlReader.Close();
            }
            requests.Save(_requestsFilePath);
        } catch (Exception ex) {
            HttpContext.Current.Trace.Warn("Error writing to file: "+ex);
        }
    }


There are a couple of problems with your design here. First off, under concurrent operations it is fine for multiple threads to read from the same source at the same time, but when a write occurs you need to block all other reads and writes; otherwise you run into race conditions.
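
One way to deal with that here, assuming this repository is the only thing writing the file and the app runs in a single process, is to serialize the whole load-modify-save sequence behind one static lock. A minimal sketch (the RequestRepository class name and constructor are placeholders; the fields and element names come from your code):

    using System.Xml.Linq;

    public class RequestRepository {
        // Static so every repository instance shares the same lock,
        // even if MVC creates a new repository per request.
        private static readonly object _fileLock = new object();

        private readonly string _requestsFilePath;

        public RequestRepository(string requestsFilePath) {
            _requestsFilePath = requestsFilePath;
        }

        public void Add(Request request) {
            // Only one thread at a time may load, modify and save the file,
            // so concurrent requests queue up instead of colliding.
            lock (_fileLock) {
                XDocument requests = XDocument.Load(_requestsFilePath);
                XElement xmlRequest = new XElement("request",
                    new XElement("code", request.code),
                    new XElement("date", request.date),
                    new XElement("email", new XCData(request.email)),
                    new XElement("name", new XCData(request.name)),
                    new XElement("recieveOffers", request.recieveOffers));
                requests.Root.Element("requests").Add(xmlRequest);
                requests.Save(_requestsFilePath);
            }
        }
    }

Note that the lock only serializes access within one process; if other processes can write the same file (or the site runs as a web garden), you would need OS-level locking instead, e.g. opening the file with FileShare.None.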

The first thing you should change is getting rid of the catch-all Exception handler, because it is hiding the full error from you. The problem could even be an out-of-memory error caused by loading the same data set 5-10 times at once. Unless you know what the actual error is, you won't be able to address it correctly, and guessing won't help.
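
If you still want a trace entry, log and rethrow rather than swallowing the exception, so the real exception and stack trace still reach the error page or whatever logging you have in place. A minimal sketch of just the catch block:

    public void Add(Request request) {
        try {
            // ... load, modify and save the XML as above ...
        } catch (Exception ex) {
            HttpContext.Current.Trace.Warn("Error writing to file: " + ex);
            throw;  // rethrow so the real error surfaces instead of a blank 500
        }
    }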

And if Stack Overflow stops timing out on me, I will copy and paste my answer here!
