
StreamWriter not writing out the last few characters to a file

We are having an issue with one server and its use of the StreamWriter class. Has anyone experienced something similar to the issue below? If so, what was the solution?

  using( StreamWriter logWriter = File.CreateText( logFileName ) )
  {
    for (int i = 0; i < 500; i++)
      logWriter.WriteLine( "Process completed successfully." );
  } 

When writing out the file the following output is generated:

  Process completed successfully.
  ...  (497 more lines)
  Process completed successfully.
  Process completed s

I tried adding logWriter.Flush() before the close, without any help. The more lines of text I write out, the more data is lost.


Had a very similar issue myself. I found that if I enabled AutoFlush before doing any writes to the stream, it started working as expected: logWriter.AutoFlush = true;
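A minimal sketch of that approach, using the loop from the question (the file name is illustrative):

```csharp
using System.IO;

// Hypothetical file name for illustration.
string logFileName = "process.log";

using (StreamWriter logWriter = File.CreateText(logFileName))
{
    // Enable AutoFlush before the first write so every WriteLine
    // is pushed through to the underlying stream immediately.
    logWriter.AutoFlush = true;

    for (int i = 0; i < 500; i++)
        logWriter.WriteLine("Process completed successfully.");
}
```

Note that AutoFlush flushes after every write, so it trades some throughput for safety.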


Sometimes even if you call Flush(), it won't do the magic, because Flush() makes the stream write most of the data in its buffer but can leave the last block behind.

StreamWriter stream = File.CreateText(logFileName);
try
{
    // ... write method
    // I don't recommend using 'using' for unmanaged resources
}
finally
{
    stream.Flush();
    stream.Close();
    stream.Dispose();
}


Cannot reproduce this.

Under normal conditions, this should not and will not fail.

  • Is this the actual code that fails? The text "Process completed" suggests it's an extract.
  • Any threading involved?
  • Network drive or local?
  • etc.


This certainly appears to be a "flushing" problem to me, even though you say you added a call to Flush(). The problem may be that your StreamWriter is just a wrapper for an underlying FileStream object.

I don't typically use the File.CreateText method to create a stream for writing to a file; I usually create my own FileStream and then wrap it with a StreamWriter if desired. Regardless, I've run into situations where I've needed to call Flush on both the StreamWriter and the FileStream, so I imagine that is your problem.

Try adding the following code:

            logWriter.Flush();
            if (logWriter.BaseStream != null)
                logWriter.BaseStream.Flush();
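A sketch of the pattern described above, creating the FileStream explicitly, wrapping it in a StreamWriter, and flushing both layers (the file name is illustrative):

```csharp
using System.IO;

// Hypothetical file name for illustration.
string logFileName = "process.log";

using (FileStream fileStream = new FileStream(logFileName, FileMode.Create))
using (StreamWriter logWriter = new StreamWriter(fileStream))
{
    for (int i = 0; i < 500; i++)
        logWriter.WriteLine("Process completed successfully.");

    // Flush the writer's buffer into the FileStream, then the
    // FileStream's buffer down to the operating system.
    logWriter.Flush();
    fileStream.Flush();
}
```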


In my case, this is what I found with the output file.

Case 1: Without Flush() and without Close()

Character length = 23,371,776

Case 2: With Flush() and without Close()

logWriter.Flush()

Character length = 23,371,201

Case 3: When properly closed

logWriter.Close()

Character length = 23,375,887 (required)

So, in order to get the correct result, you always need to close the writer instance.


I faced the same problem.

The following worked for me:

using (StreamWriter tw = new StreamWriter(@"D:\Users\asbalach\Desktop\NaturalOrder\NatOrd.txt"))
{
    tw.Write(abc.ToString());// + Environment.NewLine);
}


Using framework 4.6.1 and under heavy stress it still has this problem. I'm not sure why it does this, though I found a way to solve it very differently (which strengthens my feeling it's indeed a .NET bug).

In my case I tried to write huge jagged arrays to disk (video caching). Since the jagged array is quite large, it had to do a lot of repeated writes to store a large set of video frames, and although they were uncompressed and each cache file got exactly 1000 frames, the logged cache files all had different sizes.

I had the problem when I used this:

// note: generateLogfileName is just a function to create a filename

using (FileStream fs = new FileStream(generateLogfileName(), FileMode.OpenOrCreate))
{
  using (StreamWriter sw = new StreamWriter(fs))
  {
    // do your stuff, but it will be unreliable
  }
}

However, when I provided it an Encoding type, all logged files got an equal size, and the problem was gone.

using (FileStream fs = new FileStream(generateLogfileName(), FileMode.OpenOrCreate))
{
  using (StreamWriter sw = new StreamWriter(fs, Encoding.Unicode))
  {
    // all data written correctly, no data lost.
  }
}

Note: also read the file with the same encoding type!
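A sketch of that round trip, writing and reading back with a matching encoding (the file name and content are illustrative):

```csharp
using System.IO;
using System.Text;

// Hypothetical file name for illustration.
string cacheFileName = "frames.log";

using (FileStream fs = new FileStream(cacheFileName, FileMode.Create))
using (StreamWriter sw = new StreamWriter(fs, Encoding.Unicode))
{
    // Writer is given an explicit encoding (UTF-16 LE here).
    sw.WriteLine("frame data");
}

// Read the file back with the same encoding it was written with.
string text = File.ReadAllText(cacheFileName, Encoding.Unicode);
```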


This did the trick for me:

streamWriter.Flush();