
Is it bad to open() and close() in a loop if speed isn't an issue?

I modified another programmer's Perl script I use so that it writes logs. The script goes through files, and for every file it processes I open() the log, print to it, and then close() it. This happens many times. I do this to make sure I don't lose any data if the script hangs (it eventually starts doing that, and I'm not knowledgeable enough to fix it), so I don't have a good alternative to repeating open() and close() in that loop.
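For concreteness, the pattern being described might look something like this (the log file name, the glob pattern, and the processing loop are all hypothetical stand-ins for the real script):

```perl
use strict;
use warnings;

# Hypothetical sketch of the pattern described above: re-open the log
# in append mode, write one line, and close it again for every file.
sub log_line {
    my ($message) = @_;
    open my $log, '>>', 'log.txt'
        or die "Unable to open log.txt for appending: $!";
    print {$log} "$message\n";
    close $log or die "Unable to close log.txt: $!";
}

for my $file (glob '*.dat') {    # stand-in for the real file loop
    # ... process $file here ...
    log_line("processed $file");
}
```

Because the handle is closed after every entry, each line is on disk before the next file is touched, which is exactly the crash-safety the question is after.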

My main question is this: the script is for personal use, so a speed reduction is not an issue. But could other bad things follow from this likely improper use of open/close? It may sound like a stupid question, but is it possible this would wear my hard disk down faster, or am I misunderstanding how file handling works?

Thanks in advance.


As others have mentioned, there is no issue here other than performance (and arguably cleanliness of code).

However, if you are merely worried about "losing data if Perl hangs up", just set autoflush on the file handle:

use IO::Handle;
open my $log, '>', 'log.txt'
    or die "Unable to open log.txt for writing: $!";
$log->autoflush(1);

Now every print to $log is flushed automatically, so there is no need to keep reopening and closing the file.

See `perldoc IO::Handle` (or the `$|` entry in `perldoc perlvar`) for more information on autoflush.


In general it's better to hold files and other resources open for as short a time as possible, and files are no exception. The two things you could run into are file locking and performance.

File locking could come about if something else is accessing your file at the same time.
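If that ever becomes a concern (say, two copies of the script appending to the same log), Perl's flock can serialize the writers. A minimal sketch, assuming the log is named log.txt:

```perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Sketch: take an exclusive lock before appending, so two processes
# writing to the same log cannot interleave partial lines.
open my $log, '>>', 'log.txt'
    or die "Unable to open log.txt for appending: $!";
flock($log, LOCK_EX) or die "Unable to lock log.txt: $!";
print {$log} "a locked log entry\n";
close $log;    # closing the handle also releases the lock
```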

Performance, as you mentioned, isn't a huge concern.

Open/close overhead is tiny in absolute terms anyway; it only becomes noticeable under high concurrency or hundreds of thousands of operations.
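If you ever want to measure the difference on your own system, the core Benchmark module can compare the two approaches directly. A rough sketch (file names and iteration count are placeholders):

```perl
use strict;
use warnings;
use IO::Handle;
use Benchmark qw(cmpthese);

# Keep one handle open for the "persistent" case, with autoflush on
# so each entry still reaches the OS immediately.
open my $persistent, '>>', 'bench_persistent.log'
    or die "Unable to open bench_persistent.log: $!";
$persistent->autoflush(1);

# cmpthese prints a table comparing the two subs' rates.
cmpthese(5000, {
    reopen => sub {
        open my $fh, '>>', 'bench_reopen.log' or die $!;
        print {$fh} "entry\n";
        close $fh;
    },
    persistent => sub {
        print {$persistent} "entry\n";
    },
});

close $persistent;
unlink 'bench_reopen.log', 'bench_persistent.log';
```

The exact numbers depend on the OS and filesystem, but the gap is usually only worth worrying about at volumes far beyond a personal logging script.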


The OS schedules and caches hard-drive access, so repeated open() and close() calls won't wear the disk; opening and closing lots of files is fine. The one thing to watch for: if your script hangs while it still holds an open handle and you edit the file manually, you could lose data when the script resumes, but that's a pretty rare scenario. And if the script crashes outright, the OS releases its file descriptors anyway, so there's no lingering problem as far as I can tell.
