Single big file for PHP or lots of small ones?
I have a server I am trying to optimize for speed (PHP/MySQL). From a speed standpoint, is it better to keep one big file with all my functions in it, so there is only one file to include? Or is it better to have many small, task-specific includes that are pulled in depending on the job at hand (that approach results in 4, 5, even 6 files to include)?
Moreover, I use the PEAR/MDB2 library for database access. Does this add any overhead compared to plain PHP calls? (Including PEAR/MDB2 means at least one extra include.)
Thank you all in advance!
Speed-wise, it won't matter much. Function bodies are not executed unless they are called, just like any other code in your file that isn't reached. Everything does go through the parser, but the time involved is far too small for you to ever notice.
What is more important is accessibility and maintainability. Dividing your code into different files makes that a lot easier for you, for your colleagues, and for your text editor (working in huge files with syntax highlighting can get really slow).
One thing you could take a look at, though, is PHP's autoloading. This lets you use any class (and its methods) without having to include its file explicitly in the script. That saves lines of code, and a file is only included and loaded when it is actually needed. But apart from being very handy code-wise, it won't make things noticeably faster or slower.
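For illustration, a minimal autoloading sketch using spl_autoload_register() might look like this; the classes/ directory and the Foo class are hypothetical names for this example:

    <?php
    // Hypothetical sketch: register an autoloader so classes load on demand.
    // Assumes one class per file under a classes/ directory; both names
    // are made up for this example.
    spl_autoload_register(function ($class) {
        $file = __DIR__ . '/classes/' . $class . '.php';
        if (is_file($file)) {
            require $file; // read only the first time the class is needed
        }
    });

    $foo = new Foo(); // triggers the autoloader; no explicit include above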
It depends on the size of the files and functions. For small scripts, I'd say: go for one file, for simplicity and to make it easy to copy around, for example.
Think about it this way:
Every read of a small, separate file means a disk operation, and every disk operation is comparatively slow, even on servers.
Only one read is faster. BUT if that one file is several megabytes big, the server has to read all of it every time (and takes a "long" time doing so), while only a small part of it is actually used for the task at hand. That looks like a big waste of resources, especially RAM.
Of course this ignores caching, virtual file systems and the like, but the argument still stands.
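If you want to measure rather than guess, a rough microtime() comparison is easy to sketch. Here big.php and part1.php through part5.php are hypothetical stand-ins for your own includes; run it several times, since the OS cache mentioned above skews the first run:

    <?php
    // Rough, unscientific timing sketch: one big include vs. five small ones.
    // big.php and part1.php .. part5.php are hypothetical files.
    $start = microtime(true);
    include 'big.php';
    printf("one big file:     %.6f s\n", microtime(true) - $start);

    $start = microtime(true);
    for ($i = 1; $i <= 5; $i++) {
        include "part{$i}.php";
    }
    printf("five small files: %.6f s\n", microtime(true) - $start);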
And ultimately, ask yourself this: would you rather scroll through thousands of lines of code searching for the function you have to change or repair? Or would you rather have everything neatly separated into small, clear files?
My projects usually consist of dozens of files, some small, some huge. Years ago I included them all myself, which caused two problems: a really long list of includes (most of them redundant for most tasks) and dependency problems (the code was very sensitive to loading order). Then I tried a simple autoloader.
See http://pl2.php.net/manual/en/ref.spl.php
It made life easier, since it auto-magically included only the necessary files. The default spl_autoload is very fast, but it lowercases class names, which causes problems with upper-case letters. Additionally, it requires you to name each file exactly after its class, which can be a problem if you decide to put more than one class in a file (sometimes there's a reason for that). So I wrote my own autoloader in PHP, which does fairly advanced class mapping. At first I expected it to slow my app down significantly. I ran some tests, and here's what I found.
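The author's own autoloader isn't shown, but a minimal class-map approach, which sidesteps both the lowercasing issue and the one-class-per-file restriction, could look something like the sketch below; all the class names and paths in the map are made up for illustration:

    <?php
    // Minimal class-map autoloader sketch: class names map explicitly to
    // files, so case doesn't matter and one file may hold several classes.
    // All names and paths below are hypothetical.
    $classMap = array(
        'DbLayer'    => __DIR__ . '/lib/database.php',
        'DbResult'   => __DIR__ . '/lib/database.php', // two classes, one file
        'HtmlHelper' => __DIR__ . '/lib/html.php',
    );

    spl_autoload_register(function ($class) use ($classMap) {
        if (isset($classMap[$class])) {
            require_once $classMap[$class]; // _once: shared files load once
        }
    });

Using require_once here guards against loading database.php twice when either of its classes happens to trigger the loader first.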
A file include is nothing compared to normal data processing. It's almost insignificant, since every file is included exactly once, while any trivial foreach can run thousands of times. Note also that repeated file requests don't really hit the server's hard disk; they are served from the OS's in-memory file cache, which takes far less time than any PHP loop (the OS-level calls are written in C).
My opinion: forget about it. Do not care. The only thing that can really cause a performance impact is loading too many redundant files: they use the server's memory, and that can matter under heavy load. Use an autoloader, and you'll forget that include and require even exist.