Does the size of the EXE affect the execution speed?
I'm going to create some command-line tools that make use of some large library DLLs. For security reasons I plan to embed the DLLs in the command-line tool's EXE.
Example: suppose the CL's (command-line tool's) functionality is just to copy a file from A to B. The procedure for this is included in a 100 MB library DLL. If I took just those lines of code out of the DLL and pasted them into the CL's code, the CL would only be 10 KB. But I don't want to do that, so I embed the full library in the CL's EXE, which makes it 101 MB in size.
Please be aware that the above is just an example. I once read somewhere (cannot remember where) that Windows would only use the part of the EXE that's actually used. So if that's true then it shouldn't matter if the EXE size is 10 KB, 100 MB or 1 GB. I don't know if that is true, so that's why I'm asking this question.
I own the code of the DLL, so if the best solution is not to include the whole DLL but to link in only those code files from the DLL project that the CL actually uses, then I will go that way.
So the question is: will the 10 KB CL run faster and consume less memory than the 101 MB CL?
First of all, if you're embedding the extra dll into the executable for security reasons, then don't.
If the program can unpack it, anyone else can, so you're only fooling yourself if you think this will improve security — unless it is job security you're talking about.
Secondly, I suspect the underlying question here is quite a bit harder to answer than others might think.
If you had used a regular non-managed executable and a non-managed dll, then portions of those files would be reserved in memory space when you start the program, but only the actual bits of it you use will be mapped into physical memory.
In other words, the actual amount of physical memory the program would consume would be somewhat proportional to the amount of code you use from them and how that code is spread out. I say "somewhat" because paging into physical memory is done on a per-page basis, and pages have a fixed size. So a 100-byte function might end up mapping a 4 KB or 8 KB page (I don't recall the exact page sizes any more) into memory. If you call a lot of such functions, and they're spread out over the address space of the code, you might end up mapping in a lot of physical memory for little code.
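You can see the same demand-paging behaviour from user code. A sketch in Python (illustrative only; Windows applies the same principle to EXE/DLL images it maps): create a large sparse file, map it, and touch a single byte — only the touched page needs to become resident, not the whole 100 MB.

```python
import mmap
import os
import tempfile

page_size = mmap.PAGESIZE            # commonly 4096 bytes
path = os.path.join(tempfile.mkdtemp(), "demo_paging.bin")

with open(path, "wb") as f:
    f.truncate(100 * 1024 * 1024)    # 100 MB file, no data actually written

with open(path, "rb") as f:
    view = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    first_byte = view[0]             # faults in just one page, not 100 MB
    view.close()

os.remove(path)
```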
When it comes to managed assemblies, the picture changes somewhat. Managed code isn't mapped directly into physical memory the same way (note: I'm fuzzy on the exact details here) because the code is not ready to run. Before it can be run it has to be JITted, and the JITter only JITs code on a need-to-jit basis.
In other words, if you include a humongous class in the assembly, but never use it, it might never end up being JITted and thus not use any memory.
So is the answer "no", as in it won't use more memory?
I have no idea. I suspect that there will be some effect of a larger assembly, more metadata to put into reflection tables or whatnot.
However, if you intend to place it into the executable, you either need to unpack it to disk before loading it (which would "circumvent" your "security features"), or unpack it into memory (which would require all of those 100 MB to sit in physical memory).
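A hypothetical sketch of that second option (the marker and names are made up): a payload appended to the EXE after a marker. "Unpacking into memory" means the entire payload is read into RAM at once — there is no demand paging for bytes you copy yourself. It also shows why embedding adds no security: anyone can perform the same split.

```python
import os
import tempfile

MARKER = b"---EMBEDDED-DLL---"
exe_path = os.path.join(tempfile.mkdtemp(), "demo_exe.bin")
payload = b"\x90" * 1024             # stand-in for the embedded library

# Build the "EXE": the tool's own code, a marker, then the payload.
with open(exe_path, "wb") as f:
    f.write(b"...machine code of the CL itself...")
    f.write(MARKER)
    f.write(payload)

# Unpacking into memory: the whole 'DLL' now occupies RAM at once.
data = open(exe_path, "rb").read()
embedded = data.split(MARKER, 1)[1]
os.remove(exe_path)
```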
So if you're concerned about using a lot of memory, here's my advice:
- Try it, see how bad it is
- Decide if it is worth it
- And don't embed the extra assembly into the executable
Will the smaller one run faster and consume less memory? Yes.
Will it be enough to make a difference? Who knows? If done wrong, the big one might take up about 100 MB more memory (three guesses where I got that amount from).
But it sure seems awfully silly to include 100MB of 'stuff' that isn't needed...
EDIT: My "Yes" at the top here should be qualified with "infinitesimally so", and incidentally so. See comments, below.