We are planning an app for BlackBerry App World and will have to load lots of textures and audio files.
I've recently been playing with code for an iPhone app to parse XML. Sticking to Cocoa, I decided to go with the NSXMLParser class. The app will be responsible for parsing 10,000+ "computers", all
If a class has the SerializableAttribute and its object is serialized with BinaryFormatter, is the serialized size equal to the size it occupies in memory?
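One way to see for yourself is to measure the stream that BinaryFormatter actually produces. A minimal sketch, assuming a .NET Framework console project where BinaryFormatter is still usable (it is marked obsolete in newer .NET); the Test type and its fields are hypothetical, and the point is only that the payload carries assembly, type, and field-name metadata, so it normally differs from the in-memory footprint:

    using System;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    [Serializable]
    class Test
    {
        public int Number;    // 4 bytes of field data in memory
        public double Value;  // 8 bytes of field data in memory
    }

    class Program
    {
        static void Main()
        {
            var obj = new Test { Number = 42, Value = 3.14 };

            using (var stream = new MemoryStream())
            {
                new BinaryFormatter().Serialize(stream, obj);
                // The serialized form also contains assembly/type names and
                // field names, so it is usually larger than the 12 bytes of
                // raw field data the object holds in memory.
                Console.WriteLine("Serialized size: " + stream.Length + " bytes");
            }
        }
    }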
Why does memory usage decrease when you minimize an application? I noticed this while running a Flash application in IE. It was taking around 200 MB of memory, but when I minimized IE it came down to
Recently I've found myself testing an application in Froglogic's Squish, using Python to create test scripts. Just the other day, the question of how much memory the program is using
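One way to get at that number from a Python test script is to ask the operating system for the process's resident set size. A minimal sketch, assuming the psutil package is importable from the Python interpreter Squish uses and that you can obtain the application's PID (psutil and the PID lookup shown in the comment are assumptions, not Squish APIs):

    import psutil

    def report_memory(pid):
        """Print the resident set size (RSS) of the process with the given PID."""
        rss_bytes = psutil.Process(pid).memory_info().rss
        print("Resident memory: %.1f MB" % (rss_bytes / (1024.0 * 1024.0)))

    # Hypothetical usage inside a Squish test, if the application context exposes a pid:
    # report_memory(currentApplicationContext().pid)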
public static void Main() { Test t1 = new Test(); } When will t1 (the reference variable) get memory: at compile time or at run time?
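A minimal sketch of the same code with comments on when each piece gets memory; Test is assumed to be an ordinary empty class, and the comments describe the usual CLR behaviour:

    class Test { }

    class Program
    {
        public static void Main()
        {
            // At compile time nothing is allocated; the compiler only emits IL.
            // At run time, when Main executes:
            //   - space for the local reference t1 is reserved in Main's stack
            //     frame (or a register),
            //   - new Test() allocates the Test object on the managed heap,
            //   - t1 is then set to refer to that heap object.
            Test t1 = new Test();
        }
    }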
Imagine I'm in C-land, and I have void* my_alloc(size_t size); void* my_free(void*); then I can go through my code and replace all calls to malloc/free with my_alloc/my_free.
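A minimal sketch of what such a swap might look like, assuming the goal is simply to interpose on allocation; the live-allocation counter is a hypothetical addition for illustration, and my_free is written with a void return here rather than the void* in the prototype above:

    #include <stdio.h>
    #include <stdlib.h>

    static size_t live_allocations = 0;  /* hypothetical bookkeeping */

    void *my_alloc(size_t size)
    {
        void *p = malloc(size);
        if (p != NULL)
            live_allocations++;
        return p;
    }

    void my_free(void *p)
    {
        if (p != NULL)
            live_allocations--;
        free(p);
    }

    int main(void)
    {
        int *xs = my_alloc(16 * sizeof *xs);
        my_free(xs);
        printf("live allocations: %zu\n", live_allocations);
        return 0;
    }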
I am developing an application that is, to put it simply, a niche-based search engine. Within the application I have included a function crawl() which crawls a website and then uses the collectData() f
I detach a thread that calls my method, which has a while-loop. Even though I have the objects marked for autorelease, I release them manually, since the while-loop can continue for some time.
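For reference, a minimal sketch of the usual pattern under manual reference counting: create and drain an autorelease pool per iteration, so autoreleased objects from each pass are reclaimed while the loop keeps running. The method and flag names are hypothetical:

    #import <Foundation/Foundation.h>

    // Entry point for the detached thread, e.g.
    // [NSThread detachNewThreadSelector:@selector(workerLoop) toTarget:self withObject:nil];
    - (void)workerLoop
    {
        while (self.keepRunning) {        // keepRunning is a hypothetical flag
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

            // ... one iteration of work; objects you own are still released manually ...

            [pool drain];  // reclaims everything autoreleased during this iteration
        }
    }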
My String class provides an operator char* overload to allow you to pass the string to C functions. Unfortunately, a colleague of mine just inadvertently discovered a bug.