Let's bring up an old topic!
The memory issue is still very much a problem.
Let me explain a little about what we are developing:
An iPad app that displays an array of SPImages on stage in sequence to simulate a rotating 3D model. We pre-load the image textures in the background, then instantly add the images to the stage while keeping only one image (frame) visible at a time. The end user can drag around the stage to seemingly rotate the virtual 3D model (which simply toggles visibility across all the images so that only the current image/frame is visible).
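For illustration, here is a minimal sketch of that frame-toggling approach (names like _frames and _currentFrame are placeholders, not from our actual project):

- (void)showFrame:(NSUInteger)index
{
    // _frames is an NSArray of the pre-loaded SPImage objects
    for (NSUInteger i = 0; i < _frames.count; ++i)
    {
        SPImage *image = [_frames objectAtIndex:i];
        image.visible = (i == index);  // only the current frame stays visible
    }
    _currentFrame = index;
}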
The problem is that the client is running into low-memory conditions, which end up crashing the device. I have confirmed that it is NOT a memory leak; instead, memory increases when an image is rendered. The "extra" memory is NOT released after the image is made invisible or removed from the stage. It is only released after releasing the image itself.
I say "extra" memory because memory is first used when initializing the image/texture, and then MORE memory is used when the image is rendered on stage. I assume OpenGL ES or the graphics card caches texture data once it has been rendered.
What we would LIKE to do is toggle the image's visibility to NO (or remove the image from the stage) AND release the "extra" memory that is used while rendering the image/texture.
I confirmed that SPTextureCache is not the cause of this. I dug into SPGLTexture and found out that the following call will successfully release that "extra" memory:
glDeleteTextures(1, &_name);
I was thinking a good trick might be to manually call glDeleteTextures() and later re-generate the texture. I assume this would release the cached "extra" memory while keeping the SPGLTexture itself intact. Of course, this is really tough with the current setup of SPGLTexture, so I had little luck implementing it.
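To illustrate the idea with raw OpenGL ES calls (rough sketch only; SPGLTexture would need to expose its texture name and a re-upload path, and the pixel data would have to be kept around in CPU memory):

GLuint name;  // the GL texture object backing the SPGLTexture

// Release the driver-side copy once the frame is hidden:
glDeleteTextures(1, &name);

// Later, when the frame is needed again, re-create and re-upload it:
glGenTextures(1, &name);
glBindTexture(GL_TEXTURE_2D, name);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixelData);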
Now let's get to a sample project of this memory issue!
ImageMemoryTest: http://cl.ly/VhdV
This sample project shows statistics such as fps, draw count, free memory, and used memory.
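For reference, the "memory used" number can be read via Mach task info, along these lines:

#include <mach/mach.h>

static vm_size_t residentMemory(void)
{
    struct task_basic_info info;
    mach_msg_type_number_t count = TASK_BASIC_INFO_COUNT;
    kern_return_t kr = task_info(mach_task_self(), TASK_BASIC_INFO,
                                 (task_info_t)&info, &count);
    return (kr == KERN_SUCCESS) ? info.resident_size : 0;
}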
It also presents 4 buttons on the top:
You can also change the behavior of the app by toggling the following define:
#define TOGGLE_VISIBLE_INSTEAD_OF_ADD_REMOVE NO
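In other words, the define switches between these two code paths (simplified sketch; _image and show are placeholder names):

if (TOGGLE_VISIBLE_INSTEAD_OF_ADD_REMOVE)
{
    _image.visible = show;                // image stays on the display list
}
else
{
    if (show) [self addChild:_image];     // add to / remove from the stage
    else      [self removeChild:_image];
}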
When "TOGGLE_VISIBLE_INSTEAD_OF_ADD_REMOVE" is set to YES, it will change the displayed button texts to the following:
When initializing the image, it uses some memory. After adding the image to the stage and/or setting its visibility to YES, it uses MORE memory; this is why I assume memory is allocated on render. When removing the image from the stage and/or setting its visibility to NO, it does NOT release any memory. I would expect the memory used during rendering to be released, or at least an option to release it. The only way the "render-based" memory goes away is when you dealloc the entire texture/image.
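To spell out that lifecycle as code (illustrative snippet; the file name is a placeholder):

SPTexture *texture = [SPTexture textureWithContentsOfFile:@"frame.png"];
SPImage *image = [SPImage imageWithTexture:texture];  // some memory used here
[self addChild:image];       // MORE memory once the image renders
image.visible = NO;          // the "extra" memory is NOT released
[self removeChild:image];    // still NOT released
// Only releasing the image and texture entirely frees the render-based memory.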
Now you might still wonder if the on-screen display of memory is accurate. I did a quick test with Instruments, and here is the graph:
You can see that the allocations are only released at the very end, when I release the image.
Thank you in advance!