I am working on a project similar to the one described here:
http://cocoadev.com/forums/comments.php?DiscussionID=1220&page=1#Item_0
And while my problem is memory management (being a "crazy scientist"), it's not exactly the same kind. The program I'm working with (OsiriX) loads massive amounts of bitmap data into a single NSData, byte-copied from a malloc()'d float* buffer. It then builds an NSArray of FauxImageObjects that point to specific chunks of that one NSData. Upon rendering, it creates a rectangle and texture-maps the chunk of bitmap pointed to by the FauxImageObject.
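To make that layout concrete, here is a rough sketch of the pattern as I understand it (the class and method names are mine, not OsiriX's):

    // One big NSData holds every bitmap; each lightweight object just
    // records where its own pixels live inside that shared buffer.
    @interface FauxImageRef : NSObject {
        NSData     *backingStore;   // shared buffer of all float bitmaps
        NSUInteger  offset;         // byte offset of this image's pixels
        NSUInteger  length;         // byte length of this image's pixels
    }
    - (id)initWithStore:(NSData *)store offset:(NSUInteger)off length:(NSUInteger)len;
    - (const float *)pixels;        // pointer into the shared buffer, no copy
    @end

    @implementation FauxImageRef
    - (id)initWithStore:(NSData *)store offset:(NSUInteger)off length:(NSUInteger)len
    {
        if ((self = [super init])) {
            backingStore = [store retain];
            offset = off;
            length = len;
        }
        return self;
    }
    - (const float *)pixels
    {
        return (const float *)((const char *)[backingStore bytes] + offset);
    }
    - (void)dealloc
    {
        [backingStore release];
        [super dealloc];
    }
    @end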
I want to render and process this data across multiple computers (CUDA/clustering).
Instead of copying the NSData onto each machine, I want it to live on one machine and have every other machine point to it via an array of FauxImageObjects.
- Since an NSData appears to be easy to distribute (per the Distributed Objects documentation on Apple's site), is DO an appropriate mechanism for this? (I've sketched what I have in mind after this list.)
- The drawRect: method re-textures every frame. Should I instead grab the current image's texture and cache it locally, while keeping the entire NSData behind DO?
- When will Gigabit Ethernet become a bottleneck if every machine grabs a 1 MP bitmap from a single machine on each screen refresh? (My rough arithmetic is worked out after this list; it looks like only one image every two refreshes, even for a single client.)
- Is there any lossless compression capability built into NSData or DO?
- The FauxImageObjects themselves will also need to be distributed, since they hold per-image state such as a CLUT; if one machine changes it, all the others need to see that change as well.
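For the first question, this is roughly the DO setup I have in mind; I haven't tried it yet, and the names (ImageVendor, imageServer, the port number, the host name) are placeholders:

    // Shared protocol. The bycopy qualifier asks DO to serialize the chunk
    // across the wire instead of handing back a proxy that faults in bytes.
    @protocol ImageVendor
    - (unsigned)imageCount;
    - (bycopy NSData *)imageChunkAtIndex:(unsigned)index;
    @end

    // Server (the machine that owns the big NSData):
    NSSocketPort *receivePort = [[NSSocketPort alloc] initWithTCPPort:8765];
    NSConnection *serverConn  = [NSConnection connectionWithReceivePort:receivePort
                                                                sendPort:nil];
    [serverConn setRootObject:imageServer];   // some object implementing ImageVendor
    [[NSRunLoop currentRunLoop] run];

    // Client (a rendering/CUDA node):
    NSSocketPort *sendPort   = [[NSSocketPort alloc] initRemoteWithTCPPort:8765
                                                                      host:@"imagehost.local"];
    NSConnection *clientConn = [NSConnection connectionWithReceivePort:nil
                                                               sendPort:sendPort];
    id <ImageVendor> vendor  = (id <ImageVendor>)[clientConn rootProxy];
    [(NSDistantObject *)vendor setProtocolForProxy:@protocol(ImageVendor)];
    NSData *chunk = [vendor imageChunkAtIndex:0];   // pulls one bitmap over the wire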
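And here is my attempt at the bandwidth arithmetic from the third question, assuming 1 MP single-channel float images (4 bytes per pixel) and ignoring all protocol overhead:

    1 MP x 32 bits/pixel                  = 32 Mbit per image
    1000 Mbps / 32 Mbit per image         ≈ 31 images per second
    60 Hz refresh / ~31 images per second ≈ 1 new image every 2 refreshes

So even with a single client pulling data I only get a fresh image every other refresh, and with several machines grabbing at once that gets divided further.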
I'm not looking for a solution, but if anyone here thinks this is not feasible, please let me know!
Any tutorials, guides, books, or examples that might be of use to me?
Thank you in advance for your help.
-Stephen