Hi there
I have an assignment that involves altering images on a pixel-by-pixel basis (bitmaps). I decided to do it in Cocoa. My current implementation uses NSImage and NSBitmapImageRep objects, but I'm finding it a little tough to get at the actual pixel data. Should I be using a different framework entirely (like Quartz or Core Image)?
Anyway, my current problem is this: I get the pixel data, change it a bit (according to my algorithm's spec), then write the new data back to the pixel. After that I read the pixel data again to be sure things went well, and as far as I can tell the pixels are updated successfully. HOWEVER, almost immediately after, I make a copy of my altered image, get the pixel data once more, and this time all of the values have changed! Does the drawing process for NSImage automagically change pixels? Is Cocoa doing some sort of correction/calibration I'm not aware of? Here is my sample code for doing the above. Thanks for any suggestions!
Code:
/*
 * Generates an 8-bit version of the original image using a naive approach
 * of assigning 3 bits to red, 3 to green and 2 to blue.
 */
- (BOOL)generate8BitNaive {
    // Make a copy of the original image to work with:
    self.newImage = [[self originalImage] copy];
    NSBitmapImageRep *imgPix = (NSBitmapImageRep *)[self.newImage bestRepresentationForDevice:nil];
    if (!imgPix) {
        NSLog(@"could not make the image rep object");
        return NO;
    }
    // Iterate through the pixels in the image and transform them to 8-bit representations.
    // (RED_COMP, GREEN_COMP, BLUE_COMP, ALPHA and the *MASK constants are #defined elsewhere.)
    for (int i = 0; i < [imgPix pixelsWide]; i++) {
        for (int j = 0; j < [imgPix pixelsHigh]; j++) {
            NSUInteger components[4];
            [imgPix getPixel:components atX:i y:j];
            // Keep only the 3 most significant bits for R and G, and the 2
            // most significant bits for B.
            components[RED_COMP]   = components[RED_COMP]   & REDMASK;
            components[GREEN_COMP] = components[GREEN_COMP] & GREENMASK;
            components[BLUE_COMP]  = components[BLUE_COMP]  & BLUEMASK;
            components[ALPHA] = 255;
            // Write the modified pixel back into the bitmap rep.
            [imgPix setPixel:components atX:i y:j];
        }
    }
    return YES;
}
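To make the check I described concrete, it's roughly the following (the method name and the (0,0) coordinate here are just placeholders for illustration, not copied from my project):
Code:
// Sketch of the check: read one pixel from the altered image, then from a
// copy of it, and log both sets of component values for comparison.
- (void)comparePixelWithCopy {
    NSBitmapImageRep *alteredPix =
        (NSBitmapImageRep *)[self.newImage bestRepresentationForDevice:nil];
    NSImage *copiedImage = [self.newImage copy];
    NSBitmapImageRep *copiedPix =
        (NSBitmapImageRep *)[copiedImage bestRepresentationForDevice:nil];

    NSUInteger before[4], after[4];
    [alteredPix getPixel:before atX:0 y:0];   // pixel (0,0) is an arbitrary choice
    [copiedPix getPixel:after atX:0 y:0];

    NSLog(@"altered: %lu %lu %lu %lu   copy: %lu %lu %lu %lu",
          (unsigned long)before[0], (unsigned long)before[1],
          (unsigned long)before[2], (unsigned long)before[3],
          (unsigned long)after[0], (unsigned long)after[1],
          (unsigned long)after[2], (unsigned long)after[3]);
}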
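And on the question of getting at the pixel data: would going through -bitmapData and working on the raw buffer be a better route than getPixel:/setPixel:? A rough sketch of what I mean is below; it assumes 8 bits per sample and non-planar RGBA data, which I haven't actually verified for my images, and the method name is again just a placeholder:
Code:
// Placeholder helper: the same masking, but through the raw bitmap buffer.
// Assumes [rep bitsPerSample] == 8, ![rep isPlanar], and RGBA sample order.
- (void)maskPixelsInRep:(NSBitmapImageRep *)rep {
    unsigned char *data = [rep bitmapData];
    NSInteger bytesPerRow = [rep bytesPerRow];
    NSInteger samples = [rep samplesPerPixel];   // expecting 4 (RGBA)

    for (NSInteger y = 0; y < [rep pixelsHigh]; y++) {
        unsigned char *row = data + y * bytesPerRow;
        for (NSInteger x = 0; x < [rep pixelsWide]; x++) {
            unsigned char *px = row + x * samples;
            px[0] &= REDMASK;     // keep top 3 bits of red
            px[1] &= GREENMASK;   // keep top 3 bits of green
            px[2] &= BLUEMASK;    // keep top 2 bits of blue
            if (samples > 3) {
                px[3] = 255;      // force opaque alpha
            }
        }
    }
}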