
minofifa

macrumors newbie
Original poster
Oct 24, 2008
Hi there

I have an assignment that involves altering images on a pixel-by-pixel basis (bitmaps). I decided to do it in Cocoa. My current implementation uses NSImage and NSBitmapImageRep objects. I'm finding it a little tough to get at the actual pixel data. Should I be using a different framework entirely (like Quartz or Core Image)?

Anyway, my current problem is this: I get pixel data, change it a bit (according to my algorithm specs), then save the new data back to the pixel. After I do this, I get the pixel data again to be sure things went well, and as far as I can tell, the pixels are updated successfully. HOWEVER, almost immediately after, I make a copy of my altered image and again get the pixel data, but this time all of the values have been changed! Does the drawing process for NSImage automagically change pixels? Is Cocoa doing some sort of correction/calibration I'm not aware of? Here is my sample code for doing the above. Thanks for any suggestions!

Code:
/*
 * generates an 8-bit version of the original image using a naive approach
 * of assigning 3 bits to red, 3 to green and 2 to blue
 */
-(BOOL)generate8BitNaive{
	// Make a copy of the current image representation of the original image to work with:
	self.newImage = [[self originalImage] copy];
	
	NSBitmapImageRep *imgPix = (NSBitmapImageRep*)[self.newImage bestRepresentationForDevice:nil];
	if(!imgPix){
		NSLog(@"could not make the image rep object");	
		return NO;
	}
	// Iterate through the pixels in the image and transform them to 8-bit representations
	for(int i = 0; i<[imgPix pixelsWide];i++){
		for(int j = 0; j<[imgPix pixelsHigh]; j++){
			NSUInteger components[4];
			[imgPix getPixel:components atX:i y:j];
			// Update the components' values by only considering their 3 most significant
			// bits (for R and G) and 2 bits for B.  
			components[RED_COMP] = components[RED_COMP] & REDMASK;
			components[GREEN_COMP] = components[GREEN_COMP] & GREENMASK;
			components[BLUE_COMP] = components[BLUE_COMP] & BLUEMASK;
			components[ALPHA] = 255;
			// Update the pixel with the new pixel value.
			[imgPix setPixel:components atX:i y:j];
		}
	}
	return YES;
}
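The check that then shows the problem looks roughly like this (a sketch using the same property names as above, not my exact code). Right after copying, the values that come back are no longer the masked ones I just set:

Code:
// Rough sketch of the verification step: copy the altered image, grab its
// bitmap rep again, and read a pixel back.
NSImage *copiedImage = [self.newImage copy];
NSBitmapImageRep *copyPix =
	(NSBitmapImageRep *)[copiedImage bestRepresentationForDevice:nil];

NSUInteger check[4];
[copyPix getPixel:check atX:0 y:0];
NSLog(@"copy pixel (0,0): R=%lu G=%lu B=%lu A=%lu",
	(unsigned long)check[0], (unsigned long)check[1],
	(unsigned long)check[2], (unsigned long)check[3]);
[copiedImage release];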
 

HiRez

macrumors 603
Jan 6, 2004
Western US
The drawing process can definitely change the pixel values, because ColorSync will be applied and calibrated to your output device. But if you just copy the raw data, I don't think that should happen, so I'm not sure what's going on. Are the values getting changed back to the original values, or to some new values?
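If you want to take the drawing machinery out of the equation entirely, something along these lines (untested, and assuming the rep is a meshed 8-bit RGBA bitmap; the helper name is just made up) would duplicate only the raw bytes:

Code:
#import <Cocoa/Cocoa.h>
#include <string.h>

// Untested sketch: clone a non-planar 8-bit rep by copying its raw bytes
// directly, so nothing gets re-rendered (and ColorSync never touches it).
static NSBitmapImageRep *CopyRawBitmap(NSBitmapImageRep *src)
{
	NSBitmapImageRep *dst = [[NSBitmapImageRep alloc]
		initWithBitmapDataPlanes:NULL
		              pixelsWide:[src pixelsWide]
		              pixelsHigh:[src pixelsHigh]
		           bitsPerSample:[src bitsPerSample]
		         samplesPerPixel:[src samplesPerPixel]
		                hasAlpha:[src hasAlpha]
		                isPlanar:NO
		          colorSpaceName:[src colorSpaceName]
		             bytesPerRow:[src bytesPerRow]
		            bitsPerPixel:[src bitsPerPixel]];
	memcpy([dst bitmapData], [src bitmapData],
	       [src bytesPerRow] * [src pixelsHigh]);
	return dst;  // caller owns it (alloc/init, no autorelease)
}

If the values survive that kind of copy but not an NSImage copy, that points the finger at the re-rendering.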
 

Analog Kid

macrumors G3
Mar 4, 2003
I'd compare the original pixel values with the new image's pixel values, both before and after you run the algorithm. I'd also make sure you're actually using self.newImage, and not updating that but then using self's original image later down the line.

Another place to look is the NSBitmapFormat-- make sure it's not using floats...
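A quick logging check like this would tell you right away (just a sketch, the function name is made up; pass it the imgPix rep from your posted code):

Code:
#import <Cocoa/Cocoa.h>

// Logs a rep's layout: floating-point samples or a planar rep would explain
// getPixel: values that don't match what was written.
static void LogBitmapFormat(NSBitmapImageRep *rep)
{
	NSLog(@"bitsPerSample=%ld samplesPerPixel=%ld planar=%d format=%lu float=%d",
		(long)[rep bitsPerSample],
		(long)[rep samplesPerPixel],
		[rep isPlanar],
		(unsigned long)[rep bitmapFormat],
		([rep bitmapFormat] & NSFloatingPointSamplesBitmapFormat) != 0);
}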

I don't think ColorSync should be messing with your raw pixels-- only the drawn ones, or if you're using NSColor values. You're pulling pixel values, so you should be clear there.
 

minofifa

macrumors newbie
Original poster
Oct 24, 2008
Hey everyone, thanks for the great suggestions. After reading the article NSImage: Deceivingly Simple or Just Deceiving, the take-home I got is to avoid messing around with pixel data altogether when using NSImage. I'll reimplement using CGImage or Quartz and report back whether that fixes my problems.
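In case it helps anyone else, the rough shape of the Quartz version I have in mind is something like this (an untested sketch; the function names are just placeholders):

Code:
#import <Cocoa/Cocoa.h>
#include <stdlib.h>

// Untested sketch: draw the source image into a CGBitmapContext whose backing
// buffer I allocate myself, then poke the bytes directly.
static CGContextRef CreateRGBContextForImage(CGImageRef image,
                                             unsigned char **outBuffer)
{
	size_t width  = CGImageGetWidth(image);
	size_t height = CGImageGetHeight(image);
	size_t bytesPerRow = width * 4;        // RGBX, 8 bits per component
	unsigned char *buffer = calloc(height, bytesPerRow);

	CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
	CGContextRef ctx = CGBitmapContextCreate(buffer, width, height, 8,
	                                         bytesPerRow, colorSpace,
	                                         kCGImageAlphaNoneSkipLast);
	CGColorSpaceRelease(colorSpace);

	CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), image);
	*outBuffer = buffer;   // caller frees this after CGContextRelease(ctx)
	return ctx;
}

// Mask one pixel down to 3 bits of red, 3 of green and 2 of blue.
static void Quantize332AtPixel(unsigned char *buffer, size_t bytesPerRow,
                               size_t x, size_t y)
{
	unsigned char *p = buffer + y * bytesPerRow + x * 4;
	p[0] &= 0xE0;   // red:   keep top 3 bits
	p[1] &= 0xE0;   // green: keep top 3 bits
	p[2] &= 0xC0;   // blue:  keep top 2 bits
	                // p[3] is the unused padding byte
}

The CGImageRef itself can come from ImageIO (CGImageSourceCreateWithURL() / CGImageSourceCreateImageAtIndex()), so NSImage never has to get involved.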
 