
AussieSusan (Original poster)
I'm trying to port some open-source code that is structured as a common backend with various graphical front ends (GTK, Windows, Motif, PocketPC, etc.). My ultimate target is the iPhone, but since I'm trying to use Quartz 2D, my understanding is that this should also work on my MBP, and that's what I'm using to build my understanding.

The program has a 131 by 16 pixel display area that can provide a monochrome output.

The backend sends the pixel data in the form of an array of bytes with the appropriate bits set. I know that there are 131 pixels to each horizontal row (which takes 17 bytes) and there are 16 such rows and that I'm always presented with the entire array.
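
To make the layout concrete, this is the sort of helper I imagine using to test a pixel in the buffer (assuming MSB-first bit order within each byte, which, as I say below, I still need to confirm):

#include <stdint.h>

#define LCD_WIDTH   131
#define LCD_HEIGHT   16
#define ROW_BYTES    17   /* ceil(131 / 8) bytes per row */

/* Returns non-zero if pixel (x, y) is set in the packed buffer.
   Bit order within each byte is assumed MSB-first. */
static int pixel_is_set(const uint8_t *buf, int x, int y)
{
    return (buf[y * ROW_BYTES + x / 8] >> (7 - (x % 8))) & 1;
}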

I've been reading the Quartz 2D manuals I can find on the Apple Developer site, and I can see how to create a bitmap context with the necessary pixel dimensions. However, I'm not sure how to specify the bits within the bitmap.

I read that I can have a 1-bit bitmap, which I interpret to be a monochrome arrangement similar to the one I've been provided (though I need to confirm the order of the bits within each byte). However, all of the examples seem to read data from a bitmap file, and I can't see how to set individual bits.

Therefore, using Quartz 2D:
1) am I right that I can create a 1-bit-per-pixel bitmap?
2) how can I set an individual pixel value (either a 1-bit or a 32-bit ARGB value, if necessary)?

Pointers and examples would be appreciated.

Thanks

Susan
(obviously a graphical novice!)
 

gnasher729
Quartz has very little support for anything with fewer than 16 bits per pixel. It may be annoying, but your best bet is to convert your 1-bit bitmap into a 32-bit RGBX bitmap by hand. It's still only about 8 KByte. All graphics nowadays is optimised for 32 bits per pixel.
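
Off the top of my head, the conversion is just a pair of loops, something like this (untested, and assuming MSB-first bit order in the source):

#include <stdint.h>

enum { W = 131, H = 16, SRC_ROW_BYTES = 17 };

/* Expand the packed 1-bit buffer into one 32-bit value per pixel.
   Set bits become black, clear bits white. All-0x00 / all-0xFF
   values come out the same regardless of byte order, and the
   padding byte is ignored when you tell Quartz to skip it. */
static void expand_to_32bit(const uint8_t *src, uint32_t *dst)
{
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++) {
            int set = (src[y * SRC_ROW_BYTES + x / 8] >> (7 - (x % 8))) & 1;
            dst[y * W + x] = set ? 0x00000000u : 0xFFFFFFFFu;
        }
}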
 

AussieSusan (Original poster)
Thanks gnasher, I appreciate the information. The only reason I was thinking about a 1-bit/pixel map was that it would save me a bit of code converting it.

As for the other question, how do I go about setting the values once I've created the bitmap context?

Thanks

Susan
 

kainjow (Moderator emeritus)
I don't think you set the pixels of a graphics context directly. I think you'd create the graphics context from a set of pixels; I believe that's what CGDataProvider is for. Alternatively, you could read the pixels into colors and draw them as 1x1 rects into the graphics context.
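
Something like this, maybe (off the top of my head, untested; it assumes a 32-bit buffer with the bytes in X,R,G,B order per pixel, and note that CGDataProviderCreateWithData doesn't copy the data, so the buffer has to outlive the image):

#include <ApplicationServices/ApplicationServices.h>
#include <stdint.h>

/* Wrap an existing 32-bit pixel buffer in a CGImage without copying. */
CGImageRef ImageFromPixels(const uint32_t *pixels, size_t w, size_t h)
{
    CGDataProviderRef provider =
        CGDataProviderCreateWithData(NULL, pixels, w * h * 4, NULL);
    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    CGImageRef image = CGImageCreate(w, h,
                                     8,       /* bits per component */
                                     32,      /* bits per pixel */
                                     w * 4,   /* bytes per row */
                                     space,
                                     kCGImageAlphaNoneSkipFirst,
                                     provider, NULL, false,
                                     kCGRenderingIntentDefault);
    CGColorSpaceRelease(space);
    CGDataProviderRelease(provider);
    return image;
}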
 

AussieSusan (Original poster)
Thanks kainjow.

Here is what I ended up doing and it seems to work:
- created a buffer with one 32-bit integer per pixel (in my case, 131 * 16 32-bit values)
- passed the address of this buffer to the CGBitmapContextCreate function
- later on, set the various elements of the buffer to the ARGB pixel values I wanted
- in the drawRect: method, used the CGBitmapContextCreateImage function to create an image, which I then drew into the display

I have no idea if this is correct or not but it seems to work for me...
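
In case it helps anyone else, the skeleton looks roughly like this (simplified, and assuming the kCGImageAlphaNoneSkipFirst pixel layout):

#include <ApplicationServices/ApplicationServices.h>
#include <stdint.h>

enum { W = 131, H = 16 };
static uint32_t pixels[W * H];   /* one 32-bit value per pixel */
static CGContextRef bitmapCtx;

/* One-time setup: wrap the pixel buffer in a bitmap context. */
static void SetupBitmap(void)
{
    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    bitmapCtx = CGBitmapContextCreate(pixels, W, H,
                                      8,      /* bits per component */
                                      W * 4,  /* bytes per row */
                                      space, kCGImageAlphaNoneSkipFirst);
    CGColorSpaceRelease(space);
}

/* Set one pixel. For black/white, 0x00000000 and 0xFFFFFFFF are
   safe regardless of byte order; for real colours you'd need to
   match the context's byte order. */
static void SetPixel(int x, int y, uint32_t value)
{
    pixels[y * W + x] = value;
}

/* Called from drawRect:: snapshot the buffer and draw it.
   Row 0 of the buffer comes out as the top row of the image. */
static void DrawDisplay(CGContextRef dest, CGRect rect)
{
    CGImageRef image = CGBitmapContextCreateImage(bitmapCtx);
    CGContextDrawImage(dest, rect, image);
    CGImageRelease(image);
}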

Susan
 