How to get a pixel's RGB value.
-
Could someone tell me how to get the RGB value of a pixel from a frame captured from a USB camera?
-
You need to either use GetPixel (which is slow), make your image a DIBSection, or use GDI+ to access the bits as an array.
Christian
I have several lifelong friends that are New Yorkers but I have always gravitated toward the weirdos. - Richard Stringer
-
Thank you for your reply. Could you give me some more detail? I have no idea where to start. :(
-
In what format is the frame stored? Assuming you used ISampleGrabber, you will have a frame stored in whatever format the video pin directly upstream of your sample grabber is configured to, e.g. a packed 24-bit YUV format. (Note that planar FourCCs such as IYUV keep the Y, U and V planes separate; the indexing below assumes a packed 3-bytes-per-pixel layout.)

Such a frame is essentially an array of YUV values, so first you will want to access the three YUV bytes at the pixel location. Use a BYTE pointer computed from the frame buffer start, the buffer width, the pixel's X and Y position, and the pixel size in bytes. For example, assuming 640 x 480 at 24 bits (3 bytes) per pixel, to find pixel (X, Y):

BYTE *pPixel = pBufferStart + ( Y * 640 * 3 ) + ( X * 3 );

Now:

BYTE Y = *pPixel;
BYTE U = *(pPixel+1);
BYTE V = *(pPixel+2);

This will be different if the stored frame is in another format.

Assuming you correctly get the pixel's YUV values, now convert to RGB. Off the top of my head I think the formula is:

R = 1.164*Y + 1.596*V;
G = 1.164*Y - 0.813*V - 0.392*U;
B = 1.164*Y + 2.017*U;

But I'm not sure; you can look it up. In any case, if you're doing a lot of pixel testing you should build look-up tables from those formulas at initialization time.

Steve T
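The conversion above can be sketched as a small function. This follows the common BT.601 "TV-range" form of the formula, which additionally subtracts the usual offsets (Y-16, U-128, V-128) and clamps the results to 0..255; the function and helper names are mine.

```cpp
#include <algorithm>
#include <cstdint>

// Clamp an intermediate result into the valid 0..255 byte range.
static uint8_t Clamp255(int v) { return (uint8_t)std::min(255, std::max(0, v)); }

// BT.601 TV-range YUV -> RGB: a sketch of the formula from the post,
// with the customary offsets and clamping added.
void YuvToRgb(uint8_t y, uint8_t u, uint8_t v,
              uint8_t& r, uint8_t& g, uint8_t& b)
{
    const int c = y - 16, d = u - 128, e = v - 128;
    r = Clamp255((int)(1.164 * c + 1.596 * e));
    g = Clamp255((int)(1.164 * c - 0.813 * e - 0.391 * d));
    b = Clamp255((int)(1.164 * c + 2.018 * d));
}
```

For the look-up-table optimization mentioned above, you would precompute three 256-entry arrays (for 1.164*c, 1.596*e, 2.018*d, etc.) at initialization time and replace the per-pixel multiplications with array lookups.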
-
Great!! I am also looking for something like this. Thanks.
Alok Gupta
visit me at http://www.thisisalok.tk
"I Think Believe this Will Help"
-
Thanks for your reply! Another question: how can I store a frame from the USB camera to a buffer, and how can I compare two frames to tell whether they are different? Thanks a lot!
-
I use ISampleGrabber in a DirectShow filter graph to get video frames (there are other ways). Here is the DirectShow documentation: ISampleGrabber. There is some sample code if you click through "Using the Sample Grabber".

As to comparing two frames: what differences do you want to detect, and how sensitive to change do you want your detection to be? In a live video stream even visually indistinguishable frames will have at least some minor differences due to tiny camera vibrations, subtle changes in light level, sensor "noise", etc.

One simple test you can do on a per-pixel basis is to compare the Y values of corresponding pixels in two frames (in YUV format). This essentially monitors for a change in brightness, like comparing two gray-scale pixels.*

If you are more interested in looking for changes in a particular area of your frame, you can accumulate Y values over that area of each frame (add the Y values for each pixel in the area, average them, or accumulate each pixel's difference, whatever is appropriate for your application), then compare the accumulation against a threshold to decide whether the selected area of your video frame has changed.

*If, instead of converting a YUV image to RGB with the formula I gave, you simply use the Y value for R, G and B, you will get a gray-scale image.

Steve T