Code Project · C / C++ / MFC

How to get a pixel's RGB value.

tutorial · 7 posts, 4 posters

Francis Chau · #1

Could some of you tell me how to get the RGB value of a pixel from a frame captured from a USB camera?

Christian Graus · #2

You need to either use GetPixel (which is slow), make your image a DIBSection, or use GDI+ to access the bits as an array.

Francis Chau · #3

Thank you for your reply. Could you give me some more detail on how to do that? I have no idea where to start. :(

FlyingTinman · #4

In what format is the frame stored? Assuming you used ISampleGrabber, you will have a frame stored in whatever format the video pin directly upstream of your sample grabber is configured to, e.g. a packed 24-bit YUV format. (The IYUV FourCC is actually a planar 4:2:0 format, 12 bits per pixel, so the packed layout below would not apply to it directly.)

A packed frame is essentially an array of YUV values, so first you want to access the YUV values (3 bytes) at the pixel location. Use a BYTE pointer computed from your frame buffer start, the buffer width, the pixel's X and Y position, and the pixel size in bytes. E.g., assuming 640 x 480 packed 24-bit YUV (3 bytes per pixel), to find pixel (X, Y):

    BYTE *pPixel = pBufferStart + (Y * 640 * 3) + (X * 3);

Now:

    BYTE luma = *pPixel;         // Y
    BYTE u    = *(pPixel + 1);
    BYTE v    = *(pPixel + 2);

This will be different if the stored frame is in another format.

Once you correctly get the pixel's YUV values, convert to RGB. For video-range (BT.601) YUV the formula is:

    R = 1.164 * (Y - 16) + 1.596 * (V - 128);
    G = 1.164 * (Y - 16) - 0.813 * (V - 128) - 0.392 * (U - 128);
    B = 1.164 * (Y - 16) + 2.017 * (U - 128);

with each result clamped to 0..255. In any case, if you're doing a lot of pixel testing, you should build look-up tables from those formulas at initialization time.

Steve T

ThatsAlok · #5

Great! I'm looking for something like this too. Thanks!

Francis Chau · #6

Thanks for your reply! Another question: how can I store a frame from the USB camera to a buffer, and how can I compare two frames to tell whether they are different? Thanks a lot!

FlyingTinman · #7

> How can I store a frame from the USB camera to a buffer, and how can I compare two frames to tell whether they are different?

I use ISampleGrabber in a DirectShow filter graph to get video frames (there are other ways). See the DirectShow documentation for ISampleGrabber; there is some sample code if you click through "Using the Sample Grabber".

As to comparing two frames: what differences do you want to detect, and how sensitive to change do you want your detection to be? In a live video stream, even visually indistinguishable frames will have at least some minor differences due to tiny camera vibrations, subtle changes in light level, sensor noise, etc.

One simple test you can do on a per-pixel basis is to compare the Y values of corresponding pixels in the two frames (in YUV format). This essentially monitors for a change in brightness, like comparing two gray-scale pixels.* If you are more interested in changes in a particular area of your frame, you can accumulate Y values over that area in each frame (add the Y values of each pixel in the area, average them, or accumulate each pixel's difference, whichever is appropriate for your application), then compare the accumulation against a threshold to decide whether the selected area of the video frame has changed.

*If, instead of converting a YUV image to RGB with the formula I gave, you simply use the Y value for R, G, and B, you get a gray-scale image.

Steve T