Thanks for your reply!
Another question: how can I store a frame from the USB camera into a buffer, and how do I compare two frames to see if they are different?
I use ISampleGrabber in a DirectShow filter graph to get video frames (there are other ways). Here is the DirectShow documentation: ISampleGrabber. There is some sample code if you click through "Using the Sample Grabber".

As to comparing two frames: what differences do you want to detect, and how sensitive to change do you want your detection to be? In a live video stream, even visually indistinguishable frames will have at least some minor differences due to tiny camera vibrations, subtle changes in light level, sensor "noise", etc.

One simple test you can do on a per-pixel basis is to compare the Y values of corresponding pixels in two frames (in YUV format). This essentially monitors for a change in brightness, like comparing two gray-scale pixels.*

If you are more interested in looking for changes in a particular area of your frame, you can accumulate Y values in that area of each frame (add the Y values for each pixel in the area, average them, or accumulate each pixel's difference--whatever is appropriate for your application), then compare the accumulation against a threshold to decide whether the selected area of your video frame has changed.

*If, instead of converting a YUV image to RGB using the formula I gave, you simply use the Y value for R, G & B, you will get a gray-scale image.

Steve T
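To make the per-pixel and per-region Y comparisons above concrete, here is a rough C++ sketch. It assumes you have already copied the Y (luma) bytes of two frames into plain buffers, one byte per pixel (for example, the Y plane of a planar format, or Y values extracted from the Sample Grabber's buffer). The function names and the threshold value are my own inventions for illustration, not part of any DirectShow API, and the threshold would need tuning against your camera's noise floor:

```cpp
#include <cstdint>
#include <cstdlib>

// Mean absolute difference of Y (luma) values between two frames.
// Assumes both buffers hold one Y byte per pixel and are the same size.
double MeanLumaDiff(const uint8_t* a, const uint8_t* b, size_t count)
{
    uint64_t total = 0;
    for (size_t i = 0; i < count; ++i)
        total += static_cast<uint64_t>(std::abs(int(a[i]) - int(b[i])));
    return count ? double(total) / double(count) : 0.0;
}

// Same idea restricted to a rectangular region of interest, for the
// "changes in a particular area" case. stride = bytes per row.
double RegionLumaDiff(const uint8_t* a, const uint8_t* b,
                      int stride, int x, int y, int w, int h)
{
    uint64_t total = 0;
    for (int row = y; row < y + h; ++row)
        for (int col = x; col < x + w; ++col) {
            int idx = row * stride + col;
            total += std::abs(int(a[idx]) - int(b[idx]));
        }
    return double(total) / double(w * h);
}

// "Frames changed" test: compare the average difference to a threshold.
// The default of 8.0 is a guess; tune it for your camera and lighting.
bool FramesDiffer(const uint8_t* a, const uint8_t* b, size_t count,
                  double threshold = 8.0)
{
    return MeanLumaDiff(a, b, count) > threshold;
}
```

With per-frame noise from vibration and sensor grain, a threshold on the mean difference (rather than testing pixels for exact equality) is what keeps this from firing on every frame.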