C# custom event handlers/controls
-
With touch screens, doesn't the touching equate to mouse stuff? i.e. if you tap, it's like doing a click.
It's for a multitouch screen, so pressing two fingers on an image does do quite different things than pressing two fingers on something else. So emulating the mouse is not really a good solution.
rather have something you don't need, than need something you don't have
-
Hi, I have an unmanaged C++ app that sends events to a managed wrapper, and the wrapper then sends them to my C# app. That works fine, but now I want to create custom controls that receive these events, and the problem is I have no idea how; I can create basic custom controls and so on. It's for a touchscreen, so what I want to do is very similar to the Click/MouseDown events, except it's more of a TouchDown instead of a MouseDown. Eventually I want to be able to do Image.TouchDown += (add event handler), and once the image is touched on screen, the event is raised, just like a mouse event. Can anyone please tell me where to start, what article to read, or just give me a hint in the right direction? Thanks, Donovan
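Something like this is roughly what I mean - just a sketch, where the TouchDown event and TouchEventArgs type are made-up names for illustration, not an existing WPF API:

using System;
using System.Windows.Controls;

// Illustrative event args carrying the touch position (made-up type, not a WPF one).
public class TouchEventArgs : EventArgs
{
    public double X { get; private set; }
    public double Y { get; private set; }
    public TouchEventArgs(double x, double y) { X = x; Y = y; }
}

// An Image subclass exposing a TouchDown event, analogous to MouseDown.
public class TouchImage : Image
{
    public event EventHandler<TouchEventArgs> TouchDown;

    // Whatever receives the data from the managed wrapper would call this.
    public void RaiseTouchDown(double x, double y)
    {
        EventHandler<TouchEventArgs> handler = TouchDown;
        if (handler != null)
            handler(this, new TouchEventArgs(x, y));
    }
}

// Usage, just like a mouse event:
//   touchImage.TouchDown += (s, e) => Console.WriteLine("touched at " + e.X + "," + e.Y);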
those that say it cannot be done shouldn't interrupt those doing it
To support multiple touches, you'd probably need an SDK for that tablet. There isn't any built-in support in Windows to handle this.
A guide to posting questions on CodeProject
Dave Kreskowiak
Microsoft MVP Visual Developer - Visual Basic 2006, 2007
-
It's for a multitouch screen, so pressing two fingers on an image does do quite different things than pressing two fingers on something else. So emulating the mouse is not really a good solution.
rather have something you don't need, than need something you don't have
No, I mean don't the touch screen drivers interpret the touching as mouse stuff? i.e. when you tap, the monitor driver does a mouse click at that location. Edit: Oh wait, I reread what you were on about and I think I got confused. The C++ app handles the touch side of things? It receives the messages from the touch screen and then tells the C# app. What you want is then to make controls where the received messages fire events. i.e. Touch on the screen >> C++ app >> C# app >> fire event on control
-
It's for a multitouch screen, so pressing two fingers on an image does do quite different things than pressing two fingers on something else. So emulating the mouse is not really a good solution.
rather have something you don't need, than need something you don't have
donsolms wrote:
so pressing two fingers on an image does do quite different things than pressing two fingers on something else
:omg: What the heck does that mean? "pressing two fingers"? What difference does it make how many fingers you use to press with? And why would you use more than one? Doesn't that make it more difficult to hit your target? And what does "does do quite different things than pressing two fingers on something else" have to do with "how" you capture the "press" event? :confused: Are you confused... because I sure am.
-
donsolms wrote:
so pressing two fingers on an image does do quite different things than pressing two fingers on something else
:omg: What the heck does that mean? "pressing two fingers"? What difference does it make how many fingers you use to press with? And why would you use more than one? Doesn't that make it more difficult to hit your target? And what does "does do quite different things than pressing two fingers on something else" have to do with "how" you capture the "press" event? :confused: Are you confused... because I sure am.
OK, let me explain. It's a multitouch screen I built myself; it gets a video feed from a camera of where you press your finger. The unmanaged C++ app does the tracking of the fingers, and I built a managed wrapper to get the 'events' raised by the unmanaged part. That is working. Now I need to add those touch events to a UIElement in WPF. Doing it with one object is no problem, but doing it with, say, an Image and a Button is a whole different story, because once I add one UIElement implemented to receive touches, the app runs but without allowing the other one to receive touches. So I want to build a framework that allows more than one element to be touched. The basics should be the same as the normal MouseDown event. As for "doing different things" and "why would you use more than one finger": google/youtube for 'microsoft surface' or 'jeff han ftir' and you'll see why more than one finger, and why it does different things when you touch different objects. @led mike - you're right: "And what does "does do quite different things than pressing two fingers on something else" have to do with "how" you capture the "press" event?" has nothing to do with it; it was just to better explain.
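For a single element, the hookup looks roughly like this - a sketch only, where FingerEventArgs is a stand-in for whatever my managed wrapper actually raises, and the helper name is made up:

using System;
using System.Windows;

// Stand-in for the data the managed wrapper reports for each finger
// (X/Y assumed to be in window coordinates).
public class FingerEventArgs : EventArgs
{
    public double X { get; set; }
    public double Y { get; set; }
}

public static class TouchHitTesting
{
    // Returns true if the touch point falls inside the element's rendered bounds.
    public static bool HitsElement(Window window, UIElement element, FingerEventArgs e)
    {
        // Map the point from window coordinates into the element's own space.
        Point local = window.TranslatePoint(new Point(e.X, e.Y), element);
        return new Rect(element.RenderSize).Contains(local);
    }
}

That works for one element; the problem is doing it cleanly for many elements at once, which is why I want a framework.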
rather have something you don't need, than need something you don't have
-
No, I mean don't the touch screen drivers interpret the touching as mouse stuff? i.e. when you tap, the monitor driver does a mouse click at that location. Edit: Oh wait, I reread what you were on about and I think I got confused. The C++ app handles the touch side of things? It receives the messages from the touch screen and then tells the C# app. What you want is then to make controls where the received messages fire events. i.e. Touch on the screen >> C++ app >> C# app >> fire event on control
Touch on the screen >> C++ app >> C# app >> fire event on control
Exactly.
rather have something you don't need, than need something you don't have
-
OK, let me explain. It's a multitouch screen I built myself; it gets a video feed from a camera of where you press your finger. The unmanaged C++ app does the tracking of the fingers, and I built a managed wrapper to get the 'events' raised by the unmanaged part. That is working. Now I need to add those touch events to a UIElement in WPF. Doing it with one object is no problem, but doing it with, say, an Image and a Button is a whole different story, because once I add one UIElement implemented to receive touches, the app runs but without allowing the other one to receive touches. So I want to build a framework that allows more than one element to be touched. The basics should be the same as the normal MouseDown event. As for "doing different things" and "why would you use more than one finger": google/youtube for 'microsoft surface' or 'jeff han ftir' and you'll see why more than one finger, and why it does different things when you touch different objects. @led mike - you're right: "And what does "does do quite different things than pressing two fingers on something else" have to do with "how" you capture the "press" event?" has nothing to do with it; it was just to better explain.
rather have something you don't need, than need something you don't have
-
Touch on the screen >> C++ app >> C# app >> fire event on control
Exactly.
rather have something you don't need, than need something you don't have
Ahhhh, got ya :) Well, you'll need to inherit each control in the framework and extend it with the new events you want. It'd probably be best to make an interface like ITouchableControl that defines the set of events you want and then have each new control implement it. The final bit is telling the control when it needs to fire the new events. You need some way of testing if the event from the touch screen is applicable to a certain control. You can do this using each control's location and seeing if it matches up with the event; if so, then the control can fire its own event to notify anyone who's listening. As for the exact implementation of the final bit, there are a few different routes you could go. You could have a central class that each control registers itself with; that singleton class would then handle all of the figuring out which controls are applicable and then inform each control as to what it needs to do. The other option is to have each control listen to the events and figure out for itself if it needs to fire its relevant events.
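As a rough sketch of the singleton route (the ITouchableControl, TouchManager and TouchEventArgs names here are just illustrative, not an existing API, and the coordinate handling is assumed):

using System;
using System.Collections.Generic;
using System.Windows;

// Event args for a touch point, assumed to be in window coordinates (illustrative).
public class TouchEventArgs : EventArgs
{
    public Point Position { get; private set; }
    public TouchEventArgs(Point position) { Position = position; }
}

// Contract each touch-aware control implements (illustrative).
public interface ITouchableControl
{
    // Bounds of the control in the same coordinate space as the touch events.
    Rect ScreenBounds { get; }

    // Called by the manager when a touch lands inside ScreenBounds.
    void RaiseTouchDown(TouchEventArgs e);
}

// Central singleton that controls register with; it routes incoming
// touch events from the managed wrapper to the applicable controls.
public sealed class TouchManager
{
    public static readonly TouchManager Instance = new TouchManager();
    private readonly List<ITouchableControl> controls = new List<ITouchableControl>();

    private TouchManager() { }

    public void Register(ITouchableControl control)   { controls.Add(control); }
    public void Unregister(ITouchableControl control) { controls.Remove(control); }

    // Call this from the handler that receives events from the managed wrapper.
    public void OnTouchDown(Point position)
    {
        foreach (ITouchableControl control in controls)
        {
            if (control.ScreenBounds.Contains(position))
                control.RaiseTouchDown(new TouchEventArgs(position));
        }
    }
}

A concrete control would then implement ITouchableControl, register itself with TouchManager.Instance when it loads, and expose its own TouchDown event that RaiseTouchDown fires.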
-
Ahhhh, got ya :) Well, you'll need to inherit each control in the framework and extend it with the new events you want. It'd probably be best to make an interface like ITouchableControl that defines the set of events you want and then have each new control implement it. The final bit is telling the control when it needs to fire the new events. You need some way of testing if the event from the touch screen is applicable to a certain control. You can do this using each control's location and seeing if it matches up with the event; if so, then the control can fire its own event to notify anyone who's listening. As for the exact implementation of the final bit, there are a few different routes you could go. You could have a central class that each control registers itself with; that singleton class would then handle all of the figuring out which controls are applicable and then inform each control as to what it needs to do. The other option is to have each control listen to the events and figure out for itself if it needs to fire its relevant events.
Thanks, that is what I was looking for! The singleton will solve a lot of problems. Thanks again, Donovan
rather have something you don't need, than need something you don't have
-
Thanks, that is what I was looking for! The singleton will solve a lot of problems. Thanks again, Donovan
rather have something you don't need, than need something you don't have