Find a Button with HitTest
-
When I hit test for a location where a button is, I get ButtonChrome or TextArea. I know why that happens, but how do I find the Button through code? There are e.Source and e.OriginalSource on the event args, but for my specific implementation neither of them will ever return the Button. Is there another way of finding the parent of the ButtonChrome/TextArea from the HitTestResult? The reason the event args will never have that value is that I'm working with custom input (a multitouch screen, to be exact), so it doesn't come through the framework. Unless you can propose a better way to start the input. At the moment it starts at the Canvas level, no higher. Thanks in advance. donovan
rather have something you don't need, than need something you don't have
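A common way to handle this (not from the thread itself, just a minimal sketch) is to walk up the WPF visual tree from the HitTestResult's VisualHit until an ancestor of the wanted type is found. The helper name `FindAncestor` and the `canvas`/`touchPoint` variables in the usage comment are illustrative assumptions:

```csharp
using System.Windows;
using System.Windows.Media;

static class HitTestHelper
{
    // Generic ancestor search: returns the first parent of type T
    // (e.g. Button) above the given visual, or null if none exists.
    public static T FindAncestor<T>(DependencyObject current) where T : DependencyObject
    {
        while (current != null)
        {
            T match = current as T;
            if (match != null)
                return match;
            current = VisualTreeHelper.GetParent(current);
        }
        return null;
    }
}

// Usage sketch, assuming a Canvas named 'canvas' and a Point 'touchPoint':
//
//   HitTestResult result = VisualTreeHelper.HitTest(canvas, touchPoint);
//   Button button = HitTestHelper.FindAncestor<Button>(result.VisualHit);
//   if (button != null) { /* the touch landed on (part of) a Button */ }
```

Because the search starts at whatever visual the hit test returned (ButtonChrome, TextBlock, or anything else a restyled template produces) and walks upward, it is independent of the UI styling.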
-
Hi Donovan, can't you turn the touchscreen event into a regular mouse event (or sequence of mouse events)? I haven't any experience with touchscreens, but I would expect you want to use it like a mouse, so why not have it do a SendMessage(WM_...) to the active form? That way all the windowing logic would work for you. What does the touchscreen vendor provide? How does he justify a different API? :)
Luc Pattyn
try { [Search CP Articles] [Search CP Forums] [Forum Guidelines] [My Articles] } catch { [Google] }
-
There is no vendor; everything is custom created. The screen works like the MS Surface computer. My code works OK at the moment: I can touch and manipulate images because the hit test returns a System.Windows.Controls.Image. But a button returns ButtonChrome, so there is no way to solidly check the control type to be able to raise the correct event. I can't check for ButtonChrome because it can change depending on the UI styling. This is all WPF.
rather have something you don't need, than need something you don't have
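Once the Button has been located (for example by walking up the visual tree from the hit-test result), its Click can be raised programmatically through its UI Automation peer instead of faking framework input. A minimal sketch, assuming the Button reference is already in hand:

```csharp
using System.Windows.Automation.Peers;
using System.Windows.Automation.Provider;
using System.Windows.Controls;

static class ButtonInvoker
{
    public static void RaiseClick(Button button)
    {
        // The automation peer exposes the control's Invoke pattern,
        // which raises the Click event just as a real press would.
        var peer = new ButtonAutomationPeer(button);
        var invoker = (IInvokeProvider)peer.GetPattern(PatternInterface.Invoke);
        invoker.Invoke();
    }
}
```

This sidesteps the ButtonChrome problem entirely: the styling-dependent chrome is only used to find the Button, never to raise the event.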
-
OK, so no vendor. I am not familiar with WPF, ButtonChrome and such, but my basic question remains: I assume your touchscreen generates some events when touched/dragged/etc. Can't you just turn these into the regular Windows messages and inject them with SendMessage or something similar (or are these all gone in WPF?)? In Win32 and everything before WPF (and maybe, not sure, also in WPF) events get sent to a window, and if necessary that window will forward the events to its controls, so you typically never have to "dispatch" them yourself. Turning your touchscreen events into mouse events should give you the same thing. I suggest you have a look at the old mouse_event function and the more recent SendInput function. That is the way I would investigate. If it makes things easier, you could do first experiments on a regular PC: just turn some keyboard actions into mouse actions with one of the functions I mentioned, then substitute the touchscreen for those keyboard events. Hope this helps.
Luc Pattyn
try { [Search CP Articles] [Search CP Forums] [Forum Guidelines] [My Articles] } catch { [Google] }
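The SendInput approach above can be sketched in C# with P/Invoke. This is an assumption-laden illustration (the class and method names are made up; it presumes the touch hardware delivers screen coordinates), not code from the thread:

```csharp
using System;
using System.Runtime.InteropServices;

static class TouchToMouse
{
    [StructLayout(LayoutKind.Sequential)]
    struct MOUSEINPUT
    {
        public int dx, dy;
        public uint mouseData, dwFlags, time;
        public IntPtr dwExtraInfo;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct INPUT
    {
        public uint type;   // 0 = INPUT_MOUSE
        public MOUSEINPUT mi;
    }

    const uint MOUSEEVENTF_MOVE     = 0x0001;
    const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
    const uint MOUSEEVENTF_LEFTUP   = 0x0004;
    const uint MOUSEEVENTF_ABSOLUTE = 0x8000;

    [DllImport("user32.dll", SetLastError = true)]
    static extern uint SendInput(uint nInputs, INPUT[] pInputs, int cbSize);

    // x and y are absolute desktop coordinates normalized to 0..65535,
    // as SendInput expects when MOUSEEVENTF_ABSOLUTE is set.
    public static void Click(int x, int y)
    {
        var inputs = new[]
        {
            Make(x, y, MOUSEEVENTF_MOVE | MOUSEEVENTF_ABSOLUTE),
            Make(x, y, MOUSEEVENTF_LEFTDOWN),
            Make(x, y, MOUSEEVENTF_LEFTUP),
        };
        SendInput((uint)inputs.Length, inputs, Marshal.SizeOf(typeof(INPUT)));
    }

    static INPUT Make(int x, int y, uint flags)
    {
        return new INPUT
        {
            type = 0, // INPUT_MOUSE
            mi = new MOUSEINPUT { dx = x, dy = y, dwFlags = flags }
        };
    }
}
```

Injected input goes through the normal Windows pipeline, so WPF's own hit testing and event routing deliver the click to the Button with no tree walking required; the trade-off is that it moves the real system cursor, which may not suit a multitouch scenario with several simultaneous contact points.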