Mouse and touch events on Windows with touch display

I have a problem with mouse events on Windows with a touch display.
My class “ofxImgButton” has mousePressed and mouseReleased events registered. The button changes, for example, its size when it’s clicked.
Everything works fine on a machine with a mouse, but it doesn’t work properly on a touch display. The button update isn’t visible.
It’s visible when there’s a “double tap”, but then it triggers something twice.

The problem is that unfortunately I don’t have access to a touch display at the moment. All I have is a machine with a mouse.
What should I do?
1. Is there any solution to detect the type of input, mouse or touch?
2. Should I use any of the touch events?
3. How should I register events in my class? Is there a way to do it more generically, e.g. registering events depending on the type of input?

From my experience, you do not need to use touch-specific events. Those touch-enabled Windows machines will send touch input as mouse events in OF.

Also, you should specify which release of OF you are using.

Hi, use this addon: https://github.com/trentbrooks/ofxWinTouchHook
However, keep in mind that touch and mouse events are different. Touch events can be detected as mouse events even without this addon, but that will not always work correctly.
You should register your button to listen to touch events, and only fall back to mouse events if that is not possible.

Hi, I don’t get it. You say that I don’t need to use touch-specific events because those machines send touch input as mouse events?
I am using OF version 0.9.3 and Visual Studio 2015.

Thanks for your help.
So I should use the addon and register touch events in my class.

1. The question is: how can I do it more generically? I mean, I would like to have mouse or touch events registered according to some parameter. I tried this addon on my machine and nothing happens on these events.

Are there any differences between those events and the ones from the ofCoreEvents class?

// enable the Windows Touch Hook
ofxWinTouchHook::EnableTouch();

// add touch listeners
ofAddListener(ofxWinTouchHook::touchDown, this, &ofApp::touchDown);
ofAddListener(ofxWinTouchHook::touchMoved, this, &ofApp::touchMove);
ofAddListener(ofxWinTouchHook::touchUp, this, &ofApp::touchUp);


// ofCoreEvents
ofEvent<ofTouchEventArgs>	touchDown;
ofEvent<ofTouchEventArgs>	touchUp;
ofEvent<ofTouchEventArgs>	touchMoved;
ofEvent<ofTouchEventArgs>	touchDoubleTap;
ofEvent<ofTouchEventArgs>	touchCancelled;

For a project I made recently that involved a multitouch screen on Windows, I tried the already mentioned addon and the following one

Both worked fine, but I ended up using the latter one; I don’t remember why, although the latter one’s example is better.

Both addons use ofEvent as the event type, which is the same type used in ofCoreEvents.

What do you mean by "I would like to have mouse or touch events registered according to some parameter"?

When you register an event you pass a callback function, in this case ofApp::touchDown or ofApp::touchUp, etc.
Whenever the event is triggered this function gets called. In the case of these particular events, an ofTouchEventArgs object is passed which contains the position of the touch along with several other properties, so you get the info about the touch event out of this object.
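For example, a touch callback could read the position and id out of that object; just a minimal sketch of the idea:

void ofApp::touchDown(ofTouchEventArgs& touch){
    // touch.x / touch.y is the touch position, touch.id identifies the finger
    ofLogNotice() << "touch " << touch.id << " down at " << touch.x << ", " << touch.y;
}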
Hope this helps.
best

what do you mean

I will have a touch display next week, so I will be able to test it. Sorry, maybe I wasn’t so clear because of my English. I will try to explain it one more time.

Here is a part of my class:

void ofxImgButton::setup(string name, string srcImg, string srcCover, int x, int y) {
    butName = name;
    img.load(srcImg);
    cover.load(srcCover);
    butX = x;
    butY = y;
    butWidth = img.getWidth();
    butHeight = img.getHeight();
    addListeners();
}

void ofxImgButton::addListeners() {
    ofAddListener(ofEvents().mousePressed, this, &ofxImgButton::mousePressed);
    ofAddListener(ofEvents().mouseReleased, this, &ofxImgButton::mouseReleased);
}  

I want to register an event in my class. I checked the example of ofxWinTouchHook, and I need to call EnableTouch().

ofxWinTouchHook::EnableTouch();

// add touch listeners
ofAddListener(ofxWinTouchHook::touchDown, this, &ofApp::touchDown);
ofAddListener(ofxWinTouchHook::touchMoved, this, &ofApp::touchMove);
ofAddListener(ofxWinTouchHook::touchUp, this, &ofApp::touchUp);

I should do this in the setup method or constructor of my class, shouldn’t I? But I tried this, and I even tried enabling touch events in ofApp.cpp, and it doesn’t work. Maybe it will work on a machine with a touch display? We will see.

I would like to have a class that works with both mouse events and touch events. Should I switch which events are registered depending on a parameter (for example a bool flag from an XML file)? Or is there a way to automatically detect the type of input, mouse or touch, and then call EnableTouch according to that?

Hi,
ofxWinTouchHook::EnableTouch(); will just make the Windows touch events available in the OF events system.
As far as I can remember (I don’t have a touch display now to check), once touch is enabled you’ll get mouse events only for real mouse input and touch events only for real touch input, so you’ll have different events for each kind of input.
So in your button class you need to register the touch events like

ofAddListener(ofxWinTouchHook::touchDown, this, &ofxImgButton::touchDown);

remember that you will need the corresponding callback function in your button class, like

void ofxImgButton::touchDown(ofTouchEventArgs& touch){
    // whatever you might want to do with the touch event
}
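
Putting those pieces together, here is a rough sketch of how the button’s addListeners() could register the addon’s touch events next to the existing mouse ones (this assumes EnableTouch() has already been called once, e.g. in ofApp::setup(), and that the button has matching touchDown/touchUp callbacks):

void ofxImgButton::addListeners() {
    // mouse input (always available)
    ofAddListener(ofEvents().mousePressed, this, &ofxImgButton::mousePressed);
    ofAddListener(ofEvents().mouseReleased, this, &ofxImgButton::mouseReleased);
    // touch input from the Windows touch hook
    ofAddListener(ofxWinTouchHook::touchDown, this, &ofxImgButton::touchDown);
    ofAddListener(ofxWinTouchHook::touchUp, this, &ofxImgButton::touchUp);
}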

Last but not least, add the following listeners only if you need them

// add touch listeners
ofAddListener(ofxWinTouchHook::touchDown, this, &ofApp::touchDown);
ofAddListener(ofxWinTouchHook::touchMoved, this, &ofApp::touchMove);
ofAddListener(ofxWinTouchHook::touchUp, this, &ofApp::touchUp);

best

Hi,
I think I got it, but I have one last question. I need to duplicate the logic in both the touchDown and mousePressed events, am I right?

void ofxImgButton::touchDown(ofTouchEventArgs& touch){
    // something()
}
void ofxImgButton::mousePressed(ofMouseEventArgs& e){
    // something()
}

cheers

If you want both to have the same behavior, then yes. Wrapping the shared logic in a function is a good idea.
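Something along these lines (just a sketch; pressed() is a hypothetical name for the shared handler):

void ofxImgButton::pressed(float x, float y) {
    // shared button logic, e.g. hit-test the point and change the button state
}

void ofxImgButton::touchDown(ofTouchEventArgs& touch) {
    pressed(touch.x, touch.y);
}

void ofxImgButton::mousePressed(ofMouseEventArgs& e) {
    pressed(e.x, e.y);
}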

One thing is still bothering me. There are some touch event declarations in the ofCoreEvents class.

// ofCoreEvents
ofEvent<ofTouchEventArgs>	touchDown;
ofEvent<ofTouchEventArgs>	touchUp;
ofEvent<ofTouchEventArgs>	touchMoved;
ofEvent<ofTouchEventArgs>	touchDoubleTap;
ofEvent<ofTouchEventArgs>	touchCancelled;

Why do we need an external addon to activate them in OF?

These are there because OF uses them on iOS and Android.
On the desktop side, Windows is the only OS with reliable multitouch support (Apple has no plans to add it, and Linux is very buggy; I had no success with it). I’m not sure why this addon is not included out of the box.
cheers.