I am developing an application using OF 0.9.3 in Windows under VS2015.
I want my GUI buttons to react to the touch down event in order to reduce the button size while the button is pressed.
To do so, I remember the state of the button and, in draw(), change the button size depending on that state.
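The state tracking looks roughly like this (plain C++ sketch; the `PushButton` type and its field names are made up for illustration — in the real app, `press()`/`release()` would be called from `ofApp::mousePressed()`/`ofApp::mouseReleased()`, and `currentSize()` queried in `draw()`):

```cpp
#include <cassert>
#include <cmath>

// Hypothetical stand-in for the GUI button class: it remembers the
// pressed state and reports a reduced size while pressed.
struct PushButton {
    bool  pressed    = false;
    float baseSize   = 100.0f;  // unpressed size in pixels (example value)
    float pressScale = 0.9f;    // shrink factor while pressed (example value)

    void press()   { pressed = true;  }  // call from mousePressed()
    void release() { pressed = false; }  // call from mouseReleased()

    // Called every frame from draw(): the size depends only on the
    // current state, so the shrink is visible for as long as the
    // press and release events land in different frames.
    float currentSize() const {
        return pressed ? baseSize * pressScale : baseSize;
    }
};
```

This works fine with a mouse, because at least one draw() call happens between the press and the release.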
With a mouse cursor, all works as expected: the button is drawn unpressed, then the mousePressed event is called, then the button is drawn pressed, then the mouseReleased event is called, and finally the button is drawn unpressed.
We are building this app to run on a Lenovo Yoga Home all-in-one computer running Windows 10 Home. It is touch-enabled.
When I run the app on the all-in-one computer, the mousePressed() and mouseReleased() events are called in the same frame, and the push-down effect is therefore not visible.
If I move the finger after pushing down (a drag), the events are registered in different frames. It really looks like, for a quick touch down/up, the OS waits until the touch-up to send both the touch-down and touch-up events.
We have looked at the operating system options and could not find a workaround.
Has anyone ever seen this problem? What can we do to fix it?