Handling both touch and mouse interfaces in one app.

Stevens Miller
Posts: 1464
In another thread, I asked about getting MOUSE_PRESSED events from a touch screen. Turns out, this opens a can of worms. For mobile devices, the issues created by mixing touch interfaces and mouse interfaces tend not to exist, since it's a rare phone that has a mouse. Likewise, desktop computers mostly have mice and, therefore, desktop apps tend not to support touch screens.

I was interested in creating an app that lets the user do everything with a mouse, but also use their touch screen if they have one. My early investigations suggested that Swing is a non-starter here, as it has no support for touch events (I'd be glad to learn I'm wrong about that). So I looked to JavaFX, which I am only now learning. JavaFX does support touch events, but mixing them with mouse events is a challenge. Here's one example I came up with that supports clicking a JavaFX Button either way, via mouse or touch:

Say you have a Button, and you want to be able to fire its action via either the mouse or the touch screen. You can let the touch screen emulate a mouse, and that will work without any extra code beyond what you need for a mouse interface. However, when you click with the mouse, the Button takes the focus and is visibly armed when you hold down the left mouse button. The Button's action doesn't fire, however, until you release the left mouse button.

For a touch interface, the mouse emulation isn't very good. When you press the Button with your finger, it doesn't take the focus, nor is it armed. When you release the Button by taking your finger off the screen, then you get the mouse press event, then the action fires, and then (somewhat inexplicably) you get the mouse released and mouse clicked events. So, the visual feedback isn't as good when you rely on the touch screen's mouse emulation. Further, depending on your preferences, pushing the button via the screen doesn't do what real electromechanical buttons do, which is cause their actions to "fire" when they are pressed, not when they are released.

Now, assuming you want your mouse interface to keep working the way it does (with actions happening upon mouse released events), you can add your own "skip" flag to your touch event processing, and have the action handler query that flag. The trick is knowing when to set that flag, and when not to. One would like to just always set that flag when a touch pressed event is processed, because that's the event that is going to take the focus, arm the Button, and fire the action, so you don't want the mouse events to cause a second firing.


There will be no second firing if you drag your finger off the Button before you lift it from the screen. That's because the emulated mouse released event won't be inside the Button, and that wouldn't fire the action with a real mouse, so the emulation doesn't fire it either (notwithstanding that the mouse released and mouse clicked events both get sent to your code after the action handler is called).

So, to avoid setting the skip flag when no second firing will happen, you must not set it in the touch pressed handler. Instead, you must set it conditionally in the touch released handler, and only when the release occurs at an on-screen point contained by the Button.

The code below shows how this can work.
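(The listing from the original post wasn't captured in this copy of the thread. As a stand-in, here is a minimal sketch of the skip-flag logic described above, pulled out into a plain class so the three cases can be traced without a JavaFX toolkit. The class and method names are my own; in the actual app you would call touchPressed() and touchReleased() from TOUCH_PRESSED/TOUCH_RELEASED handlers on the Button, and actionRequested() from its action handler.)

```java
/**
 * Sketch of the "skip" flag approach: a touch press fires the action
 * immediately, like a real electromechanical button, and a touch release
 * inside the Button arms the flag so that the one synthesized mouse
 * firing that follows is swallowed. A release outside the Button never
 * produces a synthesized firing, so the flag is left alone there.
 */
public class TouchClickFilter {

    private boolean skipNextAction = false;
    private int firings = 0;

    /** Call from the TOUCH_PRESSED handler: fire right away. */
    public void touchPressed() {
        firings++;
    }

    /**
     * Call from the TOUCH_RELEASED handler. Only a release inside the
     * Button will be followed by a synthesized action, so only then do
     * we arm the skip flag.
     */
    public void touchReleased(boolean insideButton) {
        if (insideButton) {
            skipNextAction = true;
        }
    }

    /**
     * Call from the Button's action handler (real mouse or synthesized).
     * Returns true if the action should actually run this time.
     */
    public boolean actionRequested() {
        if (skipNextAction) {
            skipNextAction = false; // swallow exactly one synthesized firing
            return false;
        }
        firings++;
        return true;
    }

    public int firings() {
        return firings;
    }

    public static void main(String[] args) {
        // Mouse click: only the action handler runs -> one firing.
        TouchClickFilter mouse = new TouchClickFilter();
        mouse.actionRequested();
        System.out.println("mouse firings = " + mouse.firings());      // 1

        // Touch tap in place: fires on press; synthesized action skipped.
        TouchClickFilter tap = new TouchClickFilter();
        tap.touchPressed();
        tap.touchReleased(true);
        tap.actionRequested();
        System.out.println("tap firings = " + tap.firings());          // 1

        // Touch, drag off, release: fires on press; no synthesized action
        // arrives, flag never set, so a later mouse click still works.
        TouchClickFilter dragOff = new TouchClickFilter();
        dragOff.touchPressed();
        dragOff.touchReleased(false);
        System.out.println("drag-off firings = " + dragOff.firings()); // 1
    }
}
```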

What do you all think of this approach?
Posts: 387

My early investigations suggested to me that Swing is a non-starter here, as it has no support for touch events (I'd be glad to learn I am wrong about that).

The MT4J library adds touch support to Swing.  However, my guess is that you would be better off with JavaFX for this task.

What do you all think of this approach?

It looks weird.  You shouldn't have to do stuff like this to get seamless touch support on basic controls such as Buttons; it should just work (IMO).  But I don't have a touch screen PC to test it on, and you have already verified that the default behavior is not what you want.

I advise cross-posting this question to the openjfx-dev mailing list and getting the opinion of the developers there on this.  They might be able to provide some suggestions or some info on why it is the way it is.

The ideal way to address this would be to dig into the control code and place your event handling logic in there.  You could start from the default Button control code and modify or extend it to get it to work how you wish.  Normally the handling of input is delegated to a JavaFX class known as a Behavior.  The Behavior logic was never made into public API, so there is no base behavior class to extend, as there is for skins (via the public SkinBase class).  Looking at the code, you can see that the Behavior for buttons is in ButtonBehavior and BehaviorBase.

For background information, this article explains the JavaFX UI Control Architecture.  So what you can do is replace the default ButtonSkin with your own skin that behaves as you wish with an implementation that replaces the default button behavior.  Hendrik Ebbers has a blog on how you do that kind of stuff.  He also wrote a book on creating custom controls which was featured on JavaRanch a while back.

To understand what is happening and how JavaFX is processing touch events, refer to: Working with Events from Touch-Enabled Devices.

Touches on a touch screen also generate corresponding mouse events. For example, touching a point on the screen generates TOUCH_PRESSED and MOUSE_PRESSED events. Moving a single point on the screen generates scroll events and drag events. Even if your application does not handle touch or gesture events directly, it can run on a touch-enabled device with minimal changes by responding to the mouse events that are generated in response to touches.

If your application handles touches, gestures, and mouse events, make sure that you do not handle a single action multiple times. For example, if a gesture generates scroll events and drag events and you provide the same processing for handlers of both types of events, the movement on the screen could be twice the amount expected. You can use the isSynthesized() method for mouse events to determine if the event was generated by mouse movement or movement on a touch screen and only handle the event once.

So what is happening is a mouse event is being synthesized for the touch event and that is what is causing the button to fire.  The button itself has not been programmed to explicitly respond to touch events, instead it is responding indirectly via the synthesized mouse event.  The synthesis itself doesn't happen local to the button code but somewhere else in the JavaFX source.  

Your options for getting the behavior you want:
1. You could (perhaps) synthesize further mouse events to get the behavior you want by hacking somewhere else in the JavaFX input processing source (probably inadvisable).
2. You could replace the built-in button skin with your own customized control skin that handles touch events as you wish.
3. You could do as you have done: keep the built-in button skin and just add additional event handlers to get the behavior you want.

From an application programming point of view, (2), replacing the button skin, will give a nicer API, and it will also work for existing items that use buttons (such as the JavaFX alerts and dialogs), because defining a custom button skin in CSS will replace the skins of all buttons in your application.  It is more difficult to create your own skin than to just add additional event handlers to the standard button.
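(To make option (2) concrete: the application-wide swap is done from a stylesheet via the -fx-skin CSS property. A sketch, where com.example.TouchButtonSkin is a hypothetical skin class you would write:)

```css
/* Application stylesheet: point every Button at a custom skin.
   com.example.TouchButtonSkin is a hypothetical class implementing
   the touch-aware behavior you want. */
.button {
    -fx-skin: "com.example.TouchButtonSkin";
}
```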

An interesting comment in the BehaviorBase is this one:

    * Event handler methods.                                                  *
    *                                                                         *
    * I'm not sure why only mouse events are here. What about drag and        *
    * drop events for instance? What about touch events? What about the       *
    * other mouse events? It does seem like these need to be here, because    *
    * for example mouse interaction logic might differ from platform to       *
    * platform, and the Behavior is supposed to implement all the user        *
    * interaction logic (not just key handling). So it seems like             *
    * BehaviorBase should have methods for handling all forms of input events,*
    * and not just these four mouse events.                                   *

So my guess is that the JavaFX developers realized the issues at the time of development, but went with the current approach of synthesized mouse events knowing that it isn't exactly ideal.  This approach doesn't lead to the optimal user experience, as you point out in your post, but on the other hand it is functional and has allowed the integration of touch processing with minimal impact on the rest of the architecture and implementation.  I guess it is an area that the JavaFX developers might consider for future enhancements of the platform (I don't know); you could ask further on the openjfx-dev mailing list if you are curious.  You could also follow up with the Gluon guys, who are heavily into development of JavaFX for both mobile and desktop environments and may have some insights or suggestions on this.

Honestly, from your description, the current touch support in JavaFX, though not optimal, does not sound that bad; but then, as I said, I have no touch screen to try it out on, so that is just speculation on my part.  At any rate, the touch implementation in JavaFX is better than the one in Swing ;-)
Stevens Miller
Posts: 1464
Thanks, John. Have a cow for the thorough reply!

I'll look into the options you suggest. I agree that my current approach has the feel of something that shouldn't be necessary. I had already tried some ideas using isSynthesized, but none of them could both fire the action upon the touchPressed event without a second firing from the synthesized mouse events, and still allow a legitimate later firing in the case where the user touches the screen, then drags his finger off the Button before lifting it from the screen. The latter case has to behave differently from the touch-and-release-in-place case, but there is no way to tell which case you are dealing with solely by looking at isSynthesized.

All this would be so easy if the touch screen simply sent a mousePressed event immediately after every touchPressed event, but it doesn't. This appears to be a function of the touch screen driver, which means there's nothing JavaFX can do about it. I've searched in vain for some kind of Windows setting, registry entry, or whatever that would change the driver's behavior. It's frustrating because there's no reason I can see why anyone would have wanted to refrain from sending the mousePressed event at the time the screen is touched. But, that's what they do.

I'll keep looking and see if your other ideas produce any leads. It would be a shame not to use this touch screen in my Java apps.