I have created an application for an offset machine (a printer for plastic packaging like flower pots). We can move the inker motors into position (x, y, z).
It works like a charm and we are very happy with it. Now we want to be able to use an android tablet connected to the wifi of the machine to make adjustments.
We do not want to use VNC, because this is just not an ideal way for a "permanent" setup.
I have created an android app which connects to the webserver and shows the web page. This also works perfectly, but...
The buttons on the screen only react on "release", not on "press". With a mouse it works, but a touch-enabled device sends different events.
When I look at the source code of the web pages generated by UniLogic, I see these events on the buttons:
Looks like touch events are missing.
The pages are built using Bootstrap. Bootstrap is touch aware, so you do see the button being pressed when you touch it, but the event is not triggered.
This is not a problem for buttons you just press once, but my interface requires the operator to hold a button to keep the motor running and release it to stop. As I said, this works with a mouse, but on touch-enabled devices it just doesn't.
Too bad we cannot make any changes to the webserver on the UniStream.
Does anyone have an idea of how to work around this?
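One direction I've been considering, since my Android app already hosts the page in a WebView: inject a small script (e.g. via evaluateJavaScript) that re-dispatches touch events as the mouse events the generated handlers listen for. This is only a rough, untested sketch — the event mapping is my assumption, not something from UniLogic:

```javascript
// Map a touch event type to the mouse event type the page's
// handlers presumably expect (assumption on my part).
function mouseTypeFor(touchType) {
  return {
    touchstart: 'mousedown',
    touchend: 'mouseup',
    touchcancel: 'mouseup'   // treat a cancelled touch as a release, so the motor stops
  }[touchType];
}

// Only wire up the listeners when running inside a browser/WebView.
if (typeof document !== 'undefined') {
  ['touchstart', 'touchend', 'touchcancel'].forEach(function (touchType) {
    document.addEventListener(touchType, function (ev) {
      var touch = ev.changedTouches[0];
      // Re-dispatch the touch as a synthetic mouse event on the same element.
      var mouseEv = new MouseEvent(mouseTypeFor(touchType), {
        bubbles: true,
        cancelable: true,
        clientX: touch.clientX,
        clientY: touch.clientY
      });
      ev.target.dispatchEvent(mouseEv);
      // Stop the browser from firing its own (delayed) mouse events afterwards.
      ev.preventDefault();
    }, { passive: false });
  });
}
```

The idea would be to inject this once the page has loaded; I have not yet verified whether the generated handlers are attached via onmousedown/onmouseup attributes or addEventListener, which might matter here.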
Are there any plans to update the webpage generator to support touch events?
See below an image of the interface to get a little understanding of what I mean:
The operator presses an arrow to move the inker unit into the correct position. The inker moves as long as the operator holds the button.