I am using a Dell Venue 8 Pro running Windows 8.1 32-bit. I am trying to use AutoHotkey to map regions of the screen to keyboard keys, so that I can use the touch screen to control an existing pinball game. My goal is to press A if the left half of the screen is touched, and B for the right half (the screen resolution is set to 640×480). This is the script I wrote:
~LButton::
CoordMode, Mouse, Screen  ; measure x against the whole screen, not the active window
MouseGetPos, x, y
if (x >= 0 and x < 320)
    Send, a
else if (x >= 320 and x < 640)
    Send, b
return
However, it doesn't seem to recognize touch screen presses at all. It seems that touch screen presses are distinct from normal mouse clicks. How can I get AutoHotkey to recognize a touch screen press, or is there another solution for what I am trying to do?
Edit: I am starting to think that AutoHotkey is not capable of doing what I want to accomplish here. If anyone can find an alternative way that produces this same outcome (presses keyboard buttons when screen regions are touched), I will accept the answer.
Best Answer
I'm not sure what language the game is written in; JavaScript, jQuery, and WebKit all support touch events natively. If that doesn't help, you could try this script and adapt it to your needs:
Quote from the creator of the script: [ link ]
Usage:
Decompress the zip file in the relevant folder.
Open Touchpad.ini and set cursor "speed" (0 ~ 1).
Drag on the screen to move the cursor.
Tap anywhere on the screen and a click is sent.
Double tapping sends a double click.
"Ctrl + u" pauses/restarts the script.
Click the tray icon to exit the script.
To do / known issues:
Implement dragging.
Sometimes the cursor jumps to the finger position.
Sometimes the cursor is hidden.
Download the zip file here: http://cafe.naver.com/flowpad/34 (I updated the above link.)
SCRIPT:
Hope this helps... I haven't tested the script.
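If that script doesn't fit, here is an untested minimal sketch of another approach in AutoHotkey v1: a nearly transparent full-screen window that listens for WM_POINTERDOWN (0x0246, the message Windows 8 delivers for touch input) and maps the touched half of the 640×480 screen to a or b. The transparency level, the function name TouchDown, and the Esc exit hotkey are my own choices, not from the original post.

```autohotkey
; Untested sketch: full-screen, nearly invisible overlay that turns
; touches into keypresses. Assumes Windows 8+ and AutoHotkey v1.
Gui, +LastFound +AlwaysOnTop -Caption +ToolWindow
Gui, Color, FFFFFF
Gui, Show, x0 y0 w640 h480
WinSet, Transparent, 1            ; almost invisible, but still receives input
OnMessage(0x0246, "TouchDown")    ; 0x0246 = WM_POINTERDOWN
return

TouchDown(wParam, lParam) {
    x := lParam & 0xFFFF          ; screen X is the low word of lParam
    if (x < 320)
        Send, a
    else
        Send, b
}

Esc::ExitApp                      ; my own choice of exit key
```

Note that the overlay swallows the touches, so the game underneath never sees them directly; since the goal is to drive the game with keystrokes anyway, that should be acceptable, though you can still see the game through the near-transparent window.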