Who decides whether a mouse pointer or a touch screen is present?

drivers, mouse, touch-screen

I have installed a touch-screen film that connects via USB, and it is recognized as a mouse device.

Hence, I cannot swipe on a web page: when I try to swipe, it clicks and selects text, or clicks and drags something on the screen.

There are related questions and answers that didn't help in my case:

My original problem description is here.

So, who is responsible for recognizing the input device as a mouse or a touch screen: the OS or the application? And where should I look for a solution?

Searches and notes to myself

With the help of this forum thread (with a few changes), I could make ts_test work. I calibrated the touch screen, drew a circle on the screen, and so on.

Now what? The web browser I am using still selects text instead of scrolling the page.

Some say "the application should use the input device directly", while others say "the application has nothing to do with the input device directly".

Should I edit xorg.conf to emulate scroll events just because my application (Midori, the web browser in this case) cannot use "the touch screen"? Or does an Android phone already do the same thing?

How does an application get mouse events in the first place? By opening the device file (/dev/input/event*) itself, or somehow via the Xorg server? If the application handles the device file itself, how could the Xorg server possibly emulate mouse scrolling?

If the application listens to the Xorg server for mouse/touch/multi-touch events, then the application should be written with these events in mind, and Xorg should have a configuration entry where we say: "Hey, Xorg, the device you see at /dev/input/event0 is a touch screen. When you get physical signals from it, tell the applications about the touch events."
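As a note to myself: that kind of "tell Xorg this is a touch screen" entry usually takes the form of an InputClass section dropped into /etc/X11/xorg.conf.d/. This is only a sketch; the driver name depends on what is installed (evdev on older setups, libinput on newer ones), and the identifier string is arbitrary:

```
Section "InputClass"
    Identifier         "touchscreen catchall"
    MatchIsTouchscreen "on"
    Driver             "evdev"    # or "libinput" on newer systems
EndSection
```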

Best Answer

This is only a partial answer. Other folks should feel free to copy this as a basis for their answers.

Touch-screen input devices can be opened as simple mice, or with full access to their touch-screen capabilities via /dev/input/... and the evdev interface.

You need to get your X server to treat the input device as a touch screen. The X server translates touch-screen gestures, such as dragging along the side, into mouse-wheel events. Apps that want full gesture (multi-touch) support need to open the Linux input event device themselves, instead of just receiving X11 pointer positions and mouse-button events.
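"Opening the input event device" boils down to reading fixed-size struct input_event records from /dev/input/event*. Here is a minimal sketch in Python, assuming a 64-bit Linux layout of the struct (two longs for the timestamp, then type, code, value); it decodes a synthetic event rather than a real device, since reading /dev/input/event0 needs the right permissions and hardware:

```python
import struct

# Layout of struct input_event from <linux/input.h> on a 64-bit system:
#   struct timeval time;  __u16 type;  __u16 code;  __s32 value;
EVENT_FORMAT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)  # 24 bytes on x86-64

# A few constants from <linux/input-event-codes.h>
EV_ABS = 0x03
ABS_X, ABS_Y = 0x00, 0x01

def decode_event(raw):
    """Decode one raw input_event record into a dict."""
    sec, usec, ev_type, code, value = struct.unpack(EVENT_FORMAT, raw)
    return {"sec": sec, "usec": usec, "type": ev_type, "code": code, "value": value}

# Real usage would loop over the device node, e.g.:
#   with open("/dev/input/event0", "rb") as dev:
#       event = decode_event(dev.read(EVENT_SIZE))
# Here we decode a synthetic "finger at X = 512" event instead:
raw = struct.pack(EVENT_FORMAT, 0, 0, EV_ABS, ABS_X, 512)
print(decode_event(raw))
```

An app reading the device this way sees raw absolute-position (EV_ABS) and multi-touch events, which is exactly the information that is lost when the device is only exposed to it as an X11 pointer.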

A kernel driver for your specific touch screen should be loaded automatically when you plug it in, unless the generic USB mouse driver claims the device first.

Look in your kernel log (dmesg) for messages about the device. lsusb can help you find the device ID to search for in the kernel log.