Sorry if this has been covered elsewhere, I did not see it.
Is the Input's Touch position 'inverted' relative to the screen? The reason I ask is that if I do this:
where windowRect is set in OnGUI:
It only 'does something' if I touch the screen at the mirrored position (flipped in the Y direction). For example: if my window is near the top of the screen, the containment check does not fire when I touch near the top (inside the window), but it does fire when I touch the opposite side of the screen (same X, mirrored Y).
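For reference, this is roughly the kind of check I mean (a reconstructed sketch, not my exact code; `windowRect` is the `Rect` returned by `GUI.Window` in `OnGUI`, and `DoSomething` is a placeholder):

```csharp
void Update()
{
    if (Input.touchCount > 0 &&
        windowRect.Contains(Input.GetTouch(0).position))
    {
        // Only fires when the touch is at the Y-mirrored position
        // relative to where the window is drawn.
        DoSomething();
    }
}
```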
I must have missed something in the docs? Or is it a bug?
Update: Is this what GUIUtility.ScreenToGUIPoint is for?
Technically, it is not inverted. Touch and mouse positions are returned by the Input class in screen-space coordinates, whereas OnGUI uses GUI-space coordinates. In screen space, (0, 0) is the bottom-left corner of the screen, while in GUI space it is the top-left corner. So a touch Y coordinate needs to be flipped (subtracted from Screen.height) before you compare it against a GUI Rect.
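A minimal sketch of the conversion (the class name and `windowRect` value are assumed for illustration):

```csharp
using UnityEngine;

public class TouchWindowExample : MonoBehaviour
{
    // Example window rect; in practice this would be assigned in OnGUI.
    private Rect windowRect = new Rect(20, 20, 200, 120);

    void Update()
    {
        if (Input.touchCount > 0)
        {
            Vector2 screenPos = Input.GetTouch(0).position;

            // Flip the Y axis: Input uses a bottom-left origin,
            // GUI uses a top-left origin.
            Vector2 guiPos = new Vector2(screenPos.x, Screen.height - screenPos.y);

            if (windowRect.Contains(guiPos))
            {
                Debug.Log("Touch is inside the window");
            }
        }
    }
}
```

As for GUIUtility.ScreenToGUIPoint: as I understand it, it compensates for GUI.matrix and GUI.BeginGroup offsets, but it does not perform this Y flip by itself, so you still need the `Screen.height - y` step for raw Input positions.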