Simulate touch with mouse

According to earlier answers about simulating touches with the mouse, you should use Input.GetMouseButton() on PC and Input.touches on smartphones. Has anything changed since then? Is there a nice single interface now, or do we still need to write “if (Application.platform != RuntimePlatform.IPhonePlayer)”?

Might be a bit late to follow up on this, but I just stumbled across the same problem and built a simple workaround that might be interesting for someone.

The scenario: you’re building a mobile game and don’t want a hybrid (mouse/touch) input system just to be able to test in the editor.
Using reflection you can gain read/write access to the private fields of any type, e.g. UnityEngine.Touch, and wrap everything up in something like this:

using UnityEngine;
using System.Reflection;
using System.Collections.Generic;

public class TouchCreator
{
    static BindingFlags flag = BindingFlags.Instance | BindingFlags.NonPublic;
    static Dictionary<string, FieldInfo> fields;

    object touch;

    public float deltaTime { get { return ((Touch)touch).deltaTime; } set { fields["m_TimeDelta"].SetValue(touch, value); } }
    public int tapCount { get { return ((Touch)touch).tapCount; } set { fields["m_TapCount"].SetValue(touch, value); } }
    public TouchPhase phase { get { return ((Touch)touch).phase; } set { fields["m_Phase"].SetValue(touch, value); } }
    public Vector2 deltaPosition { get { return ((Touch)touch).deltaPosition; } set { fields["m_PositionDelta"].SetValue(touch, value); } }
    public int fingerId { get { return ((Touch)touch).fingerId; } set { fields["m_FingerId"].SetValue(touch, value); } }
    public Vector2 position { get { return ((Touch)touch).position; } set { fields["m_Position"].SetValue(touch, value); } }
    public Vector2 rawPosition { get { return ((Touch)touch).rawPosition; } set { fields["m_RawPosition"].SetValue(touch, value); } }

    public Touch Create()
    {
        return (Touch)touch;
    }
    
    public TouchCreator()
    {
        touch = new Touch();
    }

    static TouchCreator()
    {
        fields = new Dictionary<string, FieldInfo>();
        foreach (var f in typeof(Touch).GetFields(flag))
        {
            fields.Add(f.Name, f);
            Debug.Log("name: " + f.Name); // logs the discovered private field names once
        }
    }
}

So, based on StarManta’s earlier approach, you should now be able to build your own touch-simulation environment for the editor. Ideally, wrap it in the UNITY_EDITOR preprocessor directive so it’s stripped from your final build and the mobile code stays clean.
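For instance, the call site could branch on UNITY_EDITOR so the simulation code never ships in a device build. This is only a sketch; GetSimulatedTouches and HandleTouch are hypothetical names standing in for your own TouchCreator-based wrapper and gesture handler:

```csharp
using UnityEngine;

public class TouchInputExample : MonoBehaviour
{
    void Update()
    {
#if UNITY_EDITOR
        // Editor only: synthesize touches from the mouse (e.g. via TouchCreator).
        // This whole branch is stripped from device builds by the preprocessor.
        foreach (var touch in GetSimulatedTouches()) HandleTouch(touch);
#else
        // Device builds read the real touch API directly.
        foreach (var touch in Input.touches) HandleTouch(touch);
#endif
    }

#if UNITY_EDITOR
    Touch[] GetSimulatedTouches()
    {
        // Hypothetical: build Touch values with TouchCreator from mouse state.
        return new Touch[0];
    }
#endif

    void HandleTouch(Touch touch) { /* your gesture handling */ }
}
```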

(How would you simulate a pinch with a mouse, or a three finger gesture?)

Obviously you can’t. But that’s no reason at all not to allow simulation of single-touch gestures.

And using platform-dependent compilation doesn’t help a lick with TESTING touch input. I want to iterate quickly on my UI code, and I can’t do that if it takes me 10 minutes to build and test a single code change.

So, yeah, this is still very much an unsolved issue. And you know what? All it would take is Unity allowing me to create my own instance of the Touch class. But no, it’s a complete black box - everything is read-only and there’s no useful constructor.

	foreach (Touch thisTouch in Input.touches) {
		HandleTouch(thisTouch);
	}
	
	if (Input.GetMouseButton(0) ) {
		Touch fakeTouch = new Touch();
		fakeTouch.fingerId = 10;
		fakeTouch.position = Input.mousePosition;
		fakeTouch.deltaTime = Time.deltaTime;
		fakeTouch.deltaPosition = Input.mousePosition - lastMousePosition;
		fakeTouch.phase =	(Input.GetMouseButtonDown(0) ? TouchPhase.Began : 
							(fakeTouch.deltaPosition.sqrMagnitude > 1f ? TouchPhase.Moved : TouchPhase.Stationary) );
		fakeTouch.tapCount = 1;
		
		HandleTouch(fakeTouch);
	}

See? Useful, platform-agnostic touch handling. But no, Unity went out of their way to prevent me from doing that.

I added a script called InputHelper, which uses Tommynator’s TouchCreator. I just use InputHelper.GetTouches() instead of Input.touches and it works perfectly.

using UnityEngine;
using System.Collections;
using System.Collections.Generic;

public class InputHelper : MonoBehaviour {

    private static TouchCreator lastFakeTouch;

    public static List<Touch> GetTouches()
    {
        List<Touch> touches = new List<Touch>();
        touches.AddRange(Input.touches);
#if UNITY_EDITOR
        if(lastFakeTouch == null) lastFakeTouch = new TouchCreator();
        if(Input.GetMouseButtonDown(0))
        {
            lastFakeTouch.phase = TouchPhase.Began;
            lastFakeTouch.deltaPosition = new Vector2(0,0);
            lastFakeTouch.position = new Vector2(Input.mousePosition.x, Input.mousePosition.y);
            lastFakeTouch.fingerId = 0;
        }
        else if (Input.GetMouseButtonUp(0))
        {
            lastFakeTouch.phase = TouchPhase.Ended;
            Vector2 newPosition = new Vector2(Input.mousePosition.x, Input.mousePosition.y);
            lastFakeTouch.deltaPosition = newPosition - lastFakeTouch.position;
            lastFakeTouch.position = newPosition;
            lastFakeTouch.fingerId = 0;
        }
        else if (Input.GetMouseButton(0))
        {
            lastFakeTouch.phase = TouchPhase.Moved;
            Vector2 newPosition = new Vector2(Input.mousePosition.x, Input.mousePosition.y);
            lastFakeTouch.deltaPosition = newPosition - lastFakeTouch.position;
            lastFakeTouch.position = newPosition;
            lastFakeTouch.fingerId = 0;
        }
        else
        {
            lastFakeTouch = null;
        }
        if (lastFakeTouch != null) touches.Add(lastFakeTouch.Create());
#endif
        return touches;
    }

}

No, there’s not a nice simple interface. (How would you simulate a pinch with a mouse, or a three finger gesture?)

For now, you should use the platform-dependent approach.

On mobile, Unity maps the first touch to these calls: Input.GetMouseButtonDown(0), Input.GetMouseButton(0), and Input.GetMouseButtonUp(0). So you can treat them as a common interface if you’re only interested in one pointer at a time.
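As a rough sketch of that single-pointer pattern (legacy Input Manager; Drag and Release are placeholders for your own handlers):

```csharp
using UnityEngine;

// Single-pointer input that works both in the editor (mouse) and on mobile,
// because Unity maps the first touch onto the mouse-button-0 API.
public class SinglePointer : MonoBehaviour
{
    Vector2 pressStart;

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
            pressStart = Input.mousePosition;                    // touch began / button pressed

        if (Input.GetMouseButton(0))
            Drag(Input.mousePosition);                           // touch held: moved or stationary

        if (Input.GetMouseButtonUp(0))
            Release((Vector2)Input.mousePosition - pressStart);  // touch ended
    }

    void Drag(Vector2 screenPos) { /* your drag handling */ }
    void Release(Vector2 totalDelta) { /* your release handling */ }
}
```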

If you have a lot of already written code that relies on touch input, you can use this InputWrapper to test it in the Editor. To use it, add this line at the top of every script that uses UnityEngine.Input:

using Input = InputWrapper.Input;

using UnityEngine;
using UnityEngine.Assertions;


namespace InputWrapper {
    public static class Input {
        static bool touchSupported => UnityEngine.Input.touchSupported;
        static Touch? fakeTouch => SimulateTouchWithMouse.Instance.FakeTouch;

        public static bool GetButton(string buttonName) {
            return UnityEngine.Input.GetButton(buttonName);
        }

        public static bool GetButtonDown(string buttonName) {
            return UnityEngine.Input.GetButtonDown(buttonName);
        }

        public static bool GetButtonUp(string buttonName) {
            return UnityEngine.Input.GetButtonUp(buttonName);
        }

        public static bool GetMouseButton(int button) {
            return UnityEngine.Input.GetMouseButton(button);
        }

        public static bool GetMouseButtonDown(int button) {
            return UnityEngine.Input.GetMouseButtonDown(button);
        }

        public static bool GetMouseButtonUp(int button) {
            return UnityEngine.Input.GetMouseButtonUp(button);
        }

        public static int touchCount {
            get {
                if (touchSupported) {
                    return UnityEngine.Input.touchCount;
                } else {
                    return fakeTouch.HasValue ? 1 : 0;
                }
            }
        }

        public static Touch GetTouch(int index) {
            if (touchSupported) {
                return UnityEngine.Input.GetTouch(index);
            } else {
                Assert.IsTrue(fakeTouch.HasValue && index == 0);
                return fakeTouch.Value;
            }
        }

        public static Touch[] touches {
            get {
                if (touchSupported) {
                    return UnityEngine.Input.touches;
                } else {
                    return fakeTouch.HasValue ? new[] {fakeTouch.Value} : new Touch[0];
                }
            }
        }
    }

    internal class SimulateTouchWithMouse {
        static SimulateTouchWithMouse instance;
        float lastUpdateTime;
        Vector3 prevMousePos;
        Touch? fakeTouch;


        public static SimulateTouchWithMouse Instance {
            get {
                if (instance == null) {
                    instance = new SimulateTouchWithMouse();
                }

                return instance;
            }
        }

        public Touch? FakeTouch {
            get {
                update();
                return fakeTouch;
            }
        }

        void update() {
            if (Time.time != lastUpdateTime) {
                lastUpdateTime = Time.time;
                
                var curMousePos = UnityEngine.Input.mousePosition;
                var delta = curMousePos - prevMousePos;
                prevMousePos = curMousePos;

                fakeTouch = createTouch(getPhase(delta), delta);
            }
        }

        static TouchPhase? getPhase(Vector3 delta) {
            if (UnityEngine.Input.GetMouseButtonDown(0)) {
                return TouchPhase.Began;
            } else if (UnityEngine.Input.GetMouseButton(0)) {
                return delta.sqrMagnitude < 0.01f ? TouchPhase.Stationary : TouchPhase.Moved;
            } else if (UnityEngine.Input.GetMouseButtonUp(0)) {
                return TouchPhase.Ended;
            } else {
                return null;
            }
        }

        static Touch? createTouch(TouchPhase? phase, Vector3 delta) {
            if (!phase.HasValue) {
                return null;
            }
            
            var curMousePos = UnityEngine.Input.mousePosition;
            return new Touch {
                phase = phase.Value,
                type = TouchType.Indirect,
                position = curMousePos,
                rawPosition = curMousePos,
                fingerId = 0,
                tapCount = 1,
                deltaTime = Time.deltaTime,
                deltaPosition = delta
            };
        }
    }
}

This is a really old question, but in case someone is passing by…

One option now is to use the new Input System with UnityEngine.InputSystem.Pointer. Pointer has things like position, delta, pressed, etc. Obviously there will be some limitations; for example, multi-finger gestures will still need to be handled via the platform-specific touch APIs.

I have been using Pointer for quite a while and I can verify that it works fine for both desktop+mouse and android+touch.
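For reference, a minimal sketch of reading Pointer with the new Input System (assuming the Input System package is installed and enabled in Player settings):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Pointer.current resolves to the mouse in the editor and to the primary
// touch on a device, so the same code path covers both.
public class PointerExample : MonoBehaviour
{
    void Update()
    {
        var pointer = Pointer.current;
        if (pointer == null) return;  // no pointer device present

        if (pointer.press.isPressed)
        {
            Vector2 pos = pointer.position.ReadValue();
            Vector2 delta = pointer.delta.ReadValue();
            Debug.Log($"pointer at {pos}, moved {delta} this frame");
        }
    }
}
```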

One caveat is that Unity’s new Input System is broken with Unity Remote… so the only way to test on a device is to build for Android and wait several minutes for every change.