Hello, I am stuck on this problem. I have tried Input.GetMouseButtonDown(0), and it works. Now, what I want to do is get touch positions, so I use Input.GetTouch(0). However, it doesn’t work. I am confused, because I am following the documentation and the answers here on Unity Answers.
Plane horPlane = new Plane(Vector3.up, Vector3.zero);

// get the ray from the camera through the touch position...
Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
print("touch position = " + Input.GetTouch(0).position);

float distance = 0;
// if the ray intersects the logical ground plane...
if (horPlane.Raycast(ray, out distance)) {
    cube.transform.position = ray.GetPoint(distance); // move cube to the 3D point touched
    print("cube position = " + cube.transform.position);
}
Try my solution below just in case, but I’d bet your problem is that your PC’s “touch” input is handled at a higher level, well outside what Unity sees: as far as Unity is concerned, a touch is just being treated as a mouse click at the position you touched. Because you were using touches, I assumed you were building for Android or iOS; as far as I know, that’s all the Touch API is used for.
I see a couple of potential problems. If I’m developing and testing on a computer and occasionally deploying full builds to Android, I always use the compiler directives #if UNITY_ANDROID, #if UNITY_EDITOR, and #if UNITY_STANDALONE_WIN to separate my different kinds of input for different builds. If you’re running an .exe, I don’t think you’re going to have any “touches” at all — that’s why your mouse portion worked and your touch portion didn’t. And as the person above me said, you’ll want to check Input.touchCount to make sure a touch even exists before trying to reference Input.GetTouch(0). So, a quick demo:
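Something like this (a sketch, untested; the `cube` field and the ground plane at the origin are assumptions carried over from your snippet — adapt the names to your project):

    using UnityEngine;

    public class MoveCubeToPointer : MonoBehaviour
    {
        public Transform cube; // assign in the Inspector (hypothetical reference)

        void Update()
        {
            // Logical ground plane at the origin, facing up.
            Plane horPlane = new Plane(Vector3.up, Vector3.zero);
            Ray ray;

    #if UNITY_ANDROID && !UNITY_EDITOR
            // Real touches only exist on device builds:
            // guard with touchCount before calling GetTouch(0).
            if (Input.touchCount == 0) return;
            ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
    #else
            // In the editor or a standalone build, a "touch"
            // arrives as an ordinary mouse click.
            if (!Input.GetMouseButton(0)) return;
            ray = Camera.main.ScreenPointToRay(Input.mousePosition);
    #endif

            float distance;
            // If the ray hits the plane, move the cube to that 3D point.
            if (horPlane.Raycast(ray, out distance))
            {
                cube.position = ray.GetPoint(distance);
            }
        }
    }

The #if/#else split means the same script works both ways: you get mouse input while iterating in the editor or a Windows build, and real touch input once you deploy to the device.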