I’m trying to use ViewportToWorldPoint, which seems pretty basic, yet I cannot get it to work. Using the docs (Unity - Scripting API: Camera.ViewportToWorldPoint), I’ve created this simple test code:
```csharp
Camera curCamera = Camera.main;
Vector3 pos1 = new Vector3(0.0f, 0.0f, curCamera.nearClipPlane);
Vector3 pos2 = new Vector3(1.0f, 1.0f, curCamera.nearClipPlane);
Vector3 pos3 = new Vector3(99999.0f, 0.0f, curCamera.nearClipPlane);
Vector3 world1 = curCamera.ViewportToWorldPoint(pos1);
Vector3 world2 = curCamera.ViewportToWorldPoint(pos1);
Vector3 world3 = curCamera.ViewportToWorldPoint(pos1);
Debug.Log("pos1 >> world1: " + pos1 + " >> " + world1);
Debug.Log("pos2 >> world2: " + pos2 + " >> " + world2);
Debug.Log("pos3 >> world3: " + pos3 + " >> " + world3);
```
From the above, I get this output:
pos1 >> world1: (0.0, 0.0, 1.0) >> (-1.0, -0.6, -23.0)
pos2 >> world2: (1.0, 1.0, 1.0) >> (-1.0, -0.6, -23.0)
pos3 >> world3: (99999.0, 0.0, 1.0) >> (-1.0, -0.6, -23.0)
My expectation is that I should get back the world position of my lower-left viewport corner, the world position of my upper-right viewport corner, and some wildly different value for the out-of-bounds x of 99999.0. However, I get exactly the same result returned for all three.
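For context, here is how I understand the mapping should work, based on the docs (just a sketch, assuming a standard perspective camera; `NearPlaneCornerCheck` is a throwaway test script name, not anything from my project):

```csharp
using UnityEngine;

public class NearPlaneCornerCheck : MonoBehaviour
{
    void Start()
    {
        Camera curCamera = Camera.main;

        // For a perspective camera, the near plane's half-height follows
        // from the vertical field of view, and its half-width from the
        // aspect ratio.
        float halfHeight = curCamera.nearClipPlane
            * Mathf.Tan(curCamera.fieldOfView * 0.5f * Mathf.Deg2Rad);
        float halfWidth = halfHeight * curCamera.aspect;

        // Viewport (0,0) should map to an offset of (-halfWidth, -halfHeight)
        // from the camera's position on the near plane, and (1,1) to
        // (+halfWidth, +halfHeight), measured along the camera's local axes.
        Debug.Log("expected corner offsets: +/-" + halfWidth
            + ", +/-" + halfHeight);
    }
}
```

So the two corner calls should come back differing in both x and y, not identical.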
What have I messed up?