I have some code that gets the angle between two Vector3s, but the angles returned are numbers like 0.011340 or 0.19872009752 rather than something like 90 or 180 degrees. Can someone tell me what I'm doing wrong? Both variables used here are Vector3s:
generalDirection = Vector3.Angle (transform.position, desiredPosition);
EDIT: Sorry, but now the angle starts at 35, goes up to 135 on the other side, and comes back down to 35 over a full clockwise rotation. I'm confused.
EDIT: OK, I fixed that. Now I just need help making one side negative and the other positive.
Vector3 is a structure used to store 3 float values, such as points, vectors, angles, etc. transform.position and desiredPosition are points, not vectors, and Vector3.Angle is intended to be used with vectors. In this particular case, you're getting the angle between the vectors (0,0,0)->transform.position and (0,0,0)->desiredPosition, which normally makes no sense. If you want to measure the angle between the transform's forward direction and the direction from the transform to desiredPosition, you should use this:
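(A sketch of the snippet the answer describes, assuming Unity's Vector3.Angle API; the variable names are taken from the question:)

```csharp
// Direction from the transform to the target point:
Vector3 toTarget = desiredPosition - transform.position;
// Angle in degrees (0..180) between the facing direction and that direction:
float generalDirection = Vector3.Angle(transform.forward, toTarget);
```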
A point is a position in the 3D world, while a vector is a direction - you can think of a vector as an "arrow" based on 0,0,0 and pointing to the desired direction. In order to calculate the vector that points the direction from position A to position B, subtract A from B:
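(For example, with two hypothetical points A and B:)

```csharp
Vector3 A = new Vector3(1, 0, 0);
Vector3 B = new Vector3(4, 0, 4);
// Vector pointing from A to B:
Vector3 directionAtoB = B - A;
```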
The resulting vector points in the direction from A to B, and its length (directionAtoB.magnitude) is the distance between A and B.
See the script reference here:
They use two directions, not two points. If you think about it, an angle based on two points in space could only be measured relative to the point of origin.
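About the last edit (making one side negative, the other positive): Vector3.Angle always returns an unsigned value between 0 and 180. A common approach, sketched here assuming rotation happens around the world up axis, is to use the sign of the cross product's y component:

```csharp
Vector3 toTarget = desiredPosition - transform.position;
float angle = Vector3.Angle(transform.forward, toTarget);
// The cross product points up or down depending on which side the target is:
if (Vector3.Cross(transform.forward, toTarget).y < 0)
    angle = -angle; // target is on the left side, so make the angle negative
```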
answered Jul 21 '12 at 12:17 AM