(I searched all over for this and couldn't find a simple Q&A)
I'm trying to test if a GameObject.layer is in a LayerMask.value. e.g.:
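The snippet I was using was roughly like this (a sketch — `myLayerMask` and the trigger context are illustrative, not my exact code):

```csharp
public LayerMask myLayerMask;   // set in the Inspector

void OnTriggerEnter(Collider other)
{
    // gameObject.layer is a layer *index* (0-31), not a bitmask,
    // so ANDing it directly against the mask doesn't test membership:
    Debug.Log(myLayerMask.value & other.gameObject.layer);
}
```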
When I Debug.Log the test expression, it always prints 0. But when I try Debug.Log on a simple test, such as 3 & 2, it prints 2.
Any idea why this comparison isn't working?
There is a great explanation here that I hadn't considered: http://answers.unity3d.com/questions/122586/layermask-vs-1ltlt-gameobjectlayer-.html
I need to 'cast' the object's layer, which is the integer index shown in the Inspector, into the bit form used by the LayerMask. For example, layer indices 0, 1, 2, 3, 4, 5 correspond to bit values 1, 2, 4, 8, 16, 32 (each layer n becomes 1 << n).
Here is a working function to do this:
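Something along these lines (a sketch of the conversion — the class and method names are just my choice):

```csharp
using UnityEngine;

public static class LayerMaskUtil
{
    // Returns true when the layer index (0-31) is enabled in the mask.
    public static bool Contains(LayerMask mask, int layer)
    {
        // Shift 1 left by the layer index to get its single-bit value,
        // then test whether that bit is set in the mask.
        return (mask.value & (1 << layer)) != 0;
    }
}
```

Usage, e.g. inside OnTriggerEnter:

```csharp
if (LayerMaskUtil.Contains(myLayerMask, other.gameObject.layer))
{
    // the other object's layer is selected in myLayerMask
}
```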