I’ve got a shader which draws a ray-casted sphere. Here’s a snippet:
// A basic ray.
struct ray {
    float3 o; // Point of origin
    fixed3 d; // Direction
};

// Ray-trace the sphere and return this pixel's color.
float3 calcPixelColor(float2 pixel, float2 resolution) {
    const float fov = 2.5;

    // Pixel in screen space, with y normalized to [-1, 1]
    float2 p = (-resolution + 2.0 * pixel) / resolution.y;

    // Create a ray starting at the camera
    ray r;
    r.o = _WorldSpaceCameraPos;

    // Build an orthonormal camera basis around the view direction ww
    float3 ww = float3(0.0, 0.0, 1.0);
    fixed3 uu = normalize(cross(fixed3(0.0, -1.0, 0.0), ww));
    fixed3 vv = normalize(cross(ww, uu));

    // Calculate the ray direction through this pixel
    fixed3 er = normalize(float3(p.xy, fov));
    r.d = normalize(er.x * uu + er.y * vv + er.z * ww);

    // The color of the sphere, or the background
    float3 col = intersectSphere(r);
    return col;
}
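(intersectSphere isn't shown above; it's just a standard ray-sphere intersection that returns either the sphere's shaded color or a background color. A minimal sketch of that kind of function, with a hypothetical hard-coded sphere, light, and colors rather than the exact ones my shader uses, would be:)

// Sketch only: the sphere center/radius, light, and colors here are placeholders.
float3 intersectSphere(ray r) {
    const float3 ce = float3(0.0, 0.0, 0.0); // sphere center (placeholder)
    const float ra = 1.0;                    // sphere radius (placeholder)

    // Standard quadratic ray-sphere test
    float3 oc = r.o - ce;
    float b = dot(oc, r.d);
    float c = dot(oc, oc) - ra * ra;
    float h = b * b - c;
    if (h < 0.0)
        return float3(0.5, 0.7, 1.0); // miss: background color

    // Nearest hit point, its normal, and a simple diffuse shade
    float t = -b - sqrt(h);
    float3 nor = normalize(r.o + t * r.d - ce);
    float dif = saturate(dot(nor, normalize(float3(0.6, 0.7, 0.5))));
    return float3(1.0, 0.8, 0.6) * dif;
}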
calcPixelColor emulates a camera pointed straight into the scene, looking down [0, 0, 1]. But it gives me kind of a weird result.
If I set up my virtual camera in the shader at [0, 0, -5], looking down the z-axis like this:
r.o = float3(0, 0, -5);
fixed3 ww = fixed3(0, 0, 1);
My shader renders a proper sphere:
[10720-screen+shot+2013-05-05+at+5.39.21+pm.png|10720]
And if I feed in the Unity camera’s position, like so:
r.o = _WorldSpaceCameraPos;
Everything is fine. But I can't seem to apply the camera's orientation to the direction vector ww. I tried this:
fixed3 ww = normalize(float3(0, 0, 1) * UNITY_MATRIX_IT_MV);
But all I get is a blank blue background unless I move the camera really close to [0, 0, 0]. Then I see part of the sphere, but if I rotate the camera, nothing happens.
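What I think I'm ultimately after is building ww (and presumably uu and vv as well) from the camera's world-space forward, right, and up axes instead of hard-coding them. The sketch below is the intent, but it's only a guess on my part that UNITY_MATRIX_V (the world-to-view matrix) is the right place to pull those axes from, or that the rows and signs are correct:

// Intent only: read the camera's axes out of the world-to-view matrix.
// I'm not sure this matrix, the row indexing, or the signs are actually right.
fixed3 ww = normalize(-UNITY_MATRIX_V[2].xyz); // camera forward in world space?
fixed3 uu = normalize( UNITY_MATRIX_V[0].xyz); // camera right in world space?
fixed3 vv = normalize( UNITY_MATRIX_V[1].xyz); // camera up in world space?

Is something like that the right approach, or am I misunderstanding which matrix holds the camera's orientation?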