I have been working on a pre-pass decal system and am very close to having it all working. The issue I am encountering is that there appears to be a problem when rendering on Radeon hardware; everything works correctly on NVIDIA.
The problem is that while looking at a decal and moving the camera, the normals of the decal will "shift" around and no longer line up with the rest of the decal. When the camera is stationary, all is well. Below are two images to better explain what is happening.
The image above is what the decal looks like when the camera is stationary or when rendered using NVIDIA.
The image above is what the decal looks like when the camera is moving while rendered using Radeon hardware. It's pretty hard to show what happens using a screenshot, but just imagine the normals are pulsating.
I have saved out frame captures of a test executable using PIX on both NVIDIA and Radeon hardware, and the results are interesting. When I play back the recorded NVIDIA data on the Radeon card using PIX, the decal looks correct. Comparing a single frame from each hardware capture, the only real difference in the rendering is the values passed to SetSamplerState() for D3DSAMP_MINFILTER and D3DSAMP_MAGFILTER. On the NVIDIA hardware, all values are set to D3DTEXF_LINEAR, whereas on the Radeon they are set to either POINT or ANISOTROPIC. Directly after this, when DrawIndexedPrimitive() is called, is when the difference between the two GPUs shows up in the render.
Of course, since Unity is designed to run on multiple platforms and not just Windows, this low-level functionality is handled by Unity and I have no direct way of controlling it. I know there is not a clear question here, but at this point I'm just grasping for a solution or any kind of tips or advice...
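One thing I'm considering trying from script (just a sketch, I haven't confirmed it fixes anything) is forcing consistent filtering on the decal's textures so the Radeon driver has no reason to pick POINT or ANISOTROPIC. The `decalTextures` array here is a hypothetical placeholder for whatever textures the decal materials actually sample:

```csharp
using UnityEngine;

// Experimental workaround sketch: force bilinear filtering and disable
// anisotropy so the driver should not substitute POINT/ANISOTROPIC modes
// for the decal samplers. Not a confirmed fix, just something to test.
public class ForceDecalFiltering : MonoBehaviour
{
    // Hypothetical: assign the textures your decal materials sample.
    public Texture[] decalTextures;

    void Start()
    {
        // Globally disable anisotropic filtering so the driver cannot pick it.
        QualitySettings.anisotropicFiltering = AnisotropicFiltering.Disable;

        foreach (Texture tex in decalTextures)
        {
            tex.filterMode = FilterMode.Bilinear; // corresponds to D3DTEXF_LINEAR
            tex.anisoLevel = 1;                   // 1 = anisotropic filtering off
        }
    }
}
```

If the pulsating stops with filtering pinned like this, that would at least confirm the sampler states are the culprit rather than something in the decal shader itself.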
Any help would be greatly appreciated.