I’m trying to use a camera script to get my camera to render to a RenderTexture via CommandBuffers. I also have a custom shader whose Render Queue I specifically set to “Queue” = “Overlay”, so that geometry using it isn’t rendered until after the RenderTexture has been written. I don’t want objects with my shader to be captured in the RenderTexture.
I added my CommandBuffer to the camera with the CameraEvent.BeforeImageEffects event, because I want my CommandBuffer’s actions to be executed before the camera renders my custom shader.
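For reference, here’s a minimal sketch of the kind of setup I mean (the names CaptureBeforeOverlay, captureRT, and _CaptureTex are placeholders, not my actual script):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

[RequireComponent(typeof(Camera))]
public class CaptureBeforeOverlay : MonoBehaviour
{
    RenderTexture captureRT;      // texture my custom shader samples later
    CommandBuffer captureBuffer;
    Camera cam;

    void OnEnable()
    {
        cam = GetComponent<Camera>();
        captureRT = new RenderTexture(Screen.width, Screen.height, 0);

        captureBuffer = new CommandBuffer { name = "Capture scene" };
        // Copy whatever the camera has rendered so far into captureRT
        // and expose it to shaders as a global texture.
        captureBuffer.Blit(BuiltinRenderTextureType.CurrentActive, captureRT);
        captureBuffer.SetGlobalTexture("_CaptureTex", captureRT);

        // Expectation: this runs before my "Overlay"-queue geometry is drawn.
        cam.AddCommandBuffer(CameraEvent.BeforeImageEffects, captureBuffer);
    }

    void OnDisable()
    {
        cam.RemoveCommandBuffer(CameraEvent.BeforeImageEffects, captureBuffer);
        captureBuffer.Release();
        captureRT.Release();
    }
}
```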
Now this makes sense in theory. According to Unity’s documentation, my CommandBuffer should be executed before image effects are rendered. Since my shader’s queue is set to “Overlay”, it should be treated as an image effect and rendered right after, so the RenderTexture my CommandBuffer tells the camera to write to should NOT contain any geometry that uses my custom shader. That would give my custom shader access to the RenderTexture I just made, which it needs in order to work.
UNFORTUNATELY, for some reason this doesn’t work at all: the camera executes my CommandBuffer after rendering my custom shader geometry, so that geometry is captured in the RenderTexture, giving my shader an unwanted hall-of-mirrors effect.
Additionally, for some reason the Scene View camera (in the editor’s Scene tab) doesn’t execute my CommandBuffer at all. It does seem to work if I set the CameraEvent to CameraEvent.AfterForwardAlpha, but of course that only works when the camera is using Forward rendering, and I need it to work in any rendering path.
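The only path-dependent workaround I can think of looks roughly like this sketch (CaptureEventUtil and PickEvent are hypothetical names), which is clearly not a real fix:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public static class CaptureEventUtil
{
    // Pick a hook based on the camera's actual rendering path.
    // AfterForwardAlpha only fires in Forward rendering, so it can't be
    // the general answer; in Deferred this falls back to
    // BeforeImageEffects, which is the event that's misbehaving for me
    // in the first place.
    public static CameraEvent PickEvent(Camera cam)
    {
        return cam.actualRenderingPath == RenderingPath.Forward
            ? CameraEvent.AfterForwardAlpha
            : CameraEvent.BeforeImageEffects;
    }
}
```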
Is this a defect or am I missing something?