
Why is the second camera interfering with image effects?


We are using the DoF 3.4 image effect in most of our scenes. But because there are known problems with transparent materials not writing to the depth buffer, we are using a second camera that only renders specific layers for this kind of content. The second camera is basically a duplicate of the main camera and always a child of it. It is stripped of all other components, has its clear flags set to Don't Clear, and has a higher depth value than the main camera.
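For reference, the setup described above could be sketched like this in UnityScript (a hypothetical sketch only — the layer name and the script-driven construction are assumptions; in our project the camera is configured in the editor):

```
// Sketch of the second-camera setup, attached to the main camera.
// The layer name "TransparentFX" is an assumption for illustration.
#pragma strict

function Start() {
    var go = new GameObject("SecondCamera");
    go.transform.parent = transform;                 // child of the main camera
    go.transform.localPosition = Vector3.zero;
    go.transform.localRotation = Quaternion.identity;

    var cam2 : Camera = go.AddComponent(Camera);
    cam2.CopyFrom(camera);                           // duplicate the main camera's settings
    cam2.clearFlags = CameraClearFlags.Nothing;      // Don't Clear
    cam2.depth = camera.depth + 1;                   // render after the main camera
    cam2.cullingMask = 1 << LayerMask.NameToLayer("TransparentFX");
}
```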

The problem: when the second camera is active, it somehow interferes with the image effects on the main camera. For example, the DoF effect just doesn't work. We have other effects on our cameras too, like SSAO and Bloom, and they don't work correctly either.

Now for the really creepy part: I started adding some other image effects (like the Fisheye), configured them so they don't actually do anything, and started activating and deactivating some of them. With certain combinations of additional effects, everything seems to work. So I wrote an effect that does nothing but a simple blit, and that seems to work sometimes, too.

Some additional info: we need to use deferred rendering, because we need all the shadows we can get. Every image effect on the main camera works when we set the second camera to forward rendering. But we can't use that, because we need to share the depth buffer between the two cameras so that objects rendered by the second camera get occluded by objects in front of them. We might even add more cameras that render different kinds of content (for example a selective glow effect). We are using Unity 3.5.5; the target platform is Windows PC.

For clarification I've put together a simple example project. It shows the configuration of both cameras. The DoF effect is attached to the main camera, and the simple blit effect I mentioned is actually attached twice to make it work. The funny thing is, the DoF works when both of the PostFix effects are active, but as soon as you deactivate just one of them, it stops working. The link to the project: https://skydrive.live.com/redir?resid=614466D76208072C!155

The question: can anyone explain this to me? I really don't understand what's going on. Why is the second camera interfering with the image effects, and why do they work when there are other effects attached to the camera that don't do anything? Most importantly, is there a real fix for this problem, or are we doing something wrong? Or do we have to live with this workaround, which can break every time we change anything on our cameras?

I really hope someone can help us. Thanks in advance.



asked Oct 11, 2012 at 11:42 AM

Max Stegen

What would happen if you set the clear flag to something besides Don't Clear? I have had problems with Don't Clear sometimes, as it appeared to do something similar to hardware anti-aliasing, even with anti-aliasing turned off. If Don't Clear is the problem, I may be able to point you in the right direction.

Oct 11, 2012 at 05:53 PM DoctorWhy

I played around a little with the clear flags of both cameras and nearly every other random value I could find, but couldn't find any setting that seemed related to this problem. I also tried rendering everything into a shared RenderTexture but had no luck there either.

Oct 12, 2012 at 11:23 AM Max Stegen

I'm grabbing the project to take a look.

I'm pretty sure this is a race condition. Your main camera finishes rendering, and then the blur is applied. In parallel, the second camera starts to render over the top of your main camera. The final pixels just depend on which activity writes to them last. At least, that is the theory I am working on.

Oct 23, 2012 at 11:18 AM Graham Dunnett ♦♦

I think I have a solution that uses a render texture. The result seems to depend on the order in which image effects are called. I'll apply my tests to your example and post back when I have it working. (Basically I render the particles into a render texture, then render the cubes. The DoF effect is applied before the particle render texture is rendered over the top.)

Oct 23, 2012 at 10:12 PM Graham Dunnett ♦♦

2 answers: sort voted first

I'm not sure which problems with transparent materials not writing to the depth buffer you are referring to. You can modify a shader that does blending so that it also writes to the depth buffer, but that usually only makes sense if transparent objects do not intersect each other.
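For instance, a transparent shader can be made to write depth by adding ZWrite On to its blending pass. A minimal ShaderLab sketch (the shader and property names are placeholders, not from any real project):

```
Shader "TransparentZWrite" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader {
        Tags { "Queue" = "Transparent" }
        Pass {
            ZWrite On                          // also write to the depth buffer
            Blend SrcAlpha OneMinusSrcAlpha
            SetTexture [_MainTex] { combine texture }
        }
    }
}
```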

Assuming that you need the second camera, your workaround is unfortunately the fastest one currently possible.

When using deferred lighting we force rendering into a RenderTexture, let's call that texture firstRT.

  • The normal scene rendering goes into firstRT

  • If there are no image effects:
    • we blit from firstRT to the screen

  • If there is one image effect:
    • we call OnRenderImage with firstRT as source and the screen as destination

    • you are responsible for blitting from source to destination with your material of choice, as usual

  • If there are two image effects:
    • we call OnRenderImage with firstRT as source and secondRT as destination (secondRT has the processed image, firstRT doesn't)

    • we blit from secondRT to the screen

  • If there are three image effects:
    • we call OnRenderImage with firstRT as source and secondRT as destination (secondRT has the processed image)

    • we call OnRenderImage with secondRT as source and firstRT as destination (firstRT has the processed image, yay!)

    • we blit from firstRT to the screen
If there is a second camera that uses deferred, it also forces rendering into a render texture, and coincidentally that texture is firstRT. So if you want the output of the second deferred camera to show up on top of the first camera's output, you need to make sure that firstRT has the right contents, by placing dummy image effects on the first camera... for now.
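A dummy image effect of the kind mentioned is just a pass-through blit. A minimal sketch in UnityScript (attach as many copies as needed to the main camera):

```
// Pass-through "dummy" image effect. It does no visible work; its only purpose
// is to add one more OnRenderImage hop so the processed image ends up in firstRT.
#pragma strict

function OnRenderImage(source : RenderTexture, destination : RenderTexture) {
    Graphics.Blit(source, destination);
}
```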

We will of course fix that and gfx tests will be added to cover this case.



answered Nov 02, 2012 at 12:58 AM


Hey, I've written a script that makes sure there is always the right number of dummy effects attached to the camera, and it works perfectly for most effects. But if there is an image effect with the ImageEffectOpaque attribute on the main camera, no transparent objects rendered by the main camera make it into the final output.

I debugged the source and destination textures of all the image effects and found that the last effect with ImageEffectOpaque never writes to the input texture of the second camera. So where we have a render texture order of A->B->A->Screen for three image effects, the order is A->B->C->B->Screen with an additional ImageEffectOpaque effect, and adding dummy opaque effects doesn't change anything: the opaque effects may write to A, but the last one never does. The transparent effects will always use only textures B and C, while the second camera works on A.

That doesn't seem to be a problem for us, because we can just remove the attribute from the SSAO effect and put every important transparent object in the layer for the second camera. So for now we can live with this fix. Thank you very much.

Nov 02, 2012 at 01:07 PM Max Stegen

@Kuba your points are useful for me in my project, as I am using multiple image effects on my cameras, which are adjacent rather than overlapping. How did you know those things? Is there a reference that can help me understand more?

May 28, 2013 at 11:56 AM b45h05h4

Here's what I did to your project.

a) Created a render texture called rt and made it the target for your child camera.

b) Created a shader to do a blend:

 Shader "blend" {
     Properties {
         _MainTex ("Texture to blend", 2D) = "black" {}
     }
     SubShader {
         Tags { "Queue" = "Transparent" }
         Pass {
             Blend SrcAlpha OneMinusSrcAlpha
             SetTexture [_MainTex] { combine texture }
         }
     }
 }

c) Created a material called blend that uses the blend shader.

d) Disabled the child camera in the inspector.

e) Set the child camera to render to rt.

f) Set the camera to clear using a skybox.

g) Removed your postfix scripts from the main camera.

h) Created a new script:

 #pragma strict
 var cam : Camera;
 var mat : Material;
 function OnPreRender() {
     // The child camera is disabled, so render it into its target texture manually.
     cam.Render();
 }
 function OnRenderImage (source : RenderTexture, destination : RenderTexture) {
     Graphics.Blit(source, destination);
     Graphics.Blit(cam.targetTexture, destination, mat);
 }

i) Added this script to the main camera, and hooked up the child camera and render target to it.

j) Profit.

Now the main camera causes the child camera to render into a render target. This render target is then blended over the top of the results from the main camera's render. I know the shader I use is probably not exactly what you want; it's just to show the sequence of passes happening the way you want. The render target should also be created in code, so it can be sized the same as the main camera.
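Creating the render target in code might look like this (a sketch; the variable names and the 24-bit depth buffer are assumptions):

```
// Create the child camera's render target at the main camera's resolution.
#pragma strict

var childCam : Camera;   // the disabled child camera

function Start() {
    // 24-bit depth buffer assumed; sized to match the main camera's screen output.
    var rt = new RenderTexture(Screen.width, Screen.height, 24);
    childCam.targetTexture = rt;
}
```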

I have submitted a bug report using your original project, since I think the results you get are not expected. I think OnRenderImage is not being called when we think it is. The docs talk about an "ImageEffectOpaque attribute which enables image effects to be executed before the transparent render passes". I wonder if the OnRenderImage callbacks are


answered Oct 25, 2012 at 10:42 AM

Graham Dunnett ♦♦

Hey, first, thanks for your answer. Unfortunately we can't use your approach. When we render the second camera into a render texture, we can't share the depth buffer, at least not as simply as we want to. We could write a depth texture for the second camera and do the depth test manually in the blit shader, but that's pretty ugly. The other thing is that we can't blend the render texture over the main camera, because all the default shaders (like the particle shaders) don't write any alpha values, so the alpha of the render texture would have the value of the clear color everywhere, and we don't really want to replace every single shader. I also had a look at the ImageEffectOpaque attribute (don't know how I could have missed that :) ). It's good for effects like SSAO, but doesn't work well with DoF: there might be transparent objects in the background that have to be blurred, and there is no way to exclude specific layers with the attribute.

Oct 25, 2012 at 04:27 PM Max Stegen

We just encountered this very problem in our project, so it appears the workaround is still needed? What happened with the bug report, is it still alive...? Thanks in advance!

Nov 13, 2014 at 06:53 PM Foxxis




Seen: 6339 times

Last Updated: Nov 13, 2014 at 06:53 PM