Why is there no difference in performance when switching between quality settings?

Forgive me if this is a noob question, but I can't for the love of anything holy figure out why the hell my Unity game runs exactly the same on the Fastest quality setting as it does on Fantastic. I've set up my quality profiles, but whether I raise or lower the quality level, there isn't even a one-frame difference in performance. Am I missing something?

Turn off vsync.
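If it helps, vsync can also be forced off from a script, which overrides whatever the active quality level specifies. A minimal sketch using Unity's standard QualitySettings API (the class name is just for illustration):

```csharp
using UnityEngine;

// Attach to any GameObject. Forces vsync off at startup so the frame rate
// is uncapped regardless of what the active quality level says.
public class DisableVSync : MonoBehaviour
{
    void Awake()
    {
        // 0 = Don't Sync, 1 = Every VBlank, 2 = Every Second VBlank
        QualitySettings.vSyncCount = 0;

        // Optional: let the frame rate run as fast as the hardware allows.
        Application.targetFrameRate = -1;
    }
}
```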

I'd guess your scene isn't very complex in terms of graphics processing. What's in your scene? If you have a load of high-res models with real-time shadows and reflective lighting, you'll soon see a performance difference!

Whereas if you have a scene with a few textures and a simple model, changing the quality from Fastest to Fantastic will hardly make any difference in processing.
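To rule out a profile setup problem, it's also easy to flip through the quality levels at runtime and watch whether anything changes on screen. A small sketch using the standard QualitySettings calls (the key binding is an arbitrary choice):

```csharp
using UnityEngine;

// Press Q to cycle through every quality level defined in Project Settings.
public class QualityCycler : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Q))
        {
            int next = (QualitySettings.GetQualityLevel() + 1) % QualitySettings.names.Length;
            // The second argument also applies the "expensive" changes
            // (anti-aliasing and the like) immediately.
            QualitySettings.SetQualityLevel(next, true);
            Debug.Log("Quality level: " + QualitySettings.names[next]);
        }
    }
}
```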

Also, you'd be fairly surprised by how well Unity can handle large, demanding scenes. Remember that the Editor normally runs at about half the speed of a final build, which should be hitting either 30 or 60 FPS.
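To compare editor and build performance with actual numbers rather than by feel, a simple on-screen frame counter is enough. A minimal sketch (the smoothing factor is an arbitrary choice):

```csharp
using UnityEngine;

// Displays a smoothed FPS readout in the top-left corner of the screen.
public class FpsCounter : MonoBehaviour
{
    float smoothedDelta;

    void Update()
    {
        // Exponential moving average so the readout doesn't jitter every frame.
        smoothedDelta += (Time.unscaledDeltaTime - smoothedDelta) * 0.1f;
    }

    void OnGUI()
    {
        if (smoothedDelta > 0f)
            GUI.Label(new Rect(10, 10, 150, 25), (1f / smoothedDelta).ToString("F1") + " FPS");
    }
}
```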

@Eric5h5: My vSync is off, and I've even tested it on Every VBlank and Every Second VBlank. Still no difference.

@oliver-jones: That's just the thing: I have quite a large scene with a fair number of models, all of them using high-res textures. I also have multiple dynamic lights and image effects going on, so I know it's a pretty complicated scene. I know the editor runs a bit slower, but my editor is so slow I can't even use it (I literally need to disable some stuff just to do in-editor testing). In the build it ranges between 25 and 65 FPS, depending on what's happening on screen. I also know it's not my PC, because I have an i7 2600K, a GTX 560, and 16GB of DDR3 RAM, so it can't be my specs either.
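One sanity check worth doing in a case like this: log which quality level the standalone build is actually running on, since the player picks its own per-platform default from the Quality settings matrix and may not be using the level you tested in the editor. A small sketch using the standard QualitySettings API:

```csharp
using UnityEngine;

// Logs the active quality level once at startup (check the player log in a build).
public class LogQualityLevel : MonoBehaviour
{
    void Start()
    {
        int level = QualitySettings.GetQualityLevel();
        Debug.Log("Active quality level: " + QualitySettings.names[level]);
    }
}
```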