Is there a way to cast a float to an int in Unity shaders? I’ve tried all the usual methods (they compile fine, but don’t seem to actually do anything).
Shader error: int or unsigned int type required (on d3d11)
The offending code:
col.r += int(sqrt(( abs(i.vertex.y - (_ScreenParams.x/2) ) ^2) + ( abs(i.vertex.x - (_ScreenParams.y/2) ) ^2)));
None of the functions I’m using should require int types, and I’m casting the final value properly.
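For context, the intent of that line is to add the fragment’s distance from the screen centre to the red channel. Written out with multiplication instead of ^ (and assuming i.vertex.x is meant to pair with _ScreenParams.x rather than .y, which may be swapped in the line above), it would look something like this:

    // Intent: distance from this fragment to the screen centre, added to red.
    float2 centre = _ScreenParams.xy * 0.5;                       // screen centre in pixels
    float2 delta  = i.vertex.xy - centre;                         // offset from the centre
    float dist    = sqrt(delta.x * delta.x + delta.y * delta.y);  // equivalent to length(delta)
    col.r += (int)dist;                                           // cast only the final value

(The abs() calls are dropped here since squaring already discards the sign.)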
Any help would be appreciated.